
TCN-based Joint Beat and Downbeat Tracking

A PyTorch Lightning implementation of the TCN-based joint beat and downbeat tracking model described in the paper Temporal convolutional networks for musical audio beat tracking, with some extra ideas from Deconstruct, Analyse, Reconstruct: How to improve Tempo, Beat, and Downbeat Estimation.
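The core of such a model is a stack of dilated 1-D convolutions over spectrogram frames. As a rough illustration only (this is not the repository's actual code; the channel count, kernel size, ELU activation, and dropout rate are assumptions based on the paper), one residual TCN block could be sketched as:

```python
# Hedged sketch of a dilated TCN block in the style of the referenced paper.
# All hyperparameters here are illustrative assumptions, not the repo's values.
import torch
import torch.nn as nn

class TCNBlock(nn.Module):
    """One non-causal dilated convolution block with ELU, dropout, and a residual path."""
    def __init__(self, channels=16, kernel_size=5, dilation=1, dropout=0.1):
        super().__init__()
        padding = dilation * (kernel_size - 1) // 2  # keep the time axis the same length
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=padding, dilation=dilation)
        self.act = nn.ELU()
        self.drop = nn.Dropout(dropout)

    def forward(self, x):
        return x + self.drop(self.act(self.conv(x)))  # residual connection

# Geometrically growing dilations widen the receptive field across many frames.
tcn = nn.Sequential(*[TCNBlock(dilation=2 ** i) for i in range(8)])
frames = torch.randn(1, 16, 3000)   # (batch, channels, spectrogram frames)
activations = tcn(frames)           # same shape as the input
```

Per-frame beat and downbeat activations would then be produced by small output heads on top of this stack.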

Setup

Install Dependencies

  1. git clone https://github.com/mhrice/BeatTrack.git
  2. cd BeatTrack
  3. python3 -m venv env
  4. source env/bin/activate
  5. pip install cython numpy
  6. pip install -e .

The two-stage pip install is needed because madmom's build requires Cython and NumPy to be installed first.

Fix Other Madmom Issue

On newer Python versions, madmom fails to import because of an issue in its processors.py. To fix this, copy the patched processors.py from the repository root into your environment, replacing {python-version} with your Python version (e.g. python3.10): cp processors.py env/lib/{python-version}/site-packages/madmom/processors.py

Download Ballroom Dataset (not needed for inference)

  1. mkdir data && cd data
  2. wget http://mtg.upf.edu/ismir2004/contest/tempoContest/data1.tar.gz
  3. git clone https://github.com/CPJKU/BallroomAnnotations.git
  4. tar -xvzf data1.tar.gz
  5. rm data1.tar.gz
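The BallroomAnnotations files list one beat per line as a time in seconds followed by its position in the bar, where position 1 marks a downbeat. A minimal parser for that format might look like the following (a sketch assuming that layout; the repository's own data loading may differ):

```python
# Hedged sketch: parse one Ballroom annotation (.beats) file into beat and
# downbeat times. Assumes each line is "<time_in_seconds> <beat_position>",
# with beat position 1 marking a downbeat.
from pathlib import Path
import tempfile

def load_beats(annotation_file: Path):
    """Return (beat_times, downbeat_times) in seconds from one .beats file."""
    beats, downbeats = [], []
    for line in annotation_file.read_text().splitlines():
        time_str, position = line.split()
        beats.append(float(time_str))
        if position == "1":  # beat 1 of the bar = downbeat
            downbeats.append(float(time_str))
    return beats, downbeats

# Tiny demo on a synthetic annotation file (one 4/4 bar at 120 BPM).
with tempfile.TemporaryDirectory() as tmp:
    ann = Path(tmp) / "demo.beats"
    ann.write_text("0.50 1\n1.00 2\n1.50 3\n2.00 4\n2.50 1\n")
    beats, downbeats = load_beats(ann)
    print(beats)      # [0.5, 1.0, 1.5, 2.0, 2.5]
    print(downbeats)  # [0.5, 2.5]
```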

Training

python scripts/train.py

Important: change the wandb logger settings to your own account before training.

Inference

python scripts/inference.py audio_file_path {checkpoint_path}

or

```python
from scripts.inference import beatTracker
beatTracker(audio_file_path)
```
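Under the hood, inference turns the network's per-frame activations into beat times; the repository likely uses madmom's DBN post-processing for this, but the basic idea can be shown with simple peak picking. The sketch below is illustrative only (the frame rate, threshold, and minimum beat gap are all assumptions, not the repository's settings):

```python
# Hedged sketch: convert a per-frame beat activation curve into beat times
# via simple peak picking. A DBN-based post-processor (as in madmom) is the
# more robust choice; this only illustrates the idea.
import numpy as np
from scipy.signal import find_peaks

def activations_to_beats(activations, fps=100, threshold=0.5, min_gap_s=0.2):
    # Keep peaks above the threshold that are at least min_gap_s apart.
    peaks, _ = find_peaks(activations, height=threshold,
                          distance=int(min_gap_s * fps))
    return peaks / fps  # frame indices -> seconds

act = np.zeros(500)
act[[50, 150, 250, 350, 450]] = 1.0  # synthetic activations: a beat every 1 s
print(activations_to_beats(act))     # [0.5 1.5 2.5 3.5 4.5]
```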

Parameters

See cfg.py for all parameters.

Checkpoints

checkpoints/best.ckpt is a checkpoint of the model trained for 165 epochs on the Ballroom dataset for joint beat/downbeat prediction.
