2023/08/30 We extend I-MuPPET to 3D. We show that 3D-MuPPET also works in natural environments without model fine-tuning on additional annotations. [arXiv]
2023/05/31 Annotated single pigeon data to reproduce the results of the paper is available here.
2023/05/31 Benchmark on 2D multi-pigeon tracking is available here.
This repository provides code for I-MuPPET (GCPR 2022, oral).
Abstract
While most tracking data encompasses humans, the availability of annotated tracking data for animals is limited, especially for multiple objects. To overcome this obstacle, we present I-MuPPET, a system to estimate and track 2D keypoints of multiple pigeons at interactive speed. We train a Keypoint R-CNN on single pigeons in a fully supervised manner and infer keypoints and bounding boxes of multiple pigeons with that neural network. We use a state-of-the-art tracker to track the individual pigeons in video sequences. I-MuPPET is tested quantitatively on single pigeon motion capture data, and we achieve comparable accuracy to state-of-the-art 2D animal pose estimation methods in terms of Root Mean Square Error (RMSE). Additionally, we test I-MuPPET to estimate and track poses of multiple pigeons in video sequences with up to four pigeons and obtain stable and accurate results at up to 17 fps. To establish a baseline for future research, we perform a detailed quantitative tracking evaluation, which yields encouraging results.
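The single pigeon evaluation mentioned above reports accuracy as Root Mean Square Error over keypoints. As a minimal sketch of that metric (illustrative only, not the repository's evaluation code; the keypoint layout as `(x, y)` tuples is an assumption):

```python
import math

def keypoint_rmse(pred, gt):
    """Root Mean Square Error over corresponding 2D keypoints.

    pred, gt: equal-length lists of (x, y) tuples.
    """
    assert len(pred) == len(gt)
    # Squared Euclidean error per keypoint
    sq_err = [(px - gx) ** 2 + (py - gy) ** 2
              for (px, py), (gx, gy) in zip(pred, gt)]
    return math.sqrt(sum(sq_err) / len(sq_err))

# Example: every predicted keypoint is off by one pixel in x
pred = [(11.0, 20.0), (31.0, 40.0)]
gt = [(10.0, 20.0), (30.0, 40.0)]
print(keypoint_rmse(pred, gt))  # 1.0
```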
If you find a bug, have a question or know how to improve the code, please open an issue.
Set up a conda environment with `conda env create -f environment.yml`.
Multi-pigeon video sequences from the project page
Our multi-pigeon video sequences from the project page can be downloaded here. Unzip and copy the "videos" folder to `./data/`. You can use these video sequences to run I-MuPPET with pre-trained weights that we provide.
Labeled single pigeon data
Our annotated single pigeon data can be downloaded here. Unzip and copy the "pigeon_data" folder to `./data/annotations/`.
Multi-pigeon video sequences with ground truth for the quantitative tracking evaluation
Our multi-pigeon video sequences with ground truth for the quantitative tracking evaluation can be downloaded here. Unzip and copy the "data" folder to `./`.
Odor trail tracking video sequence (mouse) from DeepLabCut
A video sequence of the odor trail tracking data from DeepLabCut can be found here. Download the video and copy it to `./data/videos/`.
Odor trail tracking data (mice) from DeepLabCut preprocessed for I-MuPPET
We also provide odor trail tracking data from DeepLabCut that we preprocessed. You can download this data here. Unzip and copy the "dlc_data" folder to `./data/annotations/`. Use this data to train I-MuPPET for mice.
Cowbird data from "3D Bird Reconstruction"
"3D Bird Reconstruction" provides a cowbird data set. You can find it here. Download and copy it to `./data/annotations/`. Use this data to train I-MuPPET for cowbirds.
Pre-trained weights for pigeons can be downloaded here; the pre-trained weights for cowbirds and mice can be found here. Unzip and copy the "weights" folder to `./data/`. You can use these pre-trained weights, e.g., to run I-MuPPET on the multi-pigeon video sequences that we provide.
Preliminary task
Clone the SORT GitHub repository into `./`.
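SORT associates the detector's bounding boxes with existing tracks frame by frame, using intersection over union (IoU) as the matching cost. As a minimal sketch of that cost (illustrative only; the cloned SORT repository contains its own implementation), with boxes in `(x1, y1, x2, y2)` format:

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes in (x1, y1, x2, y2) format."""
    # Intersection rectangle
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two 10x10 boxes overlapping by half in x: intersection 50, union 150
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # ~0.333
```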
To run I-MuPPET on a predefined video sequence, run:
`python muppet.py --plot_id --plot_pose --plot_tracker_bbox`
The processed video will show the ID (`--plot_id`), the pose (`--plot_pose`) and the bounding box of the tracker (`--plot_tracker_bbox`).
To run I-MuPPET on another pigeon video sequence, specify the video with `--video`, e.g.:
`python muppet.py --plot_id --plot_pose --plot_tracker_bbox --video '3p_2118670.avi'`
To run I-MuPPET on the odor trail tracking data from DeepLabCut, use `--species` to specify the species and `--weights` to specify the pre-trained weights, e.g.:
`python muppet.py --plot_id --plot_pose --plot_tracker_bbox --species 'mouse' --weights 'dlc_comparison/mouse_split_1' --video 'm3v1mp4.mp4'`
To run I-MuPPET in full screen, use `--full_screen`. To end video processing, press the "q" key on your keyboard.
To store the tracking data required for quantitative evaluation, use `--write_tracking_data`. The detector and tracker data will be stored in `./data/tracking/gt/` and `./data/tracking/trackers/`, respectively, in the MOTChallenge (2D MOT 15) format.
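In the MOTChallenge (2D MOT 15) format, each line of a comma-separated file describes one box: frame, id, bb_left, bb_top, bb_width, bb_height, conf, x, y, z (the last three are -1 for 2D data). As a minimal sketch of reading such a line (illustrative only, not the repository's I/O code):

```python
def parse_mot_line(line):
    """Parse one MOTChallenge (2D MOT 15) CSV line into a dict."""
    fields = line.strip().split(',')
    frame, track_id = int(fields[0]), int(fields[1])
    left, top, width, height, conf = map(float, fields[2:7])
    return {
        'frame': frame, 'id': track_id,
        'bb_left': left, 'bb_top': top,
        'bb_width': width, 'bb_height': height,
        'conf': conf,
    }

row = parse_mot_line("1,2,100.0,150.0,40.0,30.0,1.0,-1,-1,-1")
print(row['frame'], row['id'], row['bb_width'])  # 1 2 40.0
```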
Preliminary task
From this PyTorch GitHub repository, download "coco_eval.py", "coco_utils.py", "engine.py" and "utils.py" and place them under `./utils/`.
Every experiment is defined by a configuration file. Configuration files for the experiments from the paper can be found in `./experiments/`.
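The training script reads the experiment definition from the YAML file passed via `--config`. A minimal sketch of this pattern using PyYAML (the keys shown here are hypothetical; the actual schema is defined by the files in `./experiments/`):

```python
import yaml

# Hypothetical experiment configuration; see ./experiments/ for real examples.
example = """
species: pigeon
epochs: 45
learning_rate: 0.005
"""

config = yaml.safe_load(example)
print(config['species'], config['epochs'])  # pigeon 45
```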
To train I-MuPPET on our labeled single pigeon data, run e.g.:
`python train.py --config './experiments/muppet_600.yaml'`
The training will start with the configuration file specified by `--config`. The new weights will be stored in `./data/weights/my_weights/`.
To train I-MuPPET on the odor trail tracking data (mice) from DeepLabCut, run e.g.:
`python train.py --config './experiments/dlc_comparison/mouse_split_1.yaml'`
Before training, download "geometry.py", "img_utils.py" and "renderer.py" from the "3D Bird Reconstruction" GitHub repository and place them under `./data/utils/`.
To train I-MuPPET on the cowbird data from "3D Bird Reconstruction", run e.g.:
`python train.py --config './experiments/3dbr_comparison/cowbird_45_epochs.yaml'`
To display some samples of the data sets with their annotations, use `--display_data`, e.g.:
`python train.py --config './experiments/muppet_600.yaml' --display_data`
To quit, press the "q" key on your keyboard.
@inproceedings{waldmann2022imuppet,
title={I-MuPPET: Interactive Multi-Pigeon Pose Estimation and Tracking},
author={Waldmann, Urs and Naik, Hemal and Nagy, M\'{a}t\'{e} and Kano, Fumihiro and Couzin, Iain D. and Deussen, Oliver and Goldl\"{u}cke, Bastian},
booktitle={DAGM German Conference on Pattern Recognition},
year={2022},
pages={513--528}
}