Fork of https://github.com/naver/FIRe that removes the dependency on several git submodules by vendoring those modules under lib/ and adjusting the Python path accordingly.
This repo is integrated into the Hierarchical-Localization (hloc) pipeline via an hloc/extractors/fire.py interface.
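To use FIRe through hloc, an extractor configuration selecting the fire.py interface is needed. The following is a minimal sketch; the exact keys and values are assumptions modeled on how hloc feature extractors are typically configured, not this repo's actual conf:

```python
# Hypothetical hloc-style extractor configuration for FIRe.
# Key names and values below are assumptions, not the repo's actual conf.
fire_conf = {
    "output": "feats-fire",        # name of the output feature file (assumed)
    "model": {"name": "fire"},     # selects hloc/extractors/fire.py
    "preprocessing": {
        "grayscale": False,        # FIRe consumes RGB images
        "resize_max": 1024,        # cap on the longer image side (assumed)
    },
}
```

Such a conf would then be passed to hloc's feature-extraction entry point like any other built-in extractor.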
This repository contains the code for running the FIRe model presented in our ICLR'22 paper:
@inproceedings{superfeatures,
  title={{Learning Super-Features for Image Retrieval}},
  author={Weinzaepfel, Philippe and Lucas, Thomas and Larlus, Diane and Kalantidis, Yannis},
  booktitle={{ICLR}},
  year={2022}
}
The code is distributed under the CC BY-NC-SA 4.0 License. See LICENSE for more information. It is based on code from HOW, cirtorch, and ASMK, which are released under their own license, the MIT license.
- Install ASMK:
pip3 install pyaml numpy faiss-gpu
cd lib/asmk
python3 setup.py build_ext --inplace
rm -r build
- Install the remaining dependencies by running:
pip3 install -r requirements.txt
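Since this fork vendors the former submodules under lib/ rather than installing them as packages, they must be reachable on the Python path before import. A minimal sketch of that path setting, assuming the subdirectory names (the actual names under lib/ may differ):

```python
import sys
from pathlib import Path

def add_lib_paths(repo_root, subdirs=("asmk", "how", "cirtorch")):
    """Prepend vendored lib/<module> directories to sys.path.

    Sketch of the "path setting" this fork uses instead of git
    submodules; the subdirectory names here are assumptions.
    """
    for sub in subdirs:
        sys.path.insert(0, str(Path(repo_root) / "lib" / sub))
```

Calling add_lib_paths(".") from the repo root would make the vendored modules importable without a pip install.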
- Data/experiments folders
All data will be stored under a fire_data folder created automatically when running the code; similarly, results and models from all experiments will be stored under a fire_experiments folder.
To evaluate our model trained on SfM-120k on ROxford/RParis, simply run:
python evaluate.py eval_fire.yml
With the released model and the parameters found in eval_fire.yml, we obtain 90.3 on the validation set, 82.6 and 62.2 on ROxford medium and hard respectively, and 85.2 and 70.0 on RParis medium and hard respectively.
To train a FIRe model, simply run:
python train.py train_fire.yml -e train_fire
All training outputs will be saved to fire_experiments/train_fire.
To evaluate the trained model that was saved in fire_experiments/train_fire, simply run:
python evaluate.py eval_fire.yml -e train_fire -ml train_fire
For reproducibility, we provide the following model weights for the architecture we use in the paper (ResNet50 without the last block + LIT):
- Model pre-trained on ImageNet-1K with cross-entropy (the pre-trained model we use as initialization for training FIRe): (link) or fire_imagenet.pth
- Model trained with FIRe on SfM-120k: (link) or fire_SfM_120k.pth
They will be downloaded automatically when running the training/testing scripts.