SiamGAT
1. Environment setup
This code has been tested on Ubuntu 16.04 with Python 3.5, PyTorch 1.2.0, and CUDA 9.0. Please install the required libraries before running this code:
pip install -r requirements.txt
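A quick way to confirm the environment before going further is to print the interpreter, PyTorch and CUDA versions and check that a GPU is visible. This is only a sanity-check sketch against the versions listed above.

# check_env.py -- sanity-check Python, PyTorch and CUDA against the tested versions
import sys
import torch

print('Python :', sys.version.split()[0])      # tested with 3.5
print('PyTorch:', torch.__version__)           # tested with 1.2.0
print('CUDA   :', torch.version.cuda)          # tested with 9.0
print('GPU ok :', torch.cuda.is_available())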
2. Test
Download the pretrained models and put them into the tools/snapshot directory.
From BaiduYun:
- otb_uav_model extract code: w1rs
- got10k_model extract code: n91w
- lasot_model extract code: dilp
- TrackingNet_model extract code: n2sm
From Google Drive:
Download the testing datasets and put them into the test_dataset directory. The JSON annotation files of commonly used datasets can be downloaded from BaiduYun. If you want to test the tracker on a new dataset, please refer to pysot-toolkit to set up test_dataset.
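The annotation JSONs follow the pysot-toolkit layout, i.e. a dictionary keyed by sequence name. The sketch below assumes only that layout plus the usual img_names/gt_rect fields, and the file path is an example; adjust it to the dataset you downloaded.

# inspect_json.py -- peek at a pysot-style dataset annotation file
import json

with open('test_dataset/UAV123/UAV123.json') as f:   # example path
    annos = json.load(f)

print('sequences:', len(annos))
name, meta = next(iter(annos.items()))
print('first sequence:', name)
print('frames       :', len(meta.get('img_names', [])))
print('first gt box :', meta.get('gt_rect', [[]])[0])   # [x, y, w, h]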
The tracking results can be downloaded from BaiduYun (extract code: 0wod) or Google Drive for comparison.
python testTracker.py \
    --config ../experiments/siamgat_googlenet_otb_uav/config.yaml \
    --dataset UAV123 \
    --snapshot snapshot/otb_uav_model.pth
The testing results will be saved in the results/dataset_name/tracker_name directory, where dataset_name and tracker_name are given by the --dataset and --snapshot arguments above (here, results/UAV123/otb_uav_model).
3. Train
Prepare training datasets
Download the datasets:
Note: training_dataset/dataset_name/readme.md gives detailed instructions on how to generate the training datasets.
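A quick way to see which training datasets already ship with such instructions is to list those readme files; the snippet below assumes only the training_dataset layout mentioned above and is run from the repository root.

# list_readmes.py -- list the per-dataset preparation instructions
import glob

for path in sorted(glob.glob('training_dataset/*/readme.md')):
    print(path)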
Download pretrained backbones
Download the pretrained backbones from link and put them into the pretrained_models directory.
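To verify that a downloaded backbone can be read before training, a minimal sketch is given below; the file name and the assumption that the checkpoint is a plain state_dict (or wraps one under a 'state_dict' key) are guesses, so adjust them to the files you actually downloaded.

# check_backbone.py -- verify a downloaded backbone checkpoint loads
import torch

ckpt = torch.load('pretrained_models/googlenet.pth', map_location='cpu')  # hypothetical file name
state = ckpt.get('state_dict', ckpt) if isinstance(ckpt, dict) else ckpt
print('number of tensors:', len(state))
for k in list(state)[:5]:
    print(k, tuple(state[k].shape))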
Train a model
To train the SiamGAT model, run train.py with the desired configs:
cd tools
python train.py
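The experiment settings are read from the config.yaml under experiments/. To inspect (or copy and edit) a configuration before launching training, the sketch below assumes only that the file is standard YAML; the key names depend on the experiment.

# show_config.py -- print an experiment configuration (run from tools/)
import yaml

with open('../experiments/siamgat_googlenet_otb_uav/config.yaml') as f:
    cfg = yaml.safe_load(f)

for section, values in cfg.items():
    print(section, ':', values)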
4. Evaluation
We provide the tracking results (extract code: 0wod; also available on Google Drive) for GOT-10k, LaSOT, OTB100 and UAV123. If you want to evaluate the tracker on OTB100, UAV123 or LaSOT, please put the corresponding results into the results directory. GOT-10k results should be evaluated on the official evaluation server.
Get the TrackingNet results from BaiduYun (extract code: iwlj) and evaluate them on the official evaluation server.
python eval.py \
    --tracker_path ./results \
    --dataset UAV123 \
    --tracker_prefix 'otb_uav_model'

Here --tracker_path is the result directory, --dataset the dataset name, and --tracker_prefix the tracker (model) name.
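As a toolkit-independent sanity check, the overlap between a saved result and its ground truth can also be computed directly. Below is a minimal sketch; the two file paths are hypothetical, and both files are assumed to hold one comma-separated x,y,w,h box per line (the pysot convention).

# quick_overlap.py -- average IoU between predicted and ground-truth boxes
# (paths are hypothetical; both files are assumed to hold one "x,y,w,h" box per line)

def read_boxes(path):
    with open(path) as f:
        return [[float(v) for v in line.strip().split(',')]
                for line in f if line.strip()]

def iou(a, b):
    # boxes are (x, y, w, h); convert to corner coordinates
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0.0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

pred = read_boxes('results/UAV123/otb_uav_model/bike1.txt')   # hypothetical sequence
gt = read_boxes('test_dataset/UAV123/anno/bike1.txt')         # hypothetical path
scores = [iou(p, g) for p, g in zip(pred, gt)]
print('mean IoU over %d frames: %.3f' % (len(scores), sum(scores) / len(scores)))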
5. Acknowledgement
The code is implemented on top of pysot and SiamCAR. We would like to express our sincere thanks to the contributors.
6. Cite
If you use SiamGAT in your work, please cite our papers:
@InProceedings{Guo_2021_CVPR,
author = {Guo, Dongyan and Shao, Yanyan and Cui, Ying and Wang, Zhenhua and Zhang, Liyan and Shen, Chunhua},
title = {Graph Attention Tracking},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2021}
}
@InProceedings{Guo_2020_CVPR,
author = {Guo, Dongyan and Wang, Jun and Cui, Ying and Wang, Zhenhua and Chen, Shengyong},
title = {SiamCAR: Siamese Fully Convolutional Classification and Regression for Visual Tracking},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020}
}