SCT
This is the official code for the paper "Tracker Meets Night: A Transformer Enhancer for UAV Tracking".
The spatial-channel Transformer (SCT) enhancer is a task-inspired low-light enhancer designed to facilitate nighttime UAV tracking. Evaluations on the public UAVDark135 benchmark and the newly constructed DarkTrack2021 benchmark demonstrate that the performance gains SCT brings to nighttime UAV tracking surpass those of general low-light enhancers.
SCT has been submitted to RA-L with the ICRA option.
Environment Preparation
Python 3.6
PyTorch 1.8.1
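A minimal setup sketch, assuming conda is used to manage the environment; the environment name and the paired torchvision version are assumptions rather than requirements of this repo:
conda create -n SCT python=3.6
conda activate SCT
pip install torch==1.8.1 torchvision==0.9.1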
Testing
Run lowlight_test.py; the results will be saved in ./result/
cd SCT
python lowlight_test.py
Training
Before training, prepare the training set of the LOL dataset (a typical layout is sketched after the commands below). Then run lowlight_train.py; the trained model will be saved in ./log/SCT/models
cd SCT
python lowlight_train.py --trainset_path /your/path/to/LOLdataset/
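For reference, the official LOL dataset is commonly distributed as paired low-/normal-light images in the layout below; whether lowlight_train.py expects exactly this structure is an assumption, so check the dataloader if your copy differs.
LOLdataset/
  our485/
    low/
    high/
  eval15/
    low/
    high/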
SCT for Nighttime UAV Tracking
To evaluate how SCT facilitates trackers' nighttime tracking ability, first meet the environment requirements of the base trackers and download their snapshots to the corresponding folders. Details can be found in their repos. Currently supported trackers include HiFT, SiamAPN++, SiamRPN++, DiMP18, DiMP50, and PrDiMP50.
For HiFT, SiamAPN++, and SiamRPN++, change directory to the corresponding root and simply run the tracker with the --enhance option:
cd HiFT/SiamAPN++/pysot
python tools/test.py --dataset DarkTrack --enhance
For DiMP18, DiMP50, and PrDiMP50, customize your local paths in pytracking/evaluation/local.py (see the note after the commands below if this file does not exist yet):
cd pytracking
python run_tracker.py --tracker_name dimp --tracker_param dimp18/dimp50/prdimp50 --enhance
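If pytracking/evaluation/local.py does not exist yet, the upstream pytracking toolkit can usually generate a default one for you to edit (run from the pytracking directory); this assumes the standard pytracking layout is kept in this repo.
python -c "from pytracking.evaluation.environment import create_default_local_file; create_default_local_file()"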
DarkTrack2021 Benchmark
The DarkTrack2021 benchmark comprises 110 challenging sequences with 100K frames in total. All sequences are captured at nighttime in urban scenes at 30 frames per second (FPS). The first frames of some selected sequences in DarkTrack2021 are displayed below.
DarkTrack2021 is now available here (password: a4lq).
Demo Video
Contact
Junjie Ye Email: [email protected]
Changhong Fu Email: [email protected]
Acknowledgements
Many thanks to Swin-Transformer for providing the basis for this code.