Under construction...
Attention in Attention Network for Image Super-Resolution (A2N)
This repository is a PyTorch implementation of the paper
"Attention in Attention Network for Image Super-Resolution" [arXiv]
Visual results in the paper are available at Google Drive or Baidu Netdisk (password: 7t74).
Unofficial TensorFlow implementation: https://github.com/Anuj040/superres
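For orientation, the core idea of the paper is a block that runs an attention branch and a non-attention branch in parallel and mixes them with dynamically predicted weights. Below is a minimal conceptual sketch of such a block; the layer sizes, module names, and reduction ratio are illustrative assumptions, not the code in this repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class A2Block(nn.Module):
    """Conceptual sketch of an attention-in-attention block: an attention
    branch and a non-attention branch whose outputs are mixed by dynamically
    predicted per-branch weights. Illustrative only, not the repo's code."""
    def __init__(self, channels):
        super().__init__()
        # non-attention branch: a plain convolution
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        # attention branch: convolution modulated by a pixel-attention mask
        self.conv_att = nn.Conv2d(channels, channels, 3, padding=1)
        self.pa = nn.Sequential(nn.Conv2d(channels, channels, 1), nn.Sigmoid())
        # small MLP that predicts one weight per branch from pooled features
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // 4),
            nn.ReLU(inplace=True),
            nn.Linear(channels // 4, 2),
        )

    def forward(self, x):
        n, c, _, _ = x.size()
        # global average pool -> MLP -> softmax gives two branch weights
        w = F.softmax(self.fc(F.adaptive_avg_pool2d(x, 1).view(n, c)), dim=1)
        plain = self.conv(x)                    # non-attention branch
        att = self.conv_att(x) * self.pa(x)     # attention branch
        out = (w[:, 0].view(-1, 1, 1, 1) * plain
               + w[:, 1].view(-1, 1, 1, 1) * att)
        return out + x                          # residual connection

x = torch.randn(1, 40, 32, 32)
print(A2Block(40)(x).shape)  # torch.Size([1, 40, 32, 32])
```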
Test
Dependencies: torch==0.4.1
You can download the test sets from Google Drive. Put the test data in ../Data/benchmark/.
python main.py --scale 4 --data_test Set5 --pre_train ./experiment/model/aan_x4.pt --chop --test_only
If you run on CPU, add the "--cpu" flag.
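The --chop flag trades a little speed for memory by super-resolving the image in overlapping pieces. Below is a sketch of what an EDSR-style chopped forward pass does; the shave and min_size values are assumptions for illustration, not this repository's exact settings.

```python
import torch

def forward_chop(model, x, scale=4, shave=10, min_size=160000):
    """Memory-saving forward: recursively split the input into four
    overlapping quadrants, super-resolve each, and stitch the results.
    Assumes the model preserves the channel count (RGB in, RGB out)."""
    n, c, h, w = x.size()
    h_half, w_half = h // 2, w // 2
    h_size, w_size = h_half + shave, w_half + shave  # overlap avoids seams
    quadrants = [
        x[:, :, :h_size, :w_size],
        x[:, :, :h_size, w - w_size:],
        x[:, :, h - h_size:, :w_size],
        x[:, :, h - h_size:, w - w_size:],
    ]
    if h_size * w_size < min_size:
        sr = [model(q) for q in quadrants]       # small enough: run directly
    else:
        sr = [forward_chop(model, q, scale, shave, min_size)
              for q in quadrants]                # still too big: recurse
    h, w = scale * h, scale * w
    h_half, w_half = scale * h_half, scale * w_half
    h_size, w_size = scale * h_size, scale * w_size
    out = x.new_zeros(n, c, h, w)
    # keep only the non-overlapping core of each super-resolved quadrant
    out[:, :, :h_half, :w_half] = sr[0][:, :, :h_half, :w_half]
    out[:, :, :h_half, w_half:] = sr[1][:, :, :h_half, w_size - (w - w_half):]
    out[:, :, h_half:, :w_half] = sr[2][:, :, h_size - (h - h_half):, :w_half]
    out[:, :, h_half:, w_half:] = sr[3][:, :, h_size - (h - h_half):,
                                        w_size - (w - w_half):]
    return out
```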
Train
Training data preparation
- Download the DIV2K training data from DIV2K dataset or SNU_CVLab.
- Specify '--dir_data' in option.py based on the data path (the expected layout is sketched below).

For more information, please refer to EDSR (PyTorch).
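Assuming the EDSR (PyTorch) data convention, '--dir_data' should point at a directory laid out roughly as follows (file names shown for illustration):

```
<dir_data>/
└── DIV2K/
    ├── DIV2K_train_HR/              # 0001.png ... 0800.png
    └── DIV2K_train_LR_bicubic/
        ├── X2/                      # 0001x2.png ...
        ├── X3/                      # 0001x3.png ...
        └── X4/                      # 0001x4.png ...
```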
Training
python main.py --scale 2 --patch_size 128 --reset --chop
python main.py --scale 3 --patch_size 192 --reset --chop
python main.py --scale 4 --patch_size 256 --reset --chop
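Note that --patch_size follows the EDSR convention of specifying the HR patch size, so all three commands above train on 64x64 LR patches (128/2 = 192/3 = 256/4 = 64). A minimal sketch of how an aligned LR/HR pair is typically cropped under that convention; this is illustrative, not this repository's exact dataloader.

```python
import random

def get_patch(lr, hr, patch_size=128, scale=2):
    """Crop an aligned LR/HR training pair (EDSR-style convention:
    patch_size is the HR patch edge; the LR patch is patch_size/scale).
    lr, hr: HxWxC arrays. Illustrative sketch only."""
    lp = patch_size // scale                    # LR patch edge
    lh, lw = lr.shape[:2]
    ix = random.randrange(0, lw - lp + 1)       # random LR corner
    iy = random.randrange(0, lh - lp + 1)
    tx, ty = scale * ix, scale * iy             # corresponding HR corner
    return (lr[iy:iy + lp, ix:ix + lp],
            hr[ty:ty + patch_size, tx:tx + patch_size])
```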
Citation
If you have any questions or suggestions, feel free to email me here.
If you find our work helpful in your research, please cite the following paper.
@misc{chen2021attention,
  title={Attention in Attention Network for Image Super-Resolution},
  author={Haoyu Chen and Jinjin Gu and Zhi Zhang},
  year={2021},
  eprint={2104.09497},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
Acknowledgements
This code is built on EDSR (PyTorch) and PAN. We thank the authors for sharing their code.