WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution
This code belongs to the paper [1], available at https://arxiv.org/abs/2201.08157. Please cite the paper if you use this code.
The repository contains an implementation of WPPNets as introduced in [1], together with scripts for reproducing the numerical example "Texture superresolution" from [1, Section 5.2].
Moreover, the file wgenpatex.py is adapted from [2], available at https://github.com/johertrich/Wasserstein_Patch_Prior, which in turn is adapted from [3]. Furthermore, the folder model is adapted from [5], available at https://github.com/hellloxiaotian/ACNet.
The folders test_img and training_img contain parts of the textures from [4].
For questions and bug reports, please contact Fabian Altekrueger (fabian.altekrueger(at)hu-berlin.de).
CONTENTS
- REQUIREMENTS
- USAGE AND EXAMPLES
- REFERENCES
1. REQUIREMENTS
The code requires several Python packages. We tested the code with Python 3.9.7 and the following package versions:
- pytorch 1.10.0
- matplotlib 3.4.3
- numpy 1.21.2
- pykeops 1.5
The code is usually also compatible with other versions of these packages.
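To check which versions are installed, a small snippet like the following can be used (this is only a convenience sketch, not part of the repository):

```python
# Optional sanity check: print the installed versions of the required packages.
# Other versions will often work as well.
import torch
import matplotlib
import numpy
import pykeops

for name, module in [("pytorch", torch), ("matplotlib", matplotlib),
                     ("numpy", numpy), ("pykeops", pykeops)]:
    print(f"{name}: {module.__version__}")
```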
2. USAGE AND EXAMPLES
You can start the training of a WPPNet by calling one of the scripts below. If you want to load an existing pretrained network instead, set retrain to False. Checkpoints are saved automatically during training so that the progress of the reconstructions can be observed. Feel free to vary the parameters and see what happens.
TEXTURE GRASS
The script run_grass.py implements the superresolution example of [1, Section 5.2] for the Kylberg texture grass [4], which is available at https://kylberg.org/kylberg-texture-dataset-v-1-0. The high-resolution ground truth and the reference image are different 600×600 sections cropped from the original texture images. Similarly, the low-resolution training data is generated by cropping 100×100 sections from the texture images, artificially downsampling them with a predefined forward operator f, and adding Gaussian noise. For more details on the downsampling process, see [1, Section 5.2].
TEXTURE FLOOR
The script run_floor.py implements the superresolution example of [1, Section 5.2] for the Kylberg texture floor [4], which is available at https://kylberg.org/kylberg-texture-dataset-v-1-0. The high-resolution ground truth and the reference image are different 600×600 sections cropped from the original texture images. Similarly, the low-resolution training data is generated by cropping 100×100 sections from the texture images, artificially downsampling them with a predefined forward operator f, and adding Gaussian noise. For more details on the downsampling process, see [1, Section 5.2].
3. REFERENCES
[1] F. Altekrueger and J. Hertrich.
WPPNets: Unsupervised CNN Training with Wasserstein Patch Priors for Image Superresolution.
ArXiv preprint arXiv:2201.08157.
[2] J. Hertrich, A. Houdard and C. Redenbach.
Wasserstein Patch Prior for Image Superresolution.
ArXiv preprint arXiv:2109.12880.
[3] A. Houdard, A. Leclaire, N. Papadakis and J. Rabin.
Wasserstein Generative Models for Patch-based Texture Synthesis.
ArXiv preprint arXiv:2007.03408.
[4] G. Kylberg.
The Kylberg texture dataset v. 1.0.
Centre for Image Analysis, Swedish University of Agricultural Sciences and Uppsala University, 2011.
[5] C. Tian, Y. Xu, W. Zuo, C.-W. Lin, and D. Zhang.
Asymmetric CNN for image superresolution.
IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2021.