MUST-GAN
Code | paper
The PyTorch implementation of our CVPR 2021 paper "MUST-GAN: Multi-level Statistics Transfer for Self-driven Person Image Generation".
Tianxiang Ma, Bo Peng, Wei Wang, Jing Dong,
CRIPAC, NLPR, CASIA & University of Chinese Academy of Sciences.
Test results of our model under self-supervised training:
Pose transfer
Clothes style transfer
Requirements
- python3
- pytorch 1.1.0
- numpy
- scipy
- scikit-image
- pillow
- pandas
- tqdm
- dominate
- visdom
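One quick way to verify the environment is to import the core dependencies listed above (a minimal sanity check, not part of the repo's own tooling):

```python
# Minimal environment sanity check (not part of the original repo): confirm
# that the core dependencies listed above can be imported.
import torch
import numpy
import scipy
import skimage
import PIL
import pandas
import tqdm
import dominate
import visdom

print("pytorch:", torch.__version__)               # the repo targets 1.1.0
print("cuda available:", torch.cuda.is_available())
```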
Getting Started
Installation
- Clone this repo:
git clone https://github.com/TianxiangMa/MUST-GAN.git
cd MUST-GAN
Data Preparation
We train and test our model on the DeepFashion dataset. Specifically, we use the high-resolution images from the In-shop Clothes Retrieval Benchmark.
Download this dataset and unzip it (you will need to request the password), then put the img_highres folder under the ./datasets directory. Download the train/test split lists, which are used by many previous methods, and put them under the ./datasets directory.
- Run the following code to split the dataset into train and test sets (a rough sketch of what the split does is shown after the command).
python tool/generate_fashion_datasets.py
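For reference, the split roughly amounts to copying each image named in the train/test split lists into its own folder. The sketch below is only illustrative; the file names train.lst / test.lst and the flattened output naming are assumptions, and the actual tool/generate_fashion_datasets.py may differ.

```python
# Illustrative sketch of a train/test split, assuming ./datasets/train.lst and
# ./datasets/test.lst each list one image path (relative to img_highres) per
# line. The real tool/generate_fashion_datasets.py may use other names/layout.
import os
import shutil

root = "./datasets"
for split in ("train", "test"):
    out_dir = os.path.join(root, split)
    os.makedirs(out_dir, exist_ok=True)
    with open(os.path.join(root, split + ".lst")) as f:
        for line in f:
            rel_path = line.strip()
            if not rel_path:
                continue
            src = os.path.join(root, "img_highres", rel_path)
            # Flatten the relative path so each output file name is unique.
            dst = os.path.join(out_dir, rel_path.replace("/", ""))
            shutil.copyfile(src, dst)
```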
Download the source-target paired image lists, the same lists used by many previous works. Because our method supports self-supervised training, fashion-resize-pairs-train.csv is not needed; you can download train_images_lst.csv for training instead.
Download the train/test keypoints annotation files and semantic segmentation files.
Put all of the above files into the ./datasets folder.
- Run the following code to generate the pose maps and pose connection maps (a conceptual sketch of the pose-map encoding follows the commands).
python tool/generate_pose_map.py
python tool/generate_pose_connection_map.py
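Conceptually, a pose map encodes each keypoint as one Gaussian heatmap channel. The sketch below only illustrates that idea; the number of joints, image size, and sigma are assumptions, and the repo's scripts may implement the maps differently.

```python
# Sketch of turning 2D keypoints into per-joint Gaussian heatmaps, the general
# idea behind a pose map. The values (18 joints, 256x176 images, sigma=6) are
# assumptions and not necessarily what the repo's scripts use.
import numpy as np

def keypoints_to_heatmaps(keypoints, height=256, width=176, sigma=6.0):
    """keypoints: list of (y, x) pairs, with (-1, -1) marking missing joints."""
    heatmaps = np.zeros((height, width, len(keypoints)), dtype=np.float32)
    ys, xs = np.mgrid[0:height, 0:width]
    for i, (y, x) in enumerate(keypoints):
        if y < 0 or x < 0:          # skip missing joints
            continue
        heatmaps[:, :, i] = np.exp(-((ys - y) ** 2 + (xs - x) ** 2) / (2 * sigma ** 2))
    return heatmaps
```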
Download the VGG pretrained model used for training, and put it into the ./datasets folder.
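The VGG weights are typically used as a frozen feature extractor for a perceptual loss. Below is a hedged sketch of loading them; the file name vgg19.pth is a placeholder for whatever file you downloaded, and the assumption that it matches torchvision's VGG-19 layout may not hold for this repo.

```python
# Hedged sketch: load the downloaded VGG weights into a torchvision VGG-19 and
# freeze it for use as a perceptual-loss feature extractor. The file name is a
# placeholder; match it to the file you actually put under ./datasets.
import torch
import torchvision.models as models

vgg = models.vgg19()
state_dict = torch.load("./datasets/vgg19.pth", map_location="cpu")
vgg.load_state_dict(state_dict)

vgg_features = vgg.features.eval()       # convolutional layers only
for p in vgg_features.parameters():
    p.requires_grad = False              # frozen when used as a loss network
```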
Test
Download our pretrained model, and put it into the ./check_points/MUST-GAN/ folder.
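Before running the test script, you can quickly confirm that the downloaded checkpoint deserializes. The file name below is a placeholder; use the actual name of the file under ./check_points/MUST-GAN/.

```python
# Quick sanity check that the downloaded checkpoint loads. The file name is a
# placeholder; match it to the file you actually put under ./check_points/MUST-GAN/.
import torch

ckpt = torch.load("./check_points/MUST-GAN/latest_net_G.pth", map_location="cpu")
print(type(ckpt))
if isinstance(ckpt, dict):
    print(len(ckpt), "entries in the checkpoint")
```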
- Run the following code, setting the parameters as needed.
bash scripts/test.sh
Train
- Run the following code, setting the parameters as needed.
bash scripts/train.sh
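Since visdom and dominate appear in the requirements, training progress is presumably visualized through a visdom server, as in the PATN/ADGAN codebases this repo builds on (an assumption). A minimal reachability check:

```python
# Hedged sketch: verify a visdom server is reachable before training, assuming
# training visualization goes through visdom as in PATN/ADGAN.
# Start a server in a separate terminal with:  python -m visdom.server
import visdom

viz = visdom.Visdom(server="http://localhost", port=8097)
print("visdom server connected:", viz.check_connection())
```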
Citation
If you use this code for your research, please cite our paper:
@InProceedings{Ma_2021_CVPR,
author = {Ma, Tianxiang and Peng, Bo and Wang, Wei and Dong, Jing},
title = {MUST-GAN: Multi-Level Statistics Transfer for Self-Driven Person Image Generation},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2021},
pages = {13622-13631}
}
Acknowledgments
Our code is based on PATN and ADGAN; thanks for their great work.