GenDA - One-Shot Generative Domain Adaptation
One-Shot Generative Domain Adaptation
Ceyuan Yang*, Yujun Shen*, Zhiyi Zhang, Yinghao Xu, Jiapeng Zhu, Zhirong Wu, Bolei Zhou
arXiv preprint arXiv:2111.09876
[Paper] [Project Page]
This work aims at transferring a Generative Adversarial Network (GAN) pre-trained on one image domain to a new domain by referring to as few as just one target image. Unlike existing approaches that adopt a vanilla fine-tuning strategy, we introduce two lightweight modules, an attribute adaptor and an attribute classifier, into the generator and the discriminator, respectively. By efficiently learning only these two modules, we manage to reuse the prior knowledge of the pre-trained model and hence enable one-shot transfer with impressively high diversity. Our method demonstrates substantial improvements over existing baselines in a wide range of settings.
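To make the setup concrete, below is a minimal PyTorch sketch of the idea described above: the pre-trained generator and discriminator stay frozen, and only two small modules are trained. The module names, shapes, and the choice of a channel-wise affine adaptor and a linear classifier head are illustrative assumptions for exposition, not this repository's actual API.

```python
# Illustrative sketch only: freeze a pre-trained GAN and attach two lightweight
# trainable modules, an attribute adaptor (on the latent code) and an attribute
# classifier (on discriminator features). Names and shapes are assumptions.
import torch
import torch.nn as nn


class AttributeAdaptor(nn.Module):
    """Lightweight channel-wise scale-and-shift applied to the latent code."""

    def __init__(self, latent_dim=512):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(latent_dim))
        self.shift = nn.Parameter(torch.zeros(latent_dim))

    def forward(self, latent):
        return latent * self.scale + self.shift


class AttributeClassifier(nn.Module):
    """Linear head on discriminator features that separates source from target domain."""

    def __init__(self, feature_dim=512):
        super().__init__()
        self.head = nn.Linear(feature_dim, 1)

    def forward(self, features):
        return self.head(features)


def build_adaptation_modules(generator, discriminator, latent_dim=512, feature_dim=512):
    """Freeze the pre-trained GAN and return the only two trainable modules."""
    for param in generator.parameters():
        param.requires_grad = False
    for param in discriminator.parameters():
        param.requires_grad = False
    return AttributeAdaptor(latent_dim), AttributeClassifier(feature_dim)
```

In such a setup, the optimizer would receive only the adaptor and classifier parameters, which keeps the adaptation fast and preserves the diversity of the pre-trained generator.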
Qualitative results
Here we provide some synthesized results after one-shot adaptation.
BibTeX
@article{yang2021genda,
title = {One-Shot Generative Domain Adaptation},
author = {Yang, Ceyuan and Shen, Yujun and Zhang, Zhiyi and Xu, Yinghao and Zhu, Jiapeng and Wu, Zhirong and Zhou, Bolei},
journal = {arXiv preprint arXiv:2111.09876},
year = {2021}
}