Keras implementations of Generative Adversarial Networks.

Overview

This repository has gone stale as I unfortunately do not have the time to maintain it anymore. If you would like to continue the development of it as a collaborator, send me an email at [email protected].

Keras-GAN

Collection of Keras implementations of Generative Adversarial Networks (GANs) suggested in research papers. These models are in some cases simplified versions of the ones ultimately described in the papers, but I have chosen to focus on getting the core ideas covered instead of getting every layer configuration right. Contributions and suggestions of GAN varieties to implement are very welcome.

See also: PyTorch-GAN

Table of Contents

  • Installation
  • Implementations
      • AC-GAN
      • Adversarial Autoencoder
      • BiGAN
      • BGAN
      • CC-GAN
      • CGAN
      • Context Encoder
      • CoGAN
      • CycleGAN
      • DCGAN
      • DiscoGAN
      • DualGAN
      • GAN
      • InfoGAN
      • LSGAN
      • Pix2Pix
      • PixelDA
      • SGAN
      • SRGAN
      • WGAN
      • WGAN GP

Installation

$ git clone https://github.com/eriklindernoren/Keras-GAN
$ cd Keras-GAN/
$ sudo pip3 install -r requirements.txt

Implementations

AC-GAN

Implementation of Auxiliary Classifier Generative Adversarial Network.

Code

Paper: https://arxiv.org/abs/1610.09585

Example

$ cd acgan/
$ python3 acgan.py

Adversarial Autoencoder

Implementation of Adversarial Autoencoder.

Code

Paper: https://arxiv.org/abs/1511.05644

Example

$ cd aae/
$ python3 aae.py

BiGAN

Implementation of Bidirectional Generative Adversarial Network.

Code

Paper: https://arxiv.org/abs/1605.09782

Example

$ cd bigan/
$ python3 bigan.py

BGAN

Implementation of Boundary-Seeking Generative Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1702.08431

Example

$ cd bgan/
$ python3 bgan.py

CC-GAN

Implementation of Semi-Supervised Learning with Context-Conditional Generative Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1611.06430

Example

$ cd ccgan/
$ python3 ccgan.py

CGAN

Implementation of Conditional Generative Adversarial Nets.

Code

Paper: https://arxiv.org/abs/1411.1784

Example

$ cd cgan/
$ python3 cgan.py

Context Encoder

Implementation of Context Encoders: Feature Learning by Inpainting.

Code

Paper: https://arxiv.org/abs/1604.07379

Example

$ cd context_encoder/
$ python3 context_encoder.py

CoGAN

Implementation of Coupled Generative Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1606.07536

Example

$ cd cogan/
$ python3 cogan.py

CycleGAN

Implementation of Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1703.10593

Example

$ cd cyclegan/
$ bash download_dataset.sh apple2orange
$ python3 cyclegan.py

DCGAN

Implementation of Deep Convolutional Generative Adversarial Network.

Code

Paper: https://arxiv.org/abs/1511.06434

Example

$ cd dcgan/
$ python3 dcgan.py

DiscoGAN

Implementation of Learning to Discover Cross-Domain Relations with Generative Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1703.05192

Example

$ cd discogan/
$ bash download_dataset.sh edges2shoes
$ python3 discogan.py

DualGAN

Implementation of DualGAN: Unsupervised Dual Learning for Image-to-Image Translation.

Code

Paper: https://arxiv.org/abs/1704.02510

Example

$ cd dualgan/
$ python3 dualgan.py

GAN

Implementation of Generative Adversarial Network with an MLP generator and discriminator.

Code

Paper: https://arxiv.org/abs/1406.2661

Example

$ cd gan/
$ python3 gan.py

InfoGAN

Implementation of InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets.

Code

Paper: https://arxiv.org/abs/1606.03657

Example

$ cd infogan/
$ python3 infogan.py

LSGAN

Implementation of Least Squares Generative Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1611.04076

Example

$ cd lsgan/
$ python3 lsgan.py

Pix2Pix

Implementation of Image-to-Image Translation with Conditional Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1611.07004

Example

$ cd pix2pix/
$ bash download_dataset.sh facades
$ python3 pix2pix.py

PixelDA

Implementation of Unsupervised Pixel-Level Domain Adaptation with Generative Adversarial Networks.

Code

Paper: https://arxiv.org/abs/1612.05424

MNIST to MNIST-M Classification

Trains a classifier on MNIST images that are translated to resemble MNIST-M (by performing unsupervised image-to-image domain adaptation). This model is compared to the naive solution of training a classifier on MNIST and evaluating it on MNIST-M. The naive model manages a 55% classification accuracy on MNIST-M while the one trained during domain adaptation gets a 95% classification accuracy.

$ cd pixelda/
$ python3 pixelda.py
Method    Accuracy
Naive     55%
PixelDA   95%
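
For reference, a minimal sketch of the comparison above; the classifier, generator and loader names are illustrative and not the repository's API. Both classifiers are evaluated on MNIST-M, but the naive one is trained on raw MNIST while the adapted one is trained on MNIST images translated by the PixelDA generator.

    import numpy as np

    def evaluate(classifier, images, one_hot_labels):
        # Fraction of MNIST-M test images classified correctly
        preds = np.argmax(classifier.predict(images), axis=1)
        return np.mean(preds == np.argmax(one_hot_labels, axis=1))

    # Hypothetical usage, assuming mnist_x/mnist_y and mnistm_x/mnistm_y loaders exist:
    # clf_naive.fit(mnist_x, mnist_y)                       # source domain only
    # clf_adapted.fit(generator.predict(mnist_x), mnist_y)  # translated images
    # print("Naive:  ", evaluate(clf_naive, mnistm_x, mnistm_y))    # ~0.55
    # print("PixelDA:", evaluate(clf_adapted, mnistm_x, mnistm_y))  # ~0.95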

SGAN

Implementation of Semi-Supervised Generative Adversarial Network.

Code

Paper: https://arxiv.org/abs/1606.01583

Example

$ cd sgan/
$ python3 sgan.py

SRGAN

Implementation of Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network.

Code

Paper: https://arxiv.org/abs/1609.04802

Example

$ cd srgan/
<follow steps at the top of srgan.py>
$ python3 srgan.py

WGAN

Implementation of Wasserstein GAN (with DCGAN generator and discriminator).

Code

Paper: https://arxiv.org/abs/1701.07875

Example

$ cd wgan/
$ python3 wgan.py

WGAN GP

Implementation of Improved Training of Wasserstein GANs.

Code

Paper: https://arxiv.org/abs/1704.00028

Example

$ cd wgan_gp/
$ python3 wgan_gp.py

Comments
  • Is the objective correct?

    Sorry for the lay question but is the objective of these GANs in accord with the original paper?

    In the original paper the discriminator objective is written as log(prob_real) + log(1 - prob_fake), but in most Keras implementations I find on the internet people train the discriminator with binary cross-entropy. Does this end up being the same, mathematically?
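
    For reference, a minimal NumPy sketch (not part of the repository) of why binary cross-entropy with targets 1 for real samples and 0 for fakes reduces to the objective from the paper:

        import numpy as np

        def binary_cross_entropy(y_true, y_pred, eps=1e-7):
            # Same formula Keras' binary_crossentropy uses, averaged over the batch
            y_pred = np.clip(y_pred, eps, 1 - eps)
            return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

        # Hypothetical discriminator outputs on a batch of real and fake samples
        prob_real = np.array([0.9, 0.8, 0.95])   # D(x)
        prob_fake = np.array([0.1, 0.3, 0.05])   # D(G(z))

        # BCE with target 1 on real samples plus BCE with target 0 on fakes ...
        bce_loss = binary_cross_entropy(np.ones_like(prob_real), prob_real) \
                 + binary_cross_entropy(np.zeros_like(prob_fake), prob_fake)

        # ... equals the negated paper objective -(log D(x) + log(1 - D(G(z)))),
        # averaged over the batch, so minimizing BCE maximizes the original objective.
        paper_loss = -np.mean(np.log(prob_real)) - np.mean(np.log(1 - prob_fake))

        print(np.isclose(bce_loss, paper_loss))  # True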

    opened by gustavoeb 12
  • a question about gcgan network

    Hello Erik, thanks for sharing these GAN implementations. If my data consists of 3-channel colour images like CIFAR, how do I apply the model? If I just change the size and channel count I get the error 'number of input channels does not match corresponding dimension of filter, 1 != 3'.

    opened by 0AlvinLO0 9
  • Concatenate

    I got the following error when I tried to apply the code to my own data set:

    ValueError: A Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 64, 64, 128), (None, 63, 63, 128)]

    Any assistance will be greatly appreciated.
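
    For reference, this kind of mismatch usually comes from a skip connection whose two branches end up with different spatial sizes (for example a convolution with padding='valid', or an input size the down/upsampling path cannot halve and double exactly). A minimal Keras sketch, not taken from the repository, showing how padding='same' keeps both branches aligned so Concatenate works; for an off-by-one case like (64, 64) vs (63, 63), a ZeroPadding2D or Cropping2D on the shorter branch is another option:

        from keras.layers import Input, Conv2D, UpSampling2D, Concatenate
        from keras.models import Model

        inp = Input(shape=(64, 64, 3))

        # Downsample then upsample; padding='same' keeps sizes at exact halves/doubles,
        # so the upsampled branch comes back to 64x64 and matches the skip connection.
        d = Conv2D(128, kernel_size=4, strides=2, padding='same')(inp)   # -> (32, 32, 128)
        u = UpSampling2D(size=2)(d)                                      # -> (64, 64, 128)
        u = Conv2D(128, kernel_size=4, padding='same')(u)                # stays (64, 64, 128)

        skip = Conv2D(128, kernel_size=3, padding='same')(inp)           # -> (64, 64, 128)
        out = Concatenate()([u, skip])                                   # shapes match

        Model(inp, out).summary()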

    opened by MinuteswithMetrics 8
  • puzzled about "discriminator.trainable=False"

    in pix2pix.py:

        self.discriminator = self.build_discriminator()
        self.discriminator.compile(loss='mse',
            optimizer=optimizer,
            metrics=['accuracy'])

        # For the combined model we will only train the generator
        self.discriminator.trainable = False

        # Discriminators determines validity of translated images / condition pairs
        valid = self.discriminator([fake_A, img_B])

        self.combined = Model(inputs=[img_A, img_B], outputs=[valid, fake_A])

    The D network inside the combined model is the same instance as the stand-alone D network (not a deep copy), so setting discriminator.trainable = False looks like it would disable training for both the combined model and the stand-alone discriminator. Should there be a switch to enable/disable the discriminator while it is being trained?
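
    For reference, the trainable flag is read when a model is compiled, so the usual Keras-GAN pattern already behaves like the switch being asked about: the discriminator is compiled on its own first (and keeps training through its own train_on_batch), and only the combined model, compiled after discriminator.trainable = False, sees a frozen discriminator. A minimal self-contained sketch of that pattern (toy Dense models stand in for the pix2pix builders):

        from keras.layers import Input, Dense
        from keras.models import Model, Sequential

        latent_dim, img_dim = 8, 16

        # Toy generator and discriminator standing in for the repository's builders
        generator = Sequential([Dense(img_dim, activation='tanh', input_dim=latent_dim)])
        discriminator = Sequential([Dense(1, activation='sigmoid', input_dim=img_dim)])

        # trainable is captured at compile time, so this discriminator *does* train
        # whenever discriminator.train_on_batch(...) is called
        discriminator.compile(loss='binary_crossentropy', optimizer='adam')

        # Freezing afterwards only affects models compiled from this point on
        discriminator.trainable = False

        z = Input(shape=(latent_dim,))
        valid = discriminator(generator(z))
        combined = Model(z, valid)
        combined.compile(loss='binary_crossentropy', optimizer='adam')  # D frozen here

        # discriminator.train_on_batch(...) -> updates D (its own compiled state)
        # combined.train_on_batch(...)      -> updates only the generator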

    opened by qinwang-ai 7
  • Bidirectional GAN seems incomplete

    Searching Google for a Keras implementation of Bidirectional GAN led me here and I copied your code. But there seems to be a problem: the code never trains the encoder network! How is it supposed to learn the distribution? Then I thought there might be some underlying code training it, so I tweaked the train method to return the generator and encoder models so I could check whether the encoder had been trained. I tried to run the code below, which is part of the training loop:

        # Sample noise and generate img
        z = np.random.normal(size=(batch_size, self.latent_dim))
        imgs_ = self.generator.predict(z)
        # Select a random batch of images and encode
        idx = np.random.randint(0, X_train.shape[0], batch_size)
        imgs = X_train[idx]
        z_ = self.encoder.predict(imgs)
    

    But then this happened:

    ValueError: Error when checking input: expected input_4 to have 4 dimensions, but got array with shape (32, 28, 28)
    

    I honestly don't know what's wrong! If this code worked fine in the loop, why isn't it working outside of it? I did load the MNIST dataset. And I still don't know whether the encoder gets trained...
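
    For reference, the encoder in bigan.py expects image batches with an explicit channel axis, and the training loop rescales and reshapes X_train before the loop starts; outside the loop the same preprocessing has to be applied first. A minimal sketch, assuming the repository's (28, 28, 1) image shape and [-1, 1] scaling:

        import numpy as np
        from keras.datasets import mnist

        (X_train, _), _ = mnist.load_data()                      # (60000, 28, 28), uint8

        # Same preprocessing the train() method applies before the loop:
        X_train = (X_train.astype(np.float32) - 127.5) / 127.5   # scale to [-1, 1]
        X_train = np.expand_dims(X_train, axis=3)                 # -> (60000, 28, 28, 1)

        idx = np.random.randint(0, X_train.shape[0], 32)
        imgs = X_train[idx]                                        # (32, 28, 28, 1)
        # z_ = encoder.predict(imgs)   # would now match the expected 4-D input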

    opened by farahmand-m 7
  • CycleGan

    When I run your code for CycleGAN it works well for some datasets, like cityscapes and apple2orange, but for others it shows the following error. Please help me resolve it. Also, what can I change in your code to see the results in Google Colab, since I don't have a GPU?

    Traceback (most recent call last):
      File "cyclegan.py", line 258, in <module>
        gan.train(epochs=200, batch_size=1, sample_interval=200)
      File "cyclegan.py", line 170, in train
        for batch_i, (imgs_A, imgs_B) in enumerate(self.data_loader.load_batch(batch_size)):
      File "/content/project1211/CycleGan/project3/project344/Keras-GAN/cyclegan/data_loader.py", line 42, in load_batch
        path_A = np.random.choice(path_A, total_samples, replace=False)
      File "mtrand.pyx", line 1126, in mtrand.RandomState.choice
    ValueError: a must be non-empty
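
    For reference, this particular ValueError means np.random.choice was handed an empty list: glob() in data_loader.py found no images for that dataset, which usually happens when the dataset was not fetched with download_dataset.sh or the name passed to the model does not match the folder on disk. A minimal check (the path pattern mirrors, but may not exactly match, the loader's):

        from glob import glob

        dataset_name = 'monet2photo'   # whatever name was passed to CycleGAN
        for split in ('trainA', 'trainB'):
            paths = glob('./datasets/%s/%s/*' % (dataset_name, split))
            print(split, len(paths), 'images found')
            # zero images here is what triggers "ValueError: a must be non-empty"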

    opened by parasjain-12 7
  • MAE instead of L1 loss in pix2pix?

    I noticed that in your implementation of pix2pix you use MAE as the second loss here:

    self.combined.compile(loss=['mse', 'mae'],
                          loss_weights=[1, 100],
                          optimizer=optimizer)
    

    But the original implementation at junyanz/pytorch-CycleGAN-and-pix2pix uses L1 as the second loss. Is there a reason for your decision to use MAE?

    opened by miqbal23 6
  • The labels of valid and fake in wgan

    Hello, I have a question about the valid and fake labels. Generally we fill the valid array with 1s and the fake array with 0s, so I don't understand why in wgan the author sets valid to -1 and fake to 1. I would appreciate help from anyone.

        # Adversarial ground truths
        valid = -np.ones((batch_size, 1))
        fake = np.ones((batch_size, 1))
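
    For reference, in WGAN the critic is not a 0/1 classifier; wgan.py trains it with a custom Wasserstein loss where the "labels" only act as sign multipliers. A minimal sketch, assuming the loss has the usual mean(y_true * y_pred) form:

        import numpy as np
        from keras import backend as K

        def wasserstein_loss(y_true, y_pred):
            # y_true carries a sign (+1 or -1), not a class label
            return K.mean(y_true * y_pred)

        batch_size = 4
        # "Adversarial ground truths" as in wgan.py
        valid = -np.ones((batch_size, 1))   # real samples
        fake = np.ones((batch_size, 1))     # generated samples

        # Minimizing mean(-1 * D(real)) + mean(+1 * D(fake)) is the same as
        # maximizing E[D(real)] - E[D(fake)], the Wasserstein estimate from the
        # paper, which is why the signs matter rather than the values 0 and 1.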

    opened by Pandarenlql 6
  • Please update scipy version and other Python packages

    Hi Erik,

    I am encountering a problem with the scipy method named imread, which I believe was deprecated with the recent 1.0.0 release. This would probably work with an earlier version such as 0.19.0, but that causes a lot of conflicts with Keras and TensorFlow.

    I am using Anaconda with Python 3.6.4. Please see more detail below:

    Using TensorFlow backend.
    Traceback (most recent call last):
      File "cyclegan.py", line 244, in <module>
        gan.train(epochs=30000, batch_size=2, save_interval=200)
      File "cyclegan.py", line 161, in train
        imgs_A = self.data_loader.load_data(domain="A", batch_size=half_batch)
      File "/home/emma/Research/GAN/keras_GAN/Keras-GAN/cyclegan/data_loader.py", line 18, in load_data
        img = self.imread(img_path)
      File "/home/emma/Research/GAN/keras_GAN/Keras-GAN/cyclegan/data_loader.py", line 39, in imread
        return scipy.misc.imread(path, mode='RGB').astype(np.float)
    AttributeError: module 'scipy.misc' has no attribute 'imread'
    

    Thanks very much,

    Emma
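
    For reference, scipy.misc.imread and scipy.misc.imresize were removed from SciPy after that deprecation. A minimal sketch of drop-in replacements built on Pillow (an assumption; they are not part of the repository) that could be wired into data_loader.py:

        import numpy as np
        from PIL import Image

        def imread(path):
            # Replacement for scipy.misc.imread(path, mode='RGB').astype(np.float)
            return np.asarray(Image.open(path).convert('RGB'), dtype=np.float64)

        def imresize(img, size):
            # Replacement for scipy.misc.imresize(img, size) with size = (height, width)
            resized = Image.fromarray(np.uint8(img)).resize(size[::-1], Image.BICUBIC)
            return np.asarray(resized)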

    opened by EmmaNguyen 6
  • Why 'self.num_classes+1'???

    in acgan.py:

     label = Dense(self.num_classes+1, activation="softmax")(features)
    

    Why self.num_classes+1 instead of self.num_classes?

    I searched for the keyword self.num_classes+1 on Google and found that you are the first and only one to use it instead of self.num_classes. Could you please give an explanation?
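
    For reference, a plausible reading (hedged, since the AC-GAN paper itself uses num_classes outputs) is that the extra softmax unit acts as an explicit "fake" class that generated images can be assigned to, similar to a semi-supervised GAN head. A small sketch of that labeling scheme:

        import numpy as np
        from keras.utils import to_categorical

        num_classes = 10          # the real MNIST digit classes
        batch_size = 4

        real_labels = np.random.randint(0, num_classes, batch_size)  # digits 0..9
        fake_labels = np.full(batch_size, num_classes)                # index 10 = "fake"

        # The auxiliary head has num_classes + 1 softmax units so it can also
        # assign generated images to this extra "fake" class.
        y_real = to_categorical(real_labels, num_classes=num_classes + 1)
        y_fake = to_categorical(fake_labels, num_classes=num_classes + 1)
        print(y_real.shape, y_fake.shape)   # (4, 11) (4, 11)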

    opened by Jingnan-Jia 5
  • code usage in ACGAN

        label = Input(shape=(1,), dtype='int32')
        label_embedding = Flatten()(Embedding(self.num_classes, 100)(label))
        model_input = multiply([noise, label_embedding])
        img = model(model_input)

    What is the meaning of these lines in the ACGAN code?

    Thank you
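
    For reference, a minimal self-contained sketch of what these lines do (the Dense output layer is a stand-in for the repository's model): the integer label is embedded into a vector of the same size as the noise and multiplied into it element-wise, so the generator receives a single class-conditioned latent vector.

        import numpy as np
        from keras.layers import Input, Embedding, Flatten, Dense, multiply
        from keras.models import Model

        latent_dim, num_classes = 100, 10

        noise = Input(shape=(latent_dim,))
        label = Input(shape=(1,), dtype='int32')

        # Embedding maps the integer label to a learned 100-dim vector (one row
        # per class); Flatten drops the extra sequence axis so it matches the noise.
        label_embedding = Flatten()(Embedding(num_classes, latent_dim)(label))

        # Element-wise multiplication mixes the class information into the noise.
        model_input = multiply([noise, label_embedding])
        img = Dense(28 * 28, activation='tanh')(model_input)  # stand-in for model(...)

        gen = Model([noise, label], img)
        gen.predict([np.random.normal(size=(2, latent_dim)), np.array([[3], [7]])])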

    opened by shi18 5
  • Update README.md

    Your project is featured on kandi. kandi kits help developers shortlist reusable libraries and code snippets for specific topics or use cases. Approve your kandi badge to help more developers discover and adopt your project easily. Thanks!

    opened by NaveenDev5 0
  • Project dependencies may have API risk issues

    Hi, in Keras-GAN, inappropriate dependency versioning constraints can cause risks.

    Below are the dependencies and version constraints that the project is using

    keras*
    git+https://www.github.com/keras-team/keras-contrib.git
    matplotlib*
    numpy*
    scipy*
    pillow*
    scikit-image*
    

    The version constraint == introduces a risk of dependency conflicts because the dependency scope is too strict, while "no upper bound" and * constraints introduce a risk of missing-API errors because the latest versions of the dependencies may remove some APIs.

    After further analysis, in this project the version constraint of the keras dependency can be changed to >=0.2.0,<=2.3.1, and the version constraint of the scipy dependency can be changed to >=0.9.0,<=1.7.3.

    The above suggestions reduce dependency conflicts as far as possible while allowing the latest versions to be used without triggering API errors in the project.

    The invocation of the current project includes all the following methods.

    The calling methods from the keras
    cifar10.load_data
    mnist.load_data
    
    The calling methods from the scipy
    scipy.ndimage.interpolation.rotate
    
    The calling methods from the all methods
    model.add
    test_accs.append
    PixelDA
    self.discriminator.compile
    y_train.reshape
    self.g_AB
    self.build_disk_and_q_net
    strides.filters.Conv2D
    plt.figure
    self.build_critic
    os.path.exists
    self.sample_generator_input
    np.random.random
    Flatten
    imgs.append
    self.setup_mnistm
    range
    batch_size.self.num_classes.np.random.randint.reshape
    DiscoGAN
    self.G_BA.predict
    ACGAN
    self.D_A.compile
    self.critic.compile
    _x2._x1._y2._y1.masked_img.copy
    path.scipy.misc.imread.astype
    context_encoder.train
    y_train.flatten
    X_B.reshape
    CycleGAN
    self.build_vgg
    vgg
    Pix2Pix
    list
    open
    self.critic_model.train_on_batch
    self.d_A.compile
    len
    self.encoder
    l.set_weights
    np.arange
    enumerate
    dropout_rate.Dropout
    model
    cifar10.load_data
    fig.savefig
    self.generator_model.compile
    infogan.train
    K.gradients
    sgan.train
    d_block
    CGAN
    plt.imshow
    self.build_encoder
    np.argmax
    f.write
    self.G_AB
    scipy.misc.imread
    imresize
    i.imgs.copy
    self.sample_images
    self.generator
    Sequential
    np.random.randint
    np.random.choice
    ccgan.train
    Input
    self.img_shape.np.prod.self.num_classes.Embedding
    RandomWeightedAverage
    COGAN
    self.sample_interval
    self.d_B.train_on_batch
    K.log
    batch_size.np.random.randint.reshape
    acgan.train
    filters.Conv2D
    to_categorical
    self.latent_dim.Dense
    np.expand_dims
    self.discriminator
    os.unlink
    K.square
    WGANGP
    options.open.write
    clf_layer
    K.sqrt
    save
    K.random_normal
    self.g_AB.predict
    self.mnist_y.copy
    bigan.train
    self.G_BA
    LeakyReLU
    dcgan.train
    zip_f.read
    img.copy
    self.data_loader.load_data
    self.d2
    datetime.datetime.now
    self.clf
    data.read
    self.build_generators
    self.d1.compile
    images.astype
    self.build_discriminator
    self.img_shape.np.prod.Dense
    DCGAN
    self.g_BA.predict
    np.random.normal
    self.D_A.train_on_batch
    SRGAN
    l.get_weights
    self.d_B.compile
    self.critic
    test_accs.pop
    np.repeat
    self.generator.predict
    np.add
    mnist.load_data
    j.i.axs.imshow
    plt.subplots
    self.d2.compile
    VGG19
    min
    self.num_classes.Dense
    self.vgg.compile
    self.g2
    col.row.axs.set_title
    self.g1.predict
    np.full
    Model
    scipy.misc.imresize
    self.adversarial_autoencoder.train_on_batch
    self.d_B
    self.save_model
    cgan.train
    concatenate
    ContextEncoder
    glob
    DUALGAN
    bgan.train
    np.clip
    self.df.Dense
    Dense
    gzip.GzipFile
    self.combined.compile
    np.vstack
    K.shape
    self.build_classifier
    gan.train
    self.critic.train_on_batch
    DataLoader
    self.d_A
    zip
    np.array
    i.axs.axis
    self.d2.train_on_batch
    f_size.filters.Conv2D
    print
    plt.close
    self.save_imgs
    deconv2d
    self.g2.predict
    self.build_decoder
    self.mask_randomly
    float
    self.bigan_generator.compile
    Dropout
    os.makedirs
    np.mean
    filepath.replace
    Add
    X_train.astype
    self.decoder.predict
    Concatenate
    K.exp
    self.bigan_generator.train_on_batch
    self.latent_dim.self.num_classes.Embedding
    self.D_A
    RMSprop
    urllib.request.urlopen
    AdversarialAutoencoder
    i.j.axs.imshow
    self.g1
    model.summary
    self.decoder
    np.empty
    j.i.axs.set_title
    UpSampling2D
    self.d1
    self.generator_model.train_on_batch
    SGAN
    self.channels.Conv2D
    self.vgg
    CCGAN
    K.random_uniform
    self.img_shape.Reshape
    self.build_generator
    imgs_B.append
    i.j.axs.axis
    Embedding
    aae.train
    Adam
    self.imread
    multiply
    self.d1.train_on_batch
    np.save
    model.to_json
    self.discriminator.train_on_batch
    imgs_lr.append
    j.i.axs.axis
    pickle.load
    np.concatenate
    np.where
    np.zeros
    d_layer
    Activation
    self.d_A.train_on_batch
    WGAN
    self.clf.predict
    scipy.ndimage.interpolation.rotate
    self.g_BA
    gen_imgs.reshape
    self.D_B.train_on_batch
    partial
    ZeroPadding2D
    Conv2D
    self.setup_mnist
    self.G_AB.predict
    residual_block
    self.encoder.predict
    self.build_discriminators
    model.save_weights
    int
    i.axs.imshow
    GAN
    self.adversarial_autoencoder.compile
    BIGAN
    LSGAN
    np.empty_like
    self.normalize
    self.D_B
    imgs_A.append
    wgan.train
    np.prod
    conv2d
    self.D_B.compile
    X_A.reshape
    col.row.axs.imshow
    self.data_loader.load_batch
    K.mean
    np.load
    self.auxilliary
    col.row.axs.axis
    self.auxilliary.compile
    BatchNormalization
    imgs_hr.append
    BGAN
    np.arange.reshape
    out_f.write
    K.sum
    np.fliplr
    Reshape
    self.critic_model.compile
    np.ones
    self.combined.train_on_batch
    self.vgg.predict
    INFOGAN
    InstanceNormalization
    merge
    

    @woctezuma Could you please help me check this issue? May I open a pull request to fix it? Thank you very much.

    opened by PyDeps 0
  • Where can I find and download some pre-trained model?

    Hello: When training models with Python there are all kinds of errors, and too many packages have different versions, so training any model takes forever. I want to know where and how I can download some pre-trained Keras models, for example a CycleGAN model. Please advise. Thanks,

    opened by zydjohnHotmail 0
Owner
Erik Linder-Norén
ML engineer at Apple. Excited about machine learning, basketball and building things.