FreezeD: a Simple Baseline for Fine-tuning GANs

Update (2020/10/28)

Released checkpoints of StyleGAN fine-tuned on the cat and dog datasets.

Update (2020/04/06)

The current code evaluates FID scores with the Inception network in inception.train() mode. Switching it to inception.eval() may degrade the overall scores (for both the competitors and our method, so the trend does not change). Thanks to @jychoi118 (Issue #3) for reporting this.
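
For reference, a minimal sketch (not the repository's evaluation script) of extracting Inception features for FID with the network explicitly in eval() mode; the torchvision model choice and the `extract_features` helper below are illustrative assumptions:

```python
import torch
from torchvision.models import inception_v3

@torch.no_grad()
def extract_features(images, device="cuda"):
    """Illustrative helper: 2048-d Inception pool features for FID.

    `images` is assumed to be a float tensor of shape (N, 3, 299, 299),
    already preprocessed for Inception.
    """
    inception = inception_v3(pretrained=True).to(device)
    inception.fc = torch.nn.Identity()  # keep pool features instead of class logits
    inception.eval()                    # the point of the fix: eval(), not train()
    return inception(images.to(device))
```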


Official code for "Freeze the Discriminator: a Simple Baseline for Fine-Tuning GANs" (CVPRW 2020).

The code is heavily based on the StyleGAN-pytorch and SNGAN-projection-chainer codebases.

See the stylegan and projection directories for the StyleGAN and SNGAN-projection experiments, respectively.

Note: There is a bug in PyTorch 1.4.0; use torch>=1.5.0 or torch<=1.3.0 instead. See Issue #1.
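
For readers skimming the paper, a minimal PyTorch sketch of the FreezeD idea: freeze the lower layers of a pre-trained discriminator and fine-tune only the upper ones. The `blocks` attribute and the `num_frozen` argument below are hypothetical and do not correspond to the actual model classes in this repository:

```python
import torch

def freeze_lower_layers(discriminator, num_frozen):
    """Freeze the first `num_frozen` blocks of a discriminator (hypothetical layout).

    `discriminator.blocks` is assumed to be an nn.ModuleList ordered from the
    input image towards the final prediction.
    """
    for i, block in enumerate(discriminator.blocks):
        trainable = i >= num_frozen
        for p in block.parameters():
            p.requires_grad = trainable

# Only the remaining trainable parameters are passed to the discriminator optimizer:
# d_optim = torch.optim.Adam(
#     (p for p in discriminator.parameters() if p.requires_grad),
#     lr=2e-4, betas=(0.0, 0.99),
# )
```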

Generated samples

Generated samples over the course of fine-tuning an FFHQ-pretrained StyleGAN
More generated samples (StyleGAN)

Generated samples on the Animal Face and Anime Face datasets

More generated samples (SNGAN-projection)

Comparison of fine-tuning (left) and FreezeD (right) on the Oxford Flower, CUB-200-2011, and Caltech-256 datasets

FreezeD generates more class-consistent results (see rows 2 and 8 of Oxford Flower)

Citation

If you use this code for your research, please cite our paper:

@inproceedings{mo2020freeze,
    title={Freeze the Discriminator: a Simple Baseline for Fine-Tuning GANs},
    author={Mo, Sangwoo and Cho, Minsu and Shin, Jinwoo},
    booktitle={CVPR AI for Content Creation Workshop},
    year={2020},
}
