Generative-Models
This is a repository of papers and code on different generative models.
GANs.
Original GAN: 'Generative Adversarial Networks' Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, Yoshua Bengio. 2014. [Arxiv].
DCGAN: 'Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks' Alec Radford, Luke Metz, Soumith Chintala. 2016. [Arxiv]. [Code].
Cost function improvement proposals:
- CGAN: 'Conditional Generative Adversarial Nets' Mehdi Mirza, Simon Osindero. 2014. [Arxiv]. [Code].
- ACGAN: 'Conditional Image Synthesis With Auxiliary Classifier GANs' Augustus Odena, Christopher Olah, Jonathon Shlens. 2016. [Arxiv]. [Code].
- InfoGAN: 'InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets' Xi Chen, Yan Duan, Rein Houthooft, John Schulman, Ilya Sutskever, Pieter Abbeel. 2016. [Arxiv]. [Code].
- LSGAN: 'Least Squares Generative Adversarial Networks' Xudong Mao, Qing Li, Haoran Xie, Raymond Y.K. Lau, Zhen Wang, Stephen Paul Smolley. 2017. [Arxiv]. [Code].
- WGAN: 'Wasserstein GAN' Martin Arjovsky, Soumith Chintala, Léon Bottou. 2017. [Arxiv]. [Code].
- WGAN-GP: 'Improved Training of Wasserstein GANs' Ishaan Gulrajani, Faruk Ahmed, Martin Arjovsky, Vincent Dumoulin, Aaron Courville. 2017. [Arxiv]. [Code].
- EBGAN: 'Energy-based Generative Adversarial Network' Junbo Zhao, Michael Mathieu, Yann LeCun. 2016. [Arxiv]. [Code].
- BEGAN: 'BEGAN: Boundary Equilibrium Generative Adversarial Networks' David Berthelot, Thomas Schumm, Luke Metz. 2017. [Arxiv]. [Code].
- RSGAN & RaSGAN: 'The relativistic discriminator: a key element missing from standard GAN' Alexia Jolicoeur-Martineau. 2018. [Arxiv]. [RaSGAN Code]. [RaLSGAN Code]. [RaSGAN-GP Code].
- DRAGAN: 'On Convergence and Stability of GANs' Naveen Kodali, Jacob Abernethy, James Hays, Zsolt Kira. 2017. [Arxiv]. [Code].
- Spectral GAN: 'Spectral Normalization for Generative Adversarial Networks' Takeru Miyato, Toshiki Kataoka, Masanori Koyama, Yuichi Yoshida. 2018. OpenReview. [Code].
- BigGAN: 'Large Scale GAN Training for High Fidelity Natural Image Synthesis' Andrew Brock, Jeff Donahue, Karen Simonyan. 2018. [Arxiv]. [Code].
Network changes proposals:
- StackGAN: 'StackGAN: Text to Photo-realistic Image Synthesis with Stacked Generative Adversarial Networks' Han Zhang, Tao Xu, Hongsheng Li, Shaoting Zhang, Xiaogang Wang, Xiaolei Huang, Dimitris Metaxas. 2016. [Arxiv]. [Code].
- SAGAN: 'Self-Attention Generative Adversarial Networks' Han Zhang, Ian Goodfellow, Dimitris Metaxas, Augustus Odena. 2018. [Arxiv]. [Code].
- ProGAN: 'Progressive Growing of GANs for Improved Quality, Stability, and Variation' Tero Karras, Timo Aila, Samuli Laine, Jaakko Lehtinen. 2018. [Arxiv]. [Code].
- Style-GAN: 'A Style-Based Generator Architecture for Generative Adversarial Networks' Tero Karras, Samuli Laine, Timo Aila. 2018. [Arxiv]. [Code].
Applications:
- CycleGAN: 'Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks' Jun-Yan Zhu, Taesung Park, Phillip Isola, Alexei A. Efros. 2017. [Arxiv]. [Code].
- SRGAN: 'Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network' Christian Ledig, Lucas Theis, Ferenc Huszar, Jose Caballero, Andrew Cunningham, Alejandro Acosta, Andrew Aitken, Alykhan Tejani, Johannes Totz, Zehan Wang, Wenzhe Shi. 2016. [Arxiv]. [Code].
- ESRGAN: 'ESRGAN: Enhanced Super-Resolution Generative Adversarial Networks' Xintao Wang, Ke Yu, Shixiang Wu, Jinjin Gu, Yihao Liu, Chao Dong, Chen Change Loy, Yu Qiao, Xiaoou Tang. 2018. [Arxiv]. [Code].
- GAWWN: 'Learning What and Where to Draw' Scott Reed, Zeynep Akata, Santosh Mohan, Samuel Tenka, Bernt Schiele, Honglak Lee. 2016. [Arxiv]. [Code].
GAN Training & Studies:
- 'Improved Techniques for Training GANs' Tim Salimans, Ian Goodfellow, Wojciech Zaremba, Vicki Cheung, Alec Radford, Xi Chen. 2016. [Arxiv].
- 'Are GANs Created Equal? A Large-Scale Study' Mario Lucic, Karol Kurach, Marcin Michalski, Sylvain Gelly, Olivier Bousquet. 2017. [Arxiv].
- 'Virtual Adversarial Training: A Regularization Method for Supervised and Semi-Supervised Learning' Takeru Miyato, Shin-ichi Maeda, Masanori Koyama, Shin Ishii. 2017. [Arxiv].
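All of the GAN papers above build on the minimax objective of the original paper: the discriminator maximizes E[log D(x)] + E[log(1 - D(G(z)))], while the generator in practice minimizes the non-saturating loss -log D(G(z)). A minimal NumPy sketch of these two loss terms on toy probabilities (function names here are illustrative, not taken from any paper's code):

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    """Binary cross-entropy form of the original GAN discriminator loss.
    d_real, d_fake are discriminator output probabilities in (0, 1)."""
    return -np.mean(np.log(d_real)) - np.mean(np.log(1.0 - d_fake))

def generator_loss_nonsaturating(d_fake):
    """Non-saturating generator loss -log D(G(z)), recommended in the
    original GAN paper for stronger gradients early in training."""
    return -np.mean(np.log(d_fake))

# Toy example: a confident discriminator scores real near 1 and fake near 0,
# so its own loss is low while the generator's loss is high.
d_real = np.array([0.9, 0.95, 0.99])
d_fake = np.array([0.05, 0.1, 0.02])
print(discriminator_loss(d_real, d_fake))
print(generator_loss_nonsaturating(d_fake))
```

Several of the listed papers (LSGAN, WGAN, relativistic GANs) can be read as swapping exactly these two functions while keeping the adversarial training loop unchanged.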
VAEs.
- VAE: 'Auto-Encoding Variational Bayes' Diederik P. Kingma, Max Welling. 2013. [Arxiv].
- Wasserstein VAE: 'Wasserstein Auto-Encoders' Ilya Tolstikhin, Olivier Bousquet, Sylvain Gelly, Bernhard Schoelkopf. 2018. [Arxiv]. [Code].
- CVAE: 'Learning Structured Output Representation using Deep Conditional Generative Models' Kihyuk Sohn, Honglak Lee, Xinchen Yan. 2015. [Arxiv]. [Code].
- VAE-GAN: 'Autoencoding beyond pixels using a learned similarity metric' Anders Boesen Lindbo Larsen, Søren Kaae Sønderby, Hugo Larochelle, Ole Winther. 2015. [Arxiv].
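The VAE variants above all maximize the ELBO: a reconstruction log-likelihood minus a KL divergence from the approximate posterior to the prior. For a diagonal-Gaussian posterior and standard-normal prior the KL term has a closed form, sketched here in NumPy (function names are illustrative):

```python
import numpy as np

def gaussian_kl(mu, log_var):
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ), summed over latent dims.
    This is the closed-form regularizer in the original VAE objective."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def bernoulli_recon_loglik(x, x_hat, eps=1e-7):
    """Bernoulli log-likelihood of binary data x under decoder output x_hat."""
    x_hat = np.clip(x_hat, eps, 1 - eps)
    return np.sum(x * np.log(x_hat) + (1 - x) * np.log(1 - x_hat), axis=-1)

# ELBO for a single toy data point: reconstruction term minus KL term.
mu, log_var = np.array([0.5, -0.2]), np.array([0.1, -0.3])
x, x_hat = np.array([1.0, 0.0]), np.array([0.9, 0.2])
elbo = bernoulli_recon_loglik(x, x_hat) - gaussian_kl(mu, log_var)
print(elbo)
```

The KL is zero exactly when the posterior matches the prior (mu = 0, log_var = 0), which is why it acts as a regularizer pulling the latent codes toward N(0, I).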
Autoregressive Models.
- NADE: 'Neural Autoregressive Distribution Estimation' Benigno Uria, Marc-Alexandre Côté, Karol Gregor, Iain Murray, Hugo Larochelle. 2016. [Arxiv].
- RNADE: 'RNADE: The real-valued neural autoregressive density-estimator' Benigno Uria, Iain Murray, Hugo Larochelle. 2013. [Arxiv].
- MADE: 'MADE: Masked Autoencoder for Distribution Estimation' Mathieu Germain, Karol Gregor, Iain Murray, Hugo Larochelle. 2015. [Arxiv].
- Pixel RNN: 'Pixel Recurrent Neural Networks' Aaron van den Oord, Nal Kalchbrenner, Koray Kavukcuoglu. 2016. [Arxiv].
- PixelCNN: 'Conditional Image Generation with PixelCNN Decoders' Aaron van den Oord, Nal Kalchbrenner, Koray Kavukcuoglu. 2016. [Arxiv].
- PixelCNN++: 'PixelCNN++: Improving the PixelCNN with Discretized Logistic Mixture Likelihood and Other Modifications' Tim Salimans, Andrej Karpathy, Xi Chen, Diederik P. Kingma. 2017. [Arxiv].
- WaveNet: 'WaveNet: A Generative Model for Raw Audio' Aaron van den Oord, Sander Dieleman, Heiga Zen, Karen Simonyan, Oriol Vinyals, Alex Graves, Nal Kalchbrenner, Andrew Senior, Koray Kavukcuoglu. 2016. [Arxiv].
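MADE and the PixelRNN/PixelCNN/WaveNet family all enforce the chain-rule factorization p(x) = ∏ p(x_i | x_<i) by masking network connections. A NumPy sketch of MADE-style mask construction for a single hidden layer (a simplified illustration, not the paper's code):

```python
import numpy as np

def made_masks(d_in, d_hidden, rng):
    """Build MADE-style binary masks so that output i depends only on
    inputs with index < i (single hidden layer, illustrative)."""
    m_in = np.arange(1, d_in + 1)                  # input degrees 1..D
    m_hid = rng.integers(1, d_in, size=d_hidden)   # hidden degrees 1..D-1
    # Hidden unit h may see input j only if its degree is >= input j's degree.
    mask1 = (m_hid[:, None] >= m_in[None, :]).astype(float)  # hidden x input
    # Output i may see hidden h only if its degree strictly exceeds h's.
    mask2 = (m_in[:, None] > m_hid[None, :]).astype(float)   # output x hidden
    return mask1, mask2

rng = np.random.default_rng(0)
mask1, mask2 = made_masks(4, 8, rng)
# The product of the masks gives output-to-input connectivity; it must be
# strictly lower-triangular for a valid autoregressive factorization.
conn = mask2 @ mask1
print(np.triu(conn))  # upper triangle including the diagonal is all zero
```

PixelCNN achieves the same property spatially with masked convolutions instead of masked fully connected layers.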
Normalizing Flows.
- Glow: 'Glow: Generative Flow with Invertible 1x1 Convolutions' Diederik P. Kingma, Prafulla Dhariwal. 2018. [Arxiv].
- NICE: 'NICE: Non-linear Independent Components Estimation' Laurent Dinh, David Krueger, Yoshua Bengio. 2014. [Arxiv].
- Real NVP: 'Density estimation using Real NVP' Laurent Dinh, Jascha Sohl-Dickstein, Samy Bengio. 2016. [Arxiv].
- VAE and Normalizing Flows: 'Variational Inference with Normalizing Flows' Danilo Jimenez Rezende, Shakir Mohamed. 2015. [Arxiv].
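Every flow above is an application of the change-of-variables formula: for an invertible map z = f(x) with base density p_z, log p(x) = log p_z(f(x)) + log|det ∂f/∂x|. A minimal NumPy sketch for a single elementwise affine transform, the building block that NICE and Real NVP stack inside coupling layers (names are illustrative):

```python
import numpy as np

def affine_flow_logprob(x, scale, shift):
    """Log-density of x under z = (x - shift) / scale with a standard-normal
    base distribution, via the change-of-variables formula."""
    z = (x - shift) / scale
    # log p_z(z) for a standard normal, summed over dimensions.
    log_base = -0.5 * np.sum(z**2 + np.log(2 * np.pi), axis=-1)
    # The Jacobian dz/dx is diagonal with entries 1/scale.
    log_det = -np.sum(np.log(np.abs(scale)))
    return log_base + log_det

x = np.array([1.0, -0.5])
scale = np.array([2.0, 0.5])
shift = np.array([0.3, -0.1])
print(affine_flow_logprob(x, scale, shift))
```

The engineering in these papers is mostly about keeping that log-determinant cheap: coupling layers (NICE, Real NVP) and invertible 1x1 convolutions (Glow) both yield triangular or easily factorized Jacobians.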
Evaluation of Generative Models.
- Inception Score: 'Improved Techniques for Training GANs' Tim Salimans, Ian Goodfellow, Wojciech Zaremba, Vicki Cheung, Alec Radford, Xi Chen. 2016. [Arxiv].
- Fréchet Inception Distance: 'GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium' Martin Heusel, Hubert Ramsauer, Thomas Unterthiner, Bernhard Nessler, Sepp Hochreiter. 2017. [Arxiv].
- 'A note on the evaluation of generative models' Lucas Theis, Aäron van den Oord, Matthias Bethge. 2015. [Arxiv].
- 'Stanford CS236: Deep Generative Models: Evaluating Generative Models' [PDF].
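The Inception Score from the first paper above is simple to state: the exponential of the average KL divergence between the classifier's conditional label distribution p(y|x) and its marginal p(y). A NumPy sketch computing it from raw probability arrays (in practice p(y|x) comes from an Inception network; here it is just a toy input):

```python
import numpy as np

def inception_score(p_yx, eps=1e-12):
    """IS = exp( E_x[ KL( p(y|x) || p(y) ) ] ), from an
    (n_samples, n_classes) array of classifier probabilities."""
    p_y = p_yx.mean(axis=0)  # marginal label distribution over all samples
    kl = np.sum(p_yx * (np.log(p_yx + eps) - np.log(p_y + eps)), axis=1)
    return float(np.exp(kl.mean()))

# Best case: confident predictions spread evenly across classes, so the
# score approaches the number of classes.
print(inception_score(np.eye(4)))  # ~4.0
```

This also illustrates the metric's known blind spots (discussed in the Theis et al. note): a model that memorizes one perfect image per class would score the maximum despite having no intra-class diversity.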

