A full listing of our articles covering GAN applications, common issues, and their solutions.
GAN — A comprehensive review into the gangsters of GANs (Part 1)
Are we there yet? In this GAN series, we examine how GAN is applied to deep learning problems and look into its…
GAN — A comprehensive review into the gangsters of GANs (Part 2)
This article studies the motivation and the direction of the GAN research in improving GANs. By reviewing them in a…
GAN — What’s a GAN (Generative Adversarial Network)?
To create something from nothing is one of the greatest feelings, … It’s heaven.
GAN — Some cool applications of GANs.
In this section, we sample some GAN applications and display some of their results.
GAN — CycleGAN
Let’s say we collect three sets of pictures: one for real pictures, one for Monet paintings and the last one for Van…
GAN — Super Resolution GAN (SRGAN)
Super-resolution GAN applies a deep network in combination with an adversarial network to produce higher resolution…
GAN — Why it is so hard to train Generative Adversarial Networks!
It is easier to recognize a Monet painting than to draw one. Generative models (creating data) are considered much…
General GAN Improvements
GAN — Ways to improve GAN performance
GAN models can suffer badly in the following areas compared with other deep networks.
Improving the Network design
GAN — DCGAN (deep convolutional generative adversarial networks)
Replace all max pooling with strided convolutions.
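As a minimal sketch (plain NumPy, not the DCGAN authors' code): a 2×2 max pool with stride 2 and a stride-2 convolution both halve the spatial resolution, but the convolution's kernel is learnable. The averaging kernel below is only a stand-in for learned weights.

```python
import numpy as np

def maxpool2x2(x):
    """2x2 max pooling with stride 2 on a (H, W) feature map."""
    H, W = x.shape
    return x.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

def strided_conv(x, kernel, stride=2):
    """Naive 'valid' 2D convolution with the given stride."""
    H, W = x.shape
    k = kernel.shape[0]
    out_h = (H - k) // stride + 1
    out_w = (W - k) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = x[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = np.sum(patch * kernel)
    return out

x = np.arange(64, dtype=float).reshape(8, 8)
pooled = maxpool2x2(x)                       # fixed, non-learnable downsampling
conv = strided_conv(x, np.ones((2, 2)) / 4)  # same downsampling, but the kernel could be learned
print(pooled.shape, conv.shape)              # both (4, 4)
```

Both outputs are 4×4, which is why the swap is a drop-in change in the network design.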
GAN — CGAN & InfoGAN (using labels to improve GAN)
In discriminative models, like classification, we may handcraft features to make models perform better. This…
GAN — Stacked Generative Adversarial Networks (SGAN)
A Stacked Generative Adversarial Network (SGAN) is composed of
GAN — Progressive growing of GANs
The progressive growing of GANs trains the GAN network in multiple phases. In the first phase, it takes in a latent…
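A toy sketch of the two ideas the article describes, under the assumption of the original paper's 4×4 starting resolution: the resolution doubles each phase, and a new higher-resolution layer is blended in with a fade-in weight alpha. Function names here are illustrative, not from any library.

```python
# Resolution schedule for progressive growing: start at 4x4 and double
# each phase until the target resolution (1024x1024 in the original paper).
def resolution_schedule(start=4, target=1024):
    res = start
    schedule = [res]
    while res < target:
        res *= 2
        schedule.append(res)
    return schedule

# During a transition phase, the new higher-resolution output is faded in
# with a weight alpha that grows from 0 to 1, blended with the upsampled
# output of the previous (lower-resolution) phase.
def faded_output(low_res_upsampled, high_res, alpha):
    return (1.0 - alpha) * low_res_upsampled + alpha * high_res

print(resolution_schedule())  # [4, 8, 16, 32, 64, 128, 256, 512, 1024]
```

At alpha = 0 the network behaves exactly like the previous phase; at alpha = 1 the new layer has fully taken over.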
GAN — Spectral Normalization
GAN is vulnerable to mode collapse and training instability. In this article, we look into WGAN and Spectral…
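Spectral normalization divides a weight matrix by its largest singular value, estimated cheaply by power iteration. A minimal NumPy sketch of that idea (real frameworks do this per training step with a single cached iteration; this standalone version just iterates to convergence):

```python
import numpy as np

def spectral_normalize(W, n_iter=50):
    """Divide W by an estimate of its largest singular value (power iteration)."""
    rng = np.random.default_rng(0)
    u = rng.standard_normal(W.shape[0])
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v       # estimated largest singular value
    return W / sigma

W = np.random.default_rng(1).standard_normal((16, 8))
W_sn = spectral_normalize(W)
print(np.linalg.norm(W_sn, 2))  # largest singular value is now ~1.0
```

Constraining every layer's spectral norm to 1 bounds the discriminator's Lipschitz constant, which is what stabilizes training.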
GAN — StyleGAN & StyleGAN2
Do you know your style? Most GAN models don’t. In the vanilla GAN, we generate an image from a latent factor z.
Improving the cost function
GAN — LSGAN (How to be a good helper?)
The GAN discriminator distinguishes real images from generated ones. It usually performs reasonably well. But as a…
GAN — Wasserstein GAN & WGAN-GP
Training GAN is known to be hard. We regularly deal with performance, model stability, model convergence and mode…
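The WGAN-GP gradient penalty samples points on straight lines between real and fake samples and pushes the critic's gradient norm there toward 1. A toy NumPy sketch, assuming a linear critic f(x) = w·x so the gradient is simply w and no autograd is needed (real implementations differentiate through the critic):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(4)              # toy linear critic weights
x_real = rng.standard_normal((8, 4))    # batch of "real" samples
x_fake = rng.standard_normal((8, 4))    # batch of "generated" samples

# Interpolate between real and fake samples with random epsilon per sample.
eps = rng.uniform(size=(8, 1))
x_hat = eps * x_real + (1 - eps) * x_fake

# For f(x) = w . x the gradient at every x_hat is w, so the penalty is:
grad = np.tile(w, (8, 1))
penalty = np.mean((np.linalg.norm(grad, axis=1) - 1.0) ** 2)
print(penalty)  # a non-negative scalar added to the critic loss
```

The penalty term is added to the critic loss with a coefficient (10 in the WGAN-GP paper), softly enforcing the 1-Lipschitz constraint instead of weight clipping.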
GAN — Energy based GAN (EBGAN) & Boundary Equilibrium GAN (BEGAN)
GAN is a minimax game in which opponents take action to undercut the other. This makes GAN much harder to converge, if…
GAN — RSGAN & RaGAN (A new generation of cost function.)
How can we train GAN faster with better image quality? We have many new cost functions already. But Relativistic GAN…
GAN — What is wrong with the GAN cost function?
We work hard to produce mathematical models for deep learning. But often, we are not successful and fall back to the…
GAN — Does LSGAN, WGAN, WGAN-GP or BEGAN matter?
A 2017 Google Brain paper “Are GANs Created Equal?” claims that
Proof (GAN optimal point)
We have mentioned that optimizing GAN is equivalent to optimizing the JS-divergence. This is not obvious from the cost function:
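For reference, the minimax objective in question and the optimum it reaches (both results from the original GAN paper) can be written as:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{data}}[\log D(x)] +
  \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]
```

With the optimal discriminator, the generator's objective reduces to a JS-divergence plus a constant:

```latex
D^{*}(x) = \frac{p_{data}(x)}{p_{data}(x) + p_g(x)}, \qquad
V(D^{*}, G) = -\log 4 + 2\,\mathrm{JSD}(p_{data} \,\|\, p_g)
```

so minimizing over G at the optimal discriminator is exactly minimizing the JS-divergence between the data and generator distributions.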
GAN — DRAGAN
Mode collapse is one major concern in training GAN. In the figure below, we use a BEGAN to generate faces. For each…