Improved GANs
Since the introduction of Generative Adversarial Networks (GANs) in 2014 [1], their popularity has rapidly increased. GANs have proven to be a useful generative model that can synthesize new data that looks real. Much of the deep learning research that followed proposed measures to address the difficulties and limitations of the original GAN.
As we discussed in previous chapters, GANs can be notoriously difficult to train and are prone to mode collapse. Mode collapse is a situation in which the generator produces outputs that all look alike, even though the loss functions have already been optimized. In the context of MNIST digits, with mode collapse the generator may only produce the digits 4 and 9, since they look similar. The Wasserstein GAN (WGAN) [2] addressed these problems by arguing that training instability and mode collapse can be avoided by simply replacing the GAN loss function with one based on the Wasserstein 1 metric, also known as the Earth Mover's Distance (EMD)...
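As a rough illustration of the idea (a minimal sketch, not the complete WGAN training procedure), the Wasserstein loss can be expressed directly in terms of the critic's raw, unbounded scores instead of a cross-entropy over probabilities. The function name `wasserstein_loss` and the label convention (+1 for real, -1 for fake samples) are assumptions made for this example:

```python
from tensorflow.keras import backend as K

def wasserstein_loss(y_label, y_pred):
    # The critic outputs an unbounded score rather than a probability,
    # so no sigmoid or cross-entropy is applied.
    # With y_label = +1 for real samples and y_label = -1 for fake samples,
    # minimizing this loss pushes the critic to score real data higher
    # than generated data; the generator minimizes the negated fake score.
    return -K.mean(y_label * y_pred)
```

Note that this loss alone is not sufficient: WGAN also constrains the critic to be (approximately) 1-Lipschitz, for example by clipping its weights to a small range, so that the estimated score difference remains a valid approximation of the EMD.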