GAN – code example
In the following example, we build and train a GAN model on the MNIST dataset using TensorFlow. Here, we will use a variant of the ReLU activation function known as Leaky ReLU. The output of the trained generator is a set of newly generated handwritten digits:
Note
Leaky ReLU is a variation of the ReLU activation function given by the formula f(x) = max(α∗x, x), where α is a small positive slope. So the output for a negative value of x is α∗x, and the output for a positive x is x.
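As a quick illustration, Leaky ReLU is a one-liner in TensorFlow; this is a minimal sketch (recent TF 1.x versions also ship tf.nn.leaky_relu, and the slope value 0.01 here is just an example):
def leaky_relu(x, alpha=0.01):
    # f(x) = max(alpha * x, x): positive values pass through unchanged,
    # negative values are scaled by the small slope alpha
    return tf.maximum(alpha * x, x)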
#import all necessary libraries and load the data set
%matplotlib inline
import pickle as pkl
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# MNIST loader from the TF 1.x tutorials package (removed in TensorFlow 2.x)
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets('MNIST_data')
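As a quick sanity check (not part of the original listing), we can confirm that the images load as flattened 784-dimensional vectors:
print(mnist.train.images.shape)  # (55000, 784): 55,000 images, 28x28 = 784 pixels each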
In order to build this network, we need two inputs, one for the generator and one for the discriminator. In the following code, we create the placeholder real_input for the discriminator and z_input for the generator, with input sizes dim_real and dim_z, respectively:
#place...
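A minimal sketch of this placeholder-creation step under the TF 1.x API might look as follows (the helper name model_inputs and the sizes 784 and 100 are illustrative assumptions, not from the original):
def model_inputs(dim_real, dim_z):
    # Hypothetical helper: placeholder for flattened real MNIST images
    # fed to the discriminator
    real_input = tf.placeholder(tf.float32, (None, dim_real), name='real_input')
    # Placeholder for the random noise vector fed to the generator
    z_input = tf.placeholder(tf.float32, (None, dim_z), name='z_input')
    return real_input, z_input

# Illustrative sizes: 28x28 = 784 pixels per image, 100-dimensional noise
real_input, z_input = model_inputs(dim_real=784, dim_z=100)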