Conditional VAE (CVAE)
Conditional VAE (CVAE) [2] is similar in concept to CGAN. In the context of the MNIST dataset, if the latent space is randomly sampled, the VAE has no control over which digit will be generated. CVAE addresses this problem by including a condition, the one-hot label of the digit to produce. The condition is imposed on both the encoder and decoder inputs.
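To make the conditioning concrete, here is a minimal sketch of a CVAE encoder and decoder for MNIST. It is not Listing 8.2.1 (which uses CNN layers); the dense layers, layer sizes, variable names, and two-dimensional latent space are illustrative assumptions. The one-hot label c is simply concatenated with the flattened image at the encoder input and with the latent vector at the decoder input:

```python
# Minimal CVAE sketch (illustrative, not the book's listing): dense layers for brevity,
# with the one-hot condition c concatenated to both the encoder and decoder inputs.
from tensorflow.keras import layers, Model

image_size = 28
original_dim = image_size * image_size
latent_dim = 2
num_labels = 10

# Encoder inputs: the flattened image and the one-hot condition c
x_in = layers.Input(shape=(original_dim,), name="image")
c_in = layers.Input(shape=(num_labels,), name="condition")
h = layers.concatenate([x_in, c_in])            # condition imposed on the encoder input
h = layers.Dense(512, activation="relu")(h)
z_mean = layers.Dense(latent_dim, name="z_mean")(h)
z_log_var = layers.Dense(latent_dim, name="z_log_var")(h)
encoder = Model([x_in, c_in], [z_mean, z_log_var], name="encoder")

# Decoder inputs: the latent vector and the same one-hot condition c
z_in = layers.Input(shape=(latent_dim,), name="z")
d = layers.concatenate([z_in, c_in])            # condition imposed on the decoder input
d = layers.Dense(512, activation="relu")(d)
x_out = layers.Dense(original_dim, activation="sigmoid")(d)
decoder = Model([z_in, c_in], x_out, name="decoder")
```

At generation time, we sample $z \sim \mathcal{N}(0, I)$ and fix c to the desired label, so the decoder produces the digit we ask for.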
Formally, the core equation of the VAE in Equation 8.1.10 is modified to include the condition $c$:

$$\log P(x|c) - D_{KL}\big(Q(z|x,c) \,\|\, P(z|x,c)\big) = \mathbb{E}_{z \sim Q}\big[\log P(x|z,c)\big] - D_{KL}\big(Q(z|x,c) \,\|\, P(z|c)\big)$$ (Equation 8.2.1)
Similar to VAEs, Equation 8.2.1 means that if we want to maximize the output conditioned on $c$, $P(x|c)$, then the two loss terms must be minimized:
- Reconstruction loss of the decoder given both the latent vector and the condition, $-\mathbb{E}_{z \sim Q}\big[\log P(x|z,c)\big]$.
- KL loss between the encoder distribution given both the input and the condition, $Q(z|x,c)$, and the prior distribution given the condition, $P(z|c)$. Similar to a VAE, we typically choose $P(z|c) = P(z) = \mathcal{N}(0, I)$. Both loss terms are sketched in code below.
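As a rough sketch of these two terms, assuming a Gaussian encoder that outputs z_mean and z_log_var (as in the sketch above) and the prior $\mathcal{N}(0, I)$: the conditioning enters only through the model inputs, so the loss expression itself is the same as for a plain VAE.

```python
# Sketch of the two CVAE loss terms in Equation 8.2.1 (names follow the sketch above).
from tensorflow.keras import backend as K
from tensorflow.keras.losses import binary_crossentropy

def cvae_loss(x_true, x_decoded, z_mean, z_log_var, original_dim):
    # Reconstruction term: -E[log P(x|z, c)], approximated by binary cross-entropy
    reconstruction_loss = binary_crossentropy(x_true, x_decoded) * original_dim
    # KL term: D_KL(Q(z|x, c) || P(z|c)) with a Gaussian encoder and P(z|c) = N(0, I)
    kl_loss = -0.5 * K.sum(1 + z_log_var - K.square(z_mean) - K.exp(z_log_var), axis=-1)
    return K.mean(reconstruction_loss + kl_loss)
```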
Listing 8.2.1, cvae-cnn-mnist-8.2.1.py, shows us the Keras code of the CVAE using CNN layers. In the code that is highlighted...