Next, we will start building the encoding module of our VAE. This part is almost identical to the shallow encoder we built in the last chapter, except that the network splits into two separate output layers: one estimating the mean and the other estimating the log variance of the latent distribution:
# Encoder module
# original_dim, intermediate_dim, and latent_dim are defined earlier in the chapter
from keras.layers import Input, Dense

input_layer = Input(shape=(original_dim,))
intermediate_layer = Dense(intermediate_dim, activation='relu', name='intermediate_layer')(input_layer)
z_mean = Dense(latent_dim, name='z_mean')(intermediate_layer)
z_log_var = Dense(latent_dim, name='z_log_var')(intermediate_layer)
You can optionally pass the name argument when defining a layer, which makes the model easier to inspect when it is visualized. If we want, we can actually visualize the network we have built so far by instantiating it as a model and summarizing it, as shown here:
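As a minimal sketch, wrapping the layers defined above into a temporary model and calling summary() might look like the following (the model name encoder_preview is illustrative, not part of the final VAE):

from keras.models import Model

# Wrap the layers defined so far into a temporary model so we can inspect it
encoder_preview = Model(input_layer, [z_mean, z_log_var], name='encoder_preview')
encoder_preview.summary()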
Note...