DBNs with two RBM layers
In this section, we will create a DBN with two RBM layers and run it on the MNIST dataset. To do so, we will modify the input parameters of the DeepBeliefNetwork(..) class as follows:
name = 'dbn'
rbm_layers = [256, 256]
do_pretrain = True
rbm_learning_rate = [0.001, 0.001]
rbm_num_epochs = [5, 5]
rbm_gibbs_k = [1, 1]
rbm_stddev = 0.1
rbm_gauss_visible = False
momentum = 0.5
rbm_batch_size = [32, 32]
finetune_learning_rate = 0.01
finetune_num_epochs = 1
finetune_batch_size = 32
finetune_opt = 'momentum'
finetune_loss_func = 'softmax_cross_entropy'
finetune_dropout = 1
finetune_act_func = tf.nn.sigmoid
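Passing these values to the class might look like the following. This is a minimal sketch assuming a yadlt-style DeepBeliefNetwork class; the import path is an assumption, so adjust it to wherever the class lives in your copy of the code:

import tensorflow as tf

# Import path is an assumption -- point it at your copy of the class.
from yadlt.models.boltzmann.dbn import DeepBeliefNetwork

dbn = DeepBeliefNetwork(
    name='dbn',
    rbm_layers=[256, 256],               # two stacked RBM layers
    do_pretrain=True,                    # enable greedy layer-wise pretraining
    rbm_learning_rate=[0.001, 0.001],
    rbm_num_epochs=[5, 5],
    rbm_gibbs_k=[1, 1],
    rbm_stddev=0.1,
    rbm_gauss_visible=False,
    momentum=0.5,
    rbm_batch_size=[32, 32],
    finetune_learning_rate=0.01,
    finetune_num_epochs=1,
    finetune_batch_size=32,
    finetune_opt='momentum',
    finetune_loss_func='softmax_cross_entropy',
    finetune_dropout=1,
    finetune_act_func=tf.nn.sigmoid)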
Notice that some of the parameters are arrays with two elements, so we need to specify a value for each of the two RBM layers:
rbm_layers = [256, 256]: Number of neurons in each RBM layer
rbm_learning_rate = [0.001, 0.001]: Learning rate for each RBM layer
rbm_num_epochs = [5, 5]: Number of epochs for each RBM layer
rbm_batch_size = [32, 32]: Batch size for each RBM layer
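With the model defined, pretraining and fine-tuning on MNIST might look like the sketch below. It assumes yadlt's pretrain/fit/score methods and the TensorFlow 1.x MNIST helper, so adapt the data loading to your setup:

from tensorflow.examples.tutorials.mnist import input_data

# Load MNIST: images come flattened to 784-dim vectors, labels one-hot encoded.
mnist = input_data.read_data_sets('MNIST_data/', one_hot=True)
trX, trY = mnist.train.images, mnist.train.labels
vlX, vlY = mnist.validation.images, mnist.validation.labels
teX, teY = mnist.test.images, mnist.test.labels

# Greedy layer-wise pretraining of the two RBMs (runs because do_pretrain=True).
dbn.pretrain(trX, vlX)

# Supervised fine-tuning of the whole network with the softmax output layer.
dbn.fit(trX, trY, vlX, vlY)

# Accuracy on the held-out test set.
print('Test set accuracy: {}'.format(dbn.score(teX, teY)))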
Let's look at the...