Effect of the number of neurons in an RBM layer in a DBN
Let's look at how changing the number of neurons in an RBM layer affects the accuracy on the test set.
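The runs below come from the yadlt-based DBN (note the yadlt log directory in the output). As a minimal, self-contained sketch of the same kind of layer-size sweep, the following uses scikit-learn's BernoulliRBM feeding a logistic-regression classifier instead of the yadlt API; the dataset, hyperparameters, and layer sizes here are illustrative assumptions, not the book's exact configuration:

# A minimal sketch (not the yadlt code used for the runs below): compare how
# the number of hidden units in a single RBM layer affects test set accuracy.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel intensities to [0, 1] for the Bernoulli RBM
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for n_hidden in (128, 256, 512):  # RBM layer sizes to compare
    model = Pipeline([
        ("rbm", BernoulliRBM(n_components=n_hidden, learning_rate=0.05,
                             n_iter=20, random_state=0)),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    model.fit(X_train, y_train)
    print(n_hidden, "hidden units -> test accuracy:",
          model.score(X_test, y_test))

As in the experiments that follow, a larger hidden layer is not guaranteed to score higher: the sweep simply reports the accuracy for each size so they can be compared.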
An RBM layer with 512 neurons
The following is the output of a DBN with 512 neurons in its RBM layer. The reconstruction loss has decreased, but so has the test set accuracy:
Reconstruction loss: 0.128517: 100%|██████████| 5/5 [01:32<00:00, 19.25s/it]
Start deep belief net finetuning...
Tensorboard logs dir for this run is /home/ubuntu/.yadlt/logs/run55
Accuracy: 0.0758: 100%|██████████| 1/1 [00:06<00:00, 6.40s/it]
Test set accuracy: 0.0689999982715
Notice that both the fine-tuning accuracy and the test set accuracy have decreased. This shows that increasing the number of neurons in a layer does not necessarily improve accuracy.
An RBM layer with 128 neurons
An RBM layer with 128 neurons yields a higher test set accuracy, though the accuracy reported during fine-tuning is lower:
Reconstruction loss: 0.180337: 100%|██████████| 5/5 [00:32<00:00, 6.44s/it]
Start deep belief...