Restricted Boltzmann machines (RBMs) are a class of neural networks that fall under unsupervised learning techniques. An RBM tries to learn the hidden structure of the data by projecting the input data into a hidden layer.
The hidden layer activations are expected to encode the input signal so that it can be recreated. Restricted Boltzmann machines generally work on binary data:
Just to refresh our memory, the preceding diagram (Figure 6.6) is an RBM that has m inputs or visible units, projected to a hidden layer with n units. Given the visible layer inputs v = (v_1, ..., v_m), the hidden units are conditionally independent of each other and hence can be sampled as follows, where σ represents the sigmoid function:

P(h_j = 1 | v) = σ(c_j + Σ_i w_ij v_i)
Similarly, given the hidden layer activations h = (h_1, ..., h_n), the visible units are conditionally independent of each other and can be sampled as:

P(v_i = 1 | h) = σ(b_i + Σ_j w_ij h_j)
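The two conditional distributions above can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's implementation: the layer sizes, the random weight initialization, and the bias names (b for visible, c for hidden) are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

m, n = 6, 4                            # visible and hidden unit counts (illustrative)
W = rng.normal(0.0, 0.1, size=(m, n))  # weights w_ij (random init, assumed)
b = np.zeros(m)                        # visible biases b_i
c = np.zeros(n)                        # hidden biases c_j

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    # P(h_j = 1 | v) = sigmoid(c_j + sum_i v_i * w_ij); each h_j sampled independently
    p = sigmoid(c + v @ W)
    return (rng.random(n) < p).astype(float), p

def sample_visible(h):
    # P(v_i = 1 | h) = sigmoid(b_i + sum_j w_ij * h_j); each v_i sampled independently
    p = sigmoid(b + h @ W.T)
    return (rng.random(m) < p).astype(float), p

# One back-and-forth pass: project a binary input to the hidden layer,
# then recreate the visible layer from the hidden activations.
v0 = rng.integers(0, 2, size=m).astype(float)
h, _ = sample_hidden(v0)
v1, _ = sample_visible(h)
```

Because the units within a layer are conditionally independent given the other layer, each sampling step is a single vectorized operation rather than a loop over units.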