Restricted Boltzmann Machines: generating pixels with statistical mechanics
The neural network model that we will apply to the MNIST data has its origins in earlier research on how neurons in the mammalian brain might work together to transmit signals and encode patterns as memories. Using analogies from statistical mechanics in physics, this section will show you how simple networks can "learn" the distribution of image data and serve as building blocks for larger networks.
Hopfield networks and energy equations for neural networks
As we discussed in Chapter 3, Building Blocks of Deep Neural Networks, Hebbian learning states, "Neurons that fire together, wire together",8 and many models, including the multi-layer perceptron, made use of this idea to develop learning rules. One of these models was the Hopfield network, developed in the 1970s and 1980s by several researchers.9, 10 In this network, each "neuron" is connected to every other by...
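The fully connected structure described above can be sketched in a few lines of code. The following is a minimal illustration (an assumption for this section, not code from the book): a pattern of +1/-1 "pixels" is stored with the Hebbian outer-product rule, the network's energy is computed as E = -½ sᵀWs, and asynchronous updates drive a corrupted pattern back toward the stored memory.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: neurons that fire together, wire together.

    Weights are the average outer product of the stored +1/-1 patterns,
    with self-connections zeroed out. (Illustrative sketch only.)
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # Hopfield networks have no self-connections
    return W / len(patterns)

def energy(W, s):
    """Energy of state s; stored patterns sit in low-energy minima."""
    return -0.5 * s @ W @ s

def recall(W, s, steps=10):
    """Asynchronously flip each unit toward lower energy until settled."""
    s = s.copy()
    for _ in range(steps):
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

rng = np.random.default_rng(0)
pattern = rng.choice([-1, 1], size=16)  # a tiny 16-"pixel" memory
W = train_hopfield(pattern[None, :])

noisy = pattern.copy()
noisy[:3] *= -1                  # corrupt a few pixels
restored = recall(W, noisy)      # updates repair the corruption
print(np.array_equal(restored, pattern))
```

Note how the energy function gives the statistical-mechanics flavor mentioned above: the corrupted state has higher energy than the stored pattern, and each update step can only lower it, so recall is a descent into an energy well.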