Further reading
For a primer on neural networks, it makes sense to read from a range of sources, since there are many pitfalls to be aware of and different authors emphasize different material. A solid introduction is provided by Kevin Gurney in An Introduction to Neural Networks.
An excellent piece on the intuitions underlying Markov Chain Monte Carlo is available at http://twiecki.github.io/blog/2015/11/10/mcmc-sampling/.
For readers with a specific interest in the intuitions supporting Gibbs Sampling, Philip Resnik and Eric Hardisty's paper, Gibbs Sampling for the Uninitiated, provides a technical but clear description of how Gibbs sampling works. It is particularly notable for some really first-rate analogies! Find it at https://www.umiacs.umd.edu/~resnik/pubs/LAMP-TR-153.pdf.
There aren't many good explanations of Contrastive Divergence; one I like is provided by Oliver Woodford at http://www.robots.ox.ac.uk/~ojw/files/NotesOnCD.pdf. If you're a little daunted by the heavy...