Markov Chain Monte Carlo
MCMC is a family of methods for drawing random samples from a target probability distribution, typically one that is high-dimensional and difficult to sample from directly. Instead of sampling the distribution in one shot, MCMC constructs a Markov chain whose long-run behavior matches the target, then draws correlated samples by simulating that chain. The resulting samples approximate the target distribution, and sample averages over them approximate expectations under it. A Markov chain itself is a graph of states over which the sampling algorithm takes a random walk.
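To make the random-walk picture concrete, here is a minimal sketch of a Markov chain as a graph of states. The three-state chain and its transition probabilities are invented for illustration; the point is that the long-run fraction of time the walk spends in each state is a sample average that approximates the chain's stationary distribution.

```python
import random

random.seed(0)

# A tiny, hypothetical Markov chain on three states.
# TRANSITIONS[s] lists (next_state, probability) pairs; each row sums to 1.
TRANSITIONS = {
    "A": [("A", 0.5), ("B", 0.5)],
    "B": [("A", 0.25), ("B", 0.5), ("C", 0.25)],
    "C": [("B", 0.5), ("C", 0.5)],
}

def step(state):
    """Take one random-walk step from `state`."""
    r = random.random()
    cum = 0.0
    for nxt, p in TRANSITIONS[state]:
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def visit_fractions(n_steps, start="A"):
    """Fraction of time the walk spends in each state.

    These sample averages converge to the chain's stationary
    distribution, which for this chain is (0.25, 0.5, 0.25).
    """
    counts = {"A": 0, "B": 0, "C": 0}
    state = start
    for _ in range(n_steps):
        state = step(state)
        counts[state] += 1
    return {s: c / n_steps for s, c in counts.items()}

print(visit_fractions(100_000))
```

Running the walk for more steps tightens the approximation, which is the same mechanism MCMC relies on when averaging over its samples.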
Perhaps the best-known MCMC algorithm is Gibbs sampling. Different MCMC algorithms are simply different recipes for constructing the Markov chain. The most general is Metropolis-Hastings, which offers flexibility in several respects, notably in the choice of proposal distribution. These two algorithms are discussed in the next subsections.
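As a sketch of the Metropolis-Hastings idea, the following is a minimal random-walk sampler. The standard normal target and the uniform proposal step are arbitrary choices made here for illustration; note that the target density only needs to be known up to a normalizing constant, since that constant cancels in the acceptance ratio.

```python
import math
import random

random.seed(0)

def target_density(x):
    # Unnormalized standard normal density; the missing normalizing
    # constant cancels when we form the acceptance ratio below.
    return math.exp(-0.5 * x * x)

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    """Random-walk Metropolis sampler for the 1-D target above."""
    samples = []
    x = x0
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)  # symmetric proposal
        # Accept with probability min(1, p(proposal) / p(current));
        # on rejection the chain stays at the current state.
        if random.random() < target_density(proposal) / target_density(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(50_000)
mean = sum(samples) / len(samples)
print(round(mean, 2))  # a sample average approximating the target mean, 0
```

Because the proposal is symmetric, the proposal densities cancel and the acceptance ratio reduces to a ratio of target densities; the full Metropolis-Hastings rule includes a proposal-density correction for asymmetric proposals.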
Gibbs sampling algorithm
In Gibbs sampling, each component of the next sample in the Markov chain is drawn from its conditional distribution given the current values of all the other components...
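This component-wise conditional sampling can be sketched on a target whose full conditionals are known in closed form. The bivariate normal target with correlation 0.8 is an assumption made here for illustration; for it, each coordinate's conditional given the other is itself normal.

```python
import random

random.seed(0)

RHO = 0.8  # correlation of the assumed standard bivariate normal target

def gibbs(n_samples, x0=0.0, y0=0.0):
    """Gibbs sampler for a standard bivariate normal with correlation RHO.

    Each coordinate is drawn from its full conditional given the
    current value of the other:
        x | y ~ N(RHO * y, 1 - RHO**2)
        y | x ~ N(RHO * x, 1 - RHO**2)
    """
    cond_sd = (1 - RHO ** 2) ** 0.5
    x, y = x0, y0
    samples = []
    for _ in range(n_samples):
        x = random.gauss(RHO * y, cond_sd)  # sample x from p(x | y)
        y = random.gauss(RHO * x, cond_sd)  # sample y from p(y | x)
        samples.append((x, y))
    return samples

draws = gibbs(50_000)
est_corr = sum(x * y for x, y in draws) / len(draws)
print(round(est_corr, 2))  # should land near RHO = 0.8
```

Unlike Metropolis-Hastings, every Gibbs update is accepted, but the method requires that each full conditional be tractable to sample from.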