Markov Chain Monte Carlo (MCMC)
As we have seen in The Markov property section of Chapter 7, Sequential Data Models, the state or prediction in a sequence is a function of the previous state(s). In a first-order Markov process, the probability of the state at time t depends only on the state at time t-1.
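The first-order Markov property can be illustrated with a short sketch, assuming a hypothetical two-state transition matrix (the matrix values and function names are illustrative, not taken from the text):

```python
import random

# Hypothetical two-state transition matrix:
# row i holds P(next state | current state = i)
T = [[0.7, 0.3],
     [0.4, 0.6]]

def next_state(current, rng):
    """Sample the state at time t using only the state at time t-1."""
    return 0 if rng.random() < T[current][0] else 1

def walk(start, steps, seed=42):
    """Generate a sequence of states: each step depends solely on the last."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(steps):
        states.append(next_state(states[-1], rng))
    return states

print(walk(0, 10))
```

Each call to next_state consults only the current state's row of T, which is exactly the first-order Markov assumption.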
The concept of a Markov chain can be combined with traditional Monte Carlo sampling to model distributions with a large number of variables (high dimensionality) or parametric distributions.
Overview
The idea behind Markov Chain Monte Carlo inference or sampling is to perform a random walk along the chain from a given state, successively selecting (at random) the next state from the state-transition probability matrix (The Hidden Markov Model/Notation in Chapter 7, Sequential Data Models) [8:6].
This iterative process explores the target distribution: candidate states are drawn from a proposal distribution and accepted or rejected so that the chain converges to the target distribution. At each iteration, MCMC...
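The random walk with an accept/reject step can be sketched with the Metropolis-Hastings algorithm, one common MCMC scheme (a minimal sketch; the function names, the Gaussian proposal, and the standard-normal target are illustrative assumptions, not taken from the text):

```python
import math
import random

def metropolis_hastings(log_target, start, n_samples, step=1.0, seed=42):
    """Random-walk Metropolis-Hastings: propose a Gaussian step from the
    current state and accept it with probability
    min(1, target(candidate) / target(current))."""
    rng = random.Random(seed)
    x = start
    samples = []
    for _ in range(n_samples):
        candidate = x + rng.gauss(0.0, step)        # draw from the proposal
        log_alpha = log_target(candidate) - log_target(x)
        if rng.random() < math.exp(min(0.0, log_alpha)):  # accept/reject
            x = candidate                           # move to the new state
        samples.append(x)                           # record the chain's state
    return samples

# Illustrative target: a standard normal, via its log density up to a constant
samples = metropolis_hastings(lambda x: -0.5 * x * x, start=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Because only the ratio of target densities is needed, the normalizing constant of the target distribution never has to be computed, which is what makes MCMC practical in high dimensions.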