Introducing Markov chains
Markov chains are discrete dynamical systems that exhibit the characteristics of Markovian processes. These are finite-state systems (finite Markov chains) in which the transition from one state to another occurs on a probabilistic rather than deterministic basis. The information available about the chain at a generic instant t is given by the probabilities that it is in each of the states, and the temporal evolution of the chain is specified by describing how these probabilities are updated in passing from instant t to instant t + 1.
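As a concrete illustration, the following minimal sketch updates the state probabilities of a small chain over one time step, using the standard rule that the distribution at instant t + 1 is the distribution at instant t multiplied by the transition matrix. The two states and the transition probabilities are hypothetical, chosen only to make the update visible:

```python
# A minimal sketch of how a finite Markov chain's state probabilities
# evolve from instant t to instant t + 1. The two-state chain
# (Sunny/Rainy) and its transition probabilities are hypothetical.
import numpy as np

# Hypothetical transition matrix: entry [i, j] is the probability of
# moving from state i to state j in one step; each row sums to 1.
P = np.array([[0.8, 0.2],   # Sunny -> Sunny, Sunny -> Rainy
              [0.4, 0.6]])  # Rainy -> Sunny, Rainy -> Rainy

# Probabilities of being in each state at instant t.
p_t = np.array([1.0, 0.0])  # start in Sunny with certainty

# The distribution at instant t + 1: p_{t+1} = p_t P.
p_next = p_t @ P
print(p_next)  # [0.8 0.2]
```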
Important Note
A Markov chain is a stochastic model in which the system evolves over time in such a way that the past affects the future only through the present: Markov chains have no memory of the past.
Consider a random process characterized by a sequence of random variables X = X0, ..., Xn taking values in a set of states j0, j1, ..., jn. This process is Markovian if its evolution depends only on the present state and not on the states that preceded it; formally, P(Xn+1 = j | X0 = j0, ..., Xn = jn) = P(Xn+1 = j | Xn = jn).
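The Markov property is easy to see in simulation: each new state is sampled using only the current state, never the earlier history. The sketch below reuses the hypothetical two-state chain from the previous example; the states, seed, and probabilities are illustrative assumptions, not values from the text:

```python
# A short sketch of the Markov property in simulation: the next state
# X_{n+1} is drawn from a distribution that depends only on X_n.
import numpy as np

rng = np.random.default_rng(seed=0)
states = ["Sunny", "Rainy"]
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

def step(current: int) -> int:
    # The conditional distribution of the next state is the row of P
    # indexed by the current state alone: no memory of the past.
    return rng.choice(len(states), p=P[current])

chain = [0]  # X_0 = Sunny
for _ in range(5):
    chain.append(step(chain[-1]))
print([states[i] for i in chain])
```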