Simulating a discrete-time Markov chain
Discrete-time Markov chains are stochastic processes that undergo transitions from one state to another in a state space. Transitions occur at every time step. Markov chains are memoryless: the probability of transitioning from the current state to the next depends only on the current state, not on the previous ones. These models are widely used in scientific and engineering applications.
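In mathematical terms, this memoryless property (the Markov property) can be written as follows for a chain $(X_n)_{n \geq 0}$:

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i).$$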
Continuous-time Markov processes also exist and we will cover particular instances later in this chapter.
Markov chains are relatively easy to study mathematically and to simulate numerically. In this recipe, we will simulate a simple Markov chain modeling the evolution of a population.
How to do it...
- Let's import NumPy and Matplotlib:
>>> import numpy as np
    import matplotlib.pyplot as plt
    %matplotlib inline
- We consider a population that cannot comprise more than a fixed maximum number of individuals, N, and define the birth and death rates of the population, as in the sketch below.
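The following is a minimal sketch of such a simulation, reusing the imports from step 1. It assumes that, at each time step, one birth occurs with probability a*x and one death with probability b*x, where x is the current population size; the values of N, a, b, the number of steps, and the initial population are illustrative choices, not fixed by the recipe.

>>> N = 100              # maximum population size (illustrative value)
    a = 0.5 / N          # per-step birth probability per individual (assumed)
    b = 0.5 / N          # per-step death probability per individual (assumed)
    n_steps = 1000       # number of time steps to simulate (assumed)
    x = np.zeros(n_steps)
    x[0] = 25            # initial population size (assumed)
    for t in range(n_steps - 1):
        if 0 < x[t] < N:
            # One birth with probability a*x[t] and one death with
            # probability b*x[t] at this time step.
            birth = np.random.rand() <= a * x[t]
            death = np.random.rand() <= b * x[t]
            x[t + 1] = x[t] + 1 * birth - 1 * death
        else:
            # In this sketch, 0 and N are absorbing states: the
            # population stays fixed once it reaches either boundary.
            x[t + 1] = x[t]
    plt.plot(x)
    plt.xlabel('Time step')
    plt.ylabel('Population size')

Because the boundary states are treated as absorbing here, any trajectory that reaches 0 or N stays there for the rest of the simulation.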