Determining the steady state vector
A Markov chain is a system that has at least two states. For detailed information on Markov chains, please refer to http://en.wikipedia.org/wiki/Markov_chain. The state at time t depends on the state at time t-1, and only on the state at t-1. The system switches at random between these states. Let's define a Markov chain for a stock with three states: flat (F), up (U), and down (D). We can determine the states from the end-of-day close prices.
Far into the future, or in theory after infinite time, the state of our Markov chain system will not change anymore. This is also called a steady state (http://en.wikipedia.org/wiki/Steady_state). The stochastic matrix (http://en.wikipedia.org/wiki/Stochastic_matrix) A, which contains the state transition probabilities, yields the same state x when applied to the steady state. The mathematical notation for this is as follows:

Ax = x
Another way to look at this is as the eigenvector of A corresponding to the eigenvalue 1.
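The eigenvector view can be sketched as follows. The matrix A below is a hypothetical stochastic matrix with made-up probabilities, arranged so that each column sums to 1 and Ax = x holds for the steady state x:

```python
import numpy as np

# Hypothetical stochastic matrix A for states F, U, D. Each COLUMN sums
# to 1, so the steady state x satisfies A x = x (illustrative values).
A = np.array([[0.5, 0.4, 0.3],
              [0.3, 0.4, 0.3],
              [0.2, 0.2, 0.4]])

values, vectors = np.linalg.eig(A)

# The steady state is the eigenvector for the eigenvalue closest to 1.
idx = np.argmin(np.abs(values - 1))
x = np.real(vectors[:, idx])
x /= x.sum()             # scale so the state probabilities sum to 1
print(x)                 # steady state probabilities for F, U, D
print(np.allclose(A @ x, x))  # applying A leaves the steady state unchanged
```

Note that eigenvectors are only defined up to a scale factor, so the result is normalized to sum to 1 to make it a valid probability vector.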