- When do they exist?
- When are they unique?
See Definition (Distribution Vector).
Theorem (Distribution of a Markov Chain)
Let $X_0, X_1, \dots$ be a Markov chain with transition matrix $P$ and initial distribution $\pi_0$. The distribution of $X_n$ at time $n$ is given by
$$\pi_n = \pi_0 P^n,$$
where $\pi_n$ is the distribution vector at time $n$, and $P^0 = I$ is the identity matrix.
Proof: Use induction on $n$: the base case is $\pi_0 P^0 = \pi_0 I = \pi_0$, and the inductive step is $\pi_{n+1} = \pi_n P = (\pi_0 P^n)P = \pi_0 P^{n+1}$.
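The theorem can be sanity-checked numerically; a minimal numpy sketch, where the 2-state matrix $P$ and the starting vector are made-up examples rather than ones from these notes:

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi0 = np.array([1.0, 0.0])  # start in state 0 with probability 1

# Distribution at time n via the theorem: pi_n = pi_0 P^n.
def distribution_at(pi0, P, n):
    return pi0 @ np.linalg.matrix_power(P, n)

# Cross-check against stepping one transition at a time: pi_{k+1} = pi_k P.
pi = pi0.copy()
for _ in range(5):
    pi = pi @ P
assert np.allclose(pi, distribution_at(pi0, P, 5))
```

Either route gives the same vector, which is exactly what the induction in the proof says.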
Definition (Stationary or Invariant Distribution)
Let $P$ be the transition matrix for the Markov chain $X_0, X_1, \dots$ with state space $S$. A probability vector $\pi$ is a stationary or invariant distribution for $P$ (or for the chain) if
$$\pi P = \pi.$$
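Checking the definition is a single matrix-vector product; a small numpy sketch, where both $P$ and the candidate $\pi$ are made-up examples (for this $P$, solving $\pi P = \pi$ by hand gives $\pi = (5/6, 1/6)$):

```python
import numpy as np

# Hypothetical transition matrix and its stationary vector.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = np.array([5/6, 1/6])

assert np.isclose(pi.sum(), 1.0)  # pi is a probability vector
assert np.allclose(pi @ P, pi)    # pi P = pi, so pi is invariant
```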
Remark (Conclusions from Examples 1,2)
Examples 1 and 2 show that the asymptotic distribution of $X_n$ is the invariant distribution $\pi$.
Definition (Reducible)
If two states are each reachable from the other, we say that they communicate with one another. Communication is an equivalence relation (proof omitted), so it partitions the states into communication classes.
A Markov Chain is reducible if there are two or more communication classes.
Definition (Periodic)
A Markov chain is periodic if there is a state to which the chain can return only at regular intervals of length greater than one step. Formally, the period of state $i$ is $d(i) = \gcd\{n \ge 1 : (P^n)_{ii} > 0\}$; the chain is periodic if some state has $d(i) > 1$, and aperiodic otherwise.
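The gcd in the definition can be computed directly (up to a cutoff on $n$); a sketch on a made-up two-state chain that deterministically alternates between its states, so every return takes an even number of steps:

```python
import numpy as np
from functools import reduce
from math import gcd

# Period of state i: gcd of all n (up to max_n) with (P^n)_ii > 0.
def period(P, i, max_n=50):
    returns = []
    Pn = np.eye(len(P))
    for n in range(1, max_n + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:
            returns.append(n)
    return reduce(gcd, returns) if returns else 0

# Hypothetical alternating chain: state 0 -> 1 -> 0 -> 1 -> ...
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
```

Here `period(P, 0)` is 2, so this chain is periodic; any chain with a self-loop ($P_{ii} > 0$) has period 1 at that state.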
Theorem (Uniqueness of Invariant Distributions)
An invariant distribution for a finite-state Markov chain is unique unless the MC is reducible or periodic.
Definition (Regular Transition Matrices)
A finite-state, time-homogeneous MC with transition matrix $P$ is regular if there exists an integer $k$ such that $P^k > 0$ (i.e. all entries of $P^k$ are strictly positive).
Example 1 (Simple Regularity)
Consider the transition matrix $P$ above (omitted here).
This MC is regular: immediately, we see that for a suitable power $k$, all entries of $P^k$ are positive.
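Regularity can be tested mechanically by taking powers of $P$ until every entry is positive; a sketch with a made-up matrix (not the one from this example), which has a zero entry itself but becomes strictly positive at the second power:

```python
import numpy as np

# Return the smallest k with all entries of P^k > 0, or None if none is
# found up to max_power (in which case the chain is likely not regular).
def regularity_index(P, max_power=50):
    Pk = np.eye(len(P))
    for k in range(1, max_power + 1):
        Pk = Pk @ P
        if np.all(Pk > 0):
            return k
    return None

# Hypothetical example: P itself has a zero, but P^2 is strictly positive.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])
k = regularity_index(P)
```

Here `k` comes out to 2, so this chain is regular; for the identity matrix, which never mixes states, the function returns `None`.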
Theorem (Regular MC Gives Unique Stationary Distributions)
Let $X_0, X_1, \dots$ be a regular finite-state Markov chain with transition matrix $P$. Then:
- The chain has a unique stationary distribution $\pi$, with $\pi(j) > 0$ for every state $j$.
- The transition probabilities converge: $\lim_{n \to \infty} P^n = \Pi$, where each row of $\Pi$ is the row vector $\pi$.
- For any initial distribution $\pi_0$, $\lim_{n \to \infty} \pi_0 P^n = \pi$. Or, $\lim_{n \to \infty} P(X_n = j) = \pi(j)$ for all states $j$.
Proof:
Let $\pi$ be the limit of the state distribution over time, such that $\pi = \lim_{n \to \infty} \pi_n$. Since $\pi_{n+1} = \pi_n P$ via Theorem (Distribution of a Markov Chain), we apply the limit and get
$$\pi = \lim_{n \to \infty} \pi_{n+1} = \Big(\lim_{n \to \infty} \pi_n\Big) P = \pi P,$$
and so $\pi$ is a fixed point for $P$. This shows $\pi$ exists and is stationary. Since every row of the matrix $P^n$ converges to the same vector $\pi$, the limiting matrix $\Pi$ has identical rows and $\pi$ is unique; thus $\lim_{n \to \infty} P^n = \Pi$. Then $\pi_0 \Pi = \pi$, because the row vector $\pi_0$ has its entries sum to $1$.
So,
$$\pi_0 \Pi = \big(\pi_0(1), \dots, \pi_0(N)\big) \begin{pmatrix} \pi(1) & \cdots & \pi(N) \\ \vdots & & \vdots \\ \pi(1) & \cdots & \pi(N) \end{pmatrix}.$$
Consider the first component of the resulting vector:
$$\pi_0(1)\pi(1) + \pi_0(2)\pi(1) + \cdots + \pi_0(N)\pi(1).$$
We can factor out the stationary probability for each component and get
$$\pi(1) \sum_{i=1}^{N} \pi_0(i) = \pi(1) \cdot 1 = \pi(1),$$
which is precisely equal to the first entry of $\pi$. The same holds for every component, so $\pi_0 \Pi = \pi$: the limit is independent of the initial distribution.
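The theorem's conclusions are easy to observe numerically; a sketch using the same made-up 2-state matrix as earlier (its stationary distribution, by hand, is $\pi = (5/6, 1/6)$):

```python
import numpy as np

# Hypothetical regular chain: P > 0 already at the first power.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
Pn = np.linalg.matrix_power(P, 100)

# Every row of P^100 is (numerically) the same vector pi ...
assert np.allclose(Pn[0], Pn[1])

# ... so two very different initial distributions reach the same limit.
a = np.array([1.0, 0.0]) @ Pn
b = np.array([0.0, 1.0]) @ Pn
assert np.allclose(a, b)
```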
Solving For Stationary Distributions
How do we find ? We can either:
- Solve $\pi P = \pi$ together with $\sum_j \pi(j) = 1$, or
- Diagonalize $P$.
So,
$$P^n = (VDV^{-1})^n = VD^nV^{-1}.$$
The last line is from the fact that powers of $P$ reduce to powers of its eigenvalues. Also recall that diagonalization of a matrix gives $P = VDV^{-1}$, with $V$ its matrix of eigenvectors and $D$ a diagonal matrix of its eigenvalues.
Note that $D^n$ collapses to
$$D^n \to \begin{pmatrix} 1 & & \\ & 0 & \\ & & \ddots \end{pmatrix} \quad \text{as } n \to \infty,$$
because one eigenvalue of a stochastic matrix will be $\lambda_1 = 1$. The rest (for a regular chain) satisfy $|\lambda_i| < 1$, and so $\lambda_i^n \to 0$ as $n \to \infty$.
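The eigenvalue route can be sketched in numpy; the 3-state matrix below is a made-up example, not one from the notes. Since $\pi P = \pi$ says $\pi$ is a left eigenvector of $P$ for eigenvalue $1$, we can take an ordinary (right) eigenvector of $P^\mathsf{T}$:

```python
import numpy as np

# Hypothetical 3-state transition matrix.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.4, 0.5]])

# Left eigenvectors of P are right eigenvectors of P.T.
vals, vecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(vals - 1.0))   # locate the eigenvalue lambda = 1
pi = np.real(vecs[:, i])
pi = pi / pi.sum()                  # normalize so the entries sum to 1

assert np.allclose(pi @ P, pi)      # pi is stationary
```

`np.linalg.eig` returns eigenvectors with arbitrary scale and sign, which is why the normalization step is needed before `pi` is a probability vector.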
Remark (Stationaries Have Equal Out/Inflow)
For a distribution $\pi$ to be stationary, the total probability outflow from a state must be exactly equal to the total inflow into that same state: $\sum_{j \ne i} \pi(i) P_{ij} = \sum_{j \ne i} \pi(j) P_{ji}$ for each state $i$.
Definition (Detailed Balance)
A stochastic matrix $P$ and a probability vector $\pi$ satisfy detailed balance if
$$\pi(i) P_{ij} = \pi(j) P_{ji}$$
for all states $i, j$.
Proposition (Detailed Balance is Stationary)
If a stochastic matrix $P$ and a probability vector $\pi$ satisfy detailed balance, then $\pi P = \pi$, i.e. $\pi$ is stationary with respect to $P$.
Proof:
For component $j$:
$$(\pi P)_j = \sum_i \pi(i) P_{ij} = \sum_i \pi(j) P_{ji} = \pi(j) \sum_i P_{ji} = \pi(j),$$
using detailed balance in the second equality and the fact that the rows of $P$ sum to $1$ in the last.
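Detailed balance is an entrywise condition, equivalent to the matrix $\operatorname{diag}(\pi)\,P$ being symmetric; a sketch on a made-up 3-state birth-death chain (such chains satisfy detailed balance because probability only flows between neighbors):

```python
import numpy as np

# Hypothetical birth-death chain on states {0, 1, 2}.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])

# Detailed balance: pi(i) P_ij = pi(j) P_ji, i.e. diag(pi) P is symmetric.
F = np.diag(pi) @ P
assert np.allclose(F, F.T)

# As the proposition states, detailed balance implies stationarity.
assert np.allclose(pi @ P, pi)
```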
Fundamental Theorem of Markov Chains
If a Markov chain is irreducible and aperiodic and has a stationary distribution $\pi$, then
- $\pi$ is the unique stationary distribution, and
- the distribution of the state at time $n$ converges to $\pi$ as $n \to \infty$ regardless of the initial distribution $\pi_0$; that is, $\pi_0 P^n \to \pi$.
Remark on Regularity: This theorem is the functional equivalent of the Regular MC Theorem. In finite state spaces, the conditions (irreducible + aperiodic) are exactly what define a regular transition matrix ($P^k > 0$ for some $k$).
Proof: See Theorem (Regular MC Gives Unique Stationary Distributions).