Definition (Markov Process)
A stochastic process $\{X_n\}_{n \ge 0}$ is a Markov process if
$$P(X_{n+1} = j \mid X_n = i_n, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i_n).$$
In particular, it means the $(n+1)$th step only depends on the previous step.
A Markov process is time homogeneous if
$$P(X_{n+1} = j \mid X_n = i) = P(X_1 = j \mid X_0 = i) \quad \text{for all } n \ge 0.$$
So the probability of going from state $i$ to state $j$ does not depend on the time $n$.
Definition (Transition Probability)
The transition probability of jumping from state $i$ to state $j$ is
$$p_{ij} = P(X_{n+1} = j \mid X_n = i)$$
for all $n$. This allows us to put these relationships into a matrix
$$P = (p_{ij})_{i,j \in S}$$
for a state space $S$. Here, the states $i, j \in S$ are the row/column labels.
Definition (Joint Distribution for Markov Processes)
The joint distribution: for any states $i_0, i_1, \dots, i_n$ we have
$$P(X_0 = i_0, X_1 = i_1, \dots, X_n = i_n) = P(X_0 = i_0)\, p_{i_0 i_1} p_{i_1 i_2} \cdots p_{i_{n-1} i_n}$$
by the chain rule, Markov property, and time homogeneity, in that order. Visually, this is a path through a graph. The probability of this path is the product of the edges crossed.
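The path-probability formula above can be sketched in a few lines of Python. The $2 \times 2$ transition matrix and initial distribution below are illustrative assumptions, not taken from the notes.

```python
P = [[0.9, 0.1],
     [0.4, 0.6]]      # P[i][j] = p_ij, rows sum to 1
mu0 = [0.5, 0.5]      # mu0[i] = P(X_0 = i)

def path_probability(path, P, mu0):
    """P(X_0=i_0, ..., X_n=i_n) = mu0[i_0] * p_{i_0 i_1} * ... * p_{i_{n-1} i_n}."""
    prob = mu0[path[0]]
    for a, b in zip(path, path[1:]):
        prob *= P[a][b]   # multiply the edge crossed at each step
    return prob

# Path 0 -> 0 -> 1: 0.5 * 0.9 * 0.1 = 0.045
print(path_probability([0, 0, 1], P, mu0))
```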
Example 1 (Transition Matrix)
Let $X_0, X_1, X_2, \dots$ be iid. Then $\{X_n\}_{n \ge 0}$ is a time homogeneous Markov chain. Let $p_j = P(X_n = j)$. Then $p_{ij} = P(X_{n+1} = j \mid X_n = i) = p_j$ by independence. This means our transition matrix is
$$P = \begin{pmatrix} p_1 & p_2 & \cdots \\ p_1 & p_2 & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix},$$
with every row equal to the common distribution. The sum of each row must equal $1$. This corresponds to every edge out of a node summing to $1$.
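A quick sketch of this example: for an iid sequence every row of the transition matrix is the common distribution $(p_1, p_2, \dots)$. The three-point distribution below is an illustrative assumption.

```python
p = [0.2, 0.5, 0.3]               # common distribution: P(X_n = j)
P = [list(p) for _ in p]          # p_ij = p_j, independent of i

# Every row is the same distribution, and each row sums to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-12
print(P[0] == P[1] == P[2])       # True: identical rows
```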
Example 3 (1D Random Walk)
Consider a random walk on the number line $\mathbb{Z}$, starting from $X_0 = 0$. Then
$$X_{n+1} = X_n + \xi_{n+1},$$
where the $\xi_k$ are iid with $P(\xi_k = +1) = p$ and $P(\xi_k = -1) = 1 - p$. We jump right with probability $p$ and left with probability $1 - p$. We can model this with a Markov chain where jumping from $i$ to $i + 1$ has probability $p_{i,i+1} = p$ and jumping from $i$ to $i - 1$ has probability $p_{i,i-1} = 1 - p$, with $p_{ij} = 0$ otherwise.
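The walk can be simulated directly from the step description. The choice $p = 0.5$ and the seed are illustrative assumptions.

```python
import random

def random_walk(n_steps, p=0.5, start=0, seed=0):
    """Simulate n_steps of a 1D random walk: +1 w.p. p, -1 w.p. 1-p."""
    rng = random.Random(seed)
    x = start
    for _ in range(n_steps):
        x += 1 if rng.random() < p else -1
    return x

# After n steps from 0, the position lies in [-n, n] and has the
# same parity as n (each step changes position by exactly 1).
x = random_walk(10)
```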
Definition (Stochastic Matrix)
A square matrix $P = (p_{ij})$ is a stochastic matrix if $p_{ij} \ge 0$ for all $i, j$ and $\sum_j p_{ij} = 1$ for all $i$. That is, all rows sum to $1$. For a discrete state, time homogeneous Markov chain
with transition matrix $P$, the $n$-step transition probabilities are
$$p_{ij}^{(n)} = P(X_n = j \mid X_0 = i)$$
and the $n$-step transition matrix is
$$P^{(n)} = \left(p_{ij}^{(n)}\right)_{i,j \in S} = P^n,$$
e.g. $P^{(1)} = P$ and
$$p_{ij}^{(0)} = \delta_{ij},$$
where $\delta_{ij}$ is the Kronecker delta.
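A minimal sketch of $P^{(n)} = P^n$, using plain-Python matrix multiplication; the $2 \times 2$ matrix is an illustrative assumption.

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matpow(P, n):
    """Compute P^n; P^0 is the identity, i.e. p_ij^(0) = delta_ij."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = matmul(result, P)
    return result

P = [[0.9, 0.1],
     [0.4, 0.6]]
P2 = matpow(P, 2)   # p_ij^(2): probability of going from i to j in two steps
# e.g. p_00^(2) = 0.9*0.9 + 0.1*0.4 = 0.85
```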
Definition (Distribution Vector)
The distribution vector for $X_n$ is
$$\mu^{(n)} = \left(\mu_i^{(n)}\right)_{i \in S},$$
where $\mu_i^{(n)} = P(X_n = i)$ for each $i \in S$.
Proposition (Stochastic Properties)
Let $\{X_n\}_{n \ge 0}$ be a time homogeneous Markov chain with transition matrix $P$. Then
- $P^n$ is a stochastic matrix for every $n \ge 0$.
- $1$ is an eigenvalue of $P$ with eigenvector $\mathbf{1} = (1, 1, \dots, 1)^T$. In particular, $P\mathbf{1} = \mathbf{1}$.
- $P^{(m+n)} = P^{(m)} P^{(n)}$, i.e. $p_{ij}^{(m+n)} = \sum_{k \in S} p_{ik}^{(m)} p_{kj}^{(n)}$ (Chapman-Kolmogorov equality).
- $\mu^{(n)} = \mu^{(0)} P^n$ and $\mu^{(n+1)} = \mu^{(n)} P$ for all $n \ge 0$.
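Each claim in the proposition can be checked numerically. The $2 \times 2$ matrix and the initial distribution below are illustrative assumptions.

```python
P = [[0.9, 0.1],
     [0.4, 0.6]]
mu0 = [0.3, 0.7]   # initial distribution mu^(0)

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def vecmat(mu, P):
    """Row vector times matrix: (mu P)_j = sum_i mu_i p_ij."""
    return [sum(mu[i] * P[i][j] for i in range(len(mu)))
            for j in range(len(P[0]))]

P2 = matmul(P, P)
# P^2 is stochastic: rows still sum to 1
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P2)
# P 1 = 1: each row of P sums to 1, so 1 is an eigenvalue
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)
# Chapman-Kolmogorov with m = n = 1: p_01^(2) = sum_k p_0k p_k1
assert abs(P2[0][1] - sum(P[0][k] * P[k][1] for k in range(2))) < 1e-12
# mu^(1) = mu^(0) P is again a probability distribution
mu1 = vecmat(mu0, P)
assert abs(sum(mu1) - 1.0) < 1e-12
```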