Markov Process & Chain
Definition of Markov Process
Assume \(\{X(t): t\in T\}\) is a stochastic process. If for all \(n\), all \(t_i\in T\), \(i=1,2,\cdots,n\), with \(t_1<\cdots<t_n\), and all states \(x_1,\cdots,x_n\in S\), the conditional distribution of \(X(t_n)\) satisfies

\[P\big(X(t_n)\leq x_n \mid X(t_1)=x_1,\cdots,X(t_{n-1})=x_{n-1}\big) = P\big(X(t_n)\leq x_n \mid X(t_{n-1})=x_{n-1}\big),\]

then \(X(t)\) is called a Markov process.
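The defining property above says that, given the present state, the extra knowledge of earlier states does not change the distribution of the future. As a minimal sketch (the simple random walk here is an illustrative example, not from the text), we can check this empirically: conditioning on the additional past value \(X_1\) should leave the one-step distribution of \(X_3\) given \(X_2\) essentially unchanged.

```python
import random

# Hypothetical example: a simple symmetric random walk starting at 0.
# We compare P(X3 = 1 | X2 = 0) with P(X3 = 1 | X2 = 0, X1 = -1);
# the Markov property says the extra conditioning should not matter.
rng = random.Random(1)

paths = []
for _ in range(50000):
    x1 = rng.choice([-1, 1])            # X1 = X0 + step, with X0 = 0
    x2 = x1 + rng.choice([-1, 1])       # X2
    x3 = x2 + rng.choice([-1, 1])       # X3
    paths.append((x1, x2, x3))

cond_a = [x3 for (x1, x2, x3) in paths if x2 == 0]
cond_b = [x3 for (x1, x2, x3) in paths if x2 == 0 and x1 == -1]
p_a = sum(x == 1 for x in cond_a) / len(cond_a)  # P(X3=1 | X2=0)
p_b = sum(x == 1 for x in cond_b) / len(cond_b)  # P(X3=1 | X2=0, X1=-1)
# Both estimates should be close to the exact value 0.5.
```

Both conditional frequencies cluster around \(0.5\): once \(X_2=0\) is known, the value of \(X_1\) carries no further information about \(X_3\).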
When the state space \(S\) is discrete, the process is called a Markov chain.
When both the state space \(S\) and the time set \(T\) are discrete, the process is called a discrete-time Markov chain.
Transition Probability Distribution
We call the conditional distribution

\[F(y, t \mid x, s) = P\big(X(t)\leq y \mid X(s)=x\big), \quad s<t,\]

the transition probability distribution.
For a discrete Markov chain, \(\forall m,n \in T\) with \(m<n\) and \(\forall i,j \in S\), we define the transition probability

\[p_{ij}(m,n) = P\big(X(n)=j \mid X(m)=i\big),\]

i.e., the probability of moving from state \(i\) at time \(m\) to state \(j\) at time \(n\). Clearly \(p_{ij}(m,n)\geq 0\) and

\[\sum_{j\in S} p_{ij}(m,n) = 1.\]
Define the transition matrix from time \(m\) to time \(n\) (an \((n-m)\)-step matrix)

\[P(m,n) = \big(p_{ij}(m,n)\big)_{i,j\in S}.\]
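As a concrete sketch, here is a hypothetical three-state transition matrix (the states and values are illustrative, not from the text) together with checks of the two defining properties: every entry is nonnegative and every row sums to one.

```python
# Hypothetical 3-state transition matrix; entry P[i][j] plays the
# role of p_ij, the probability of moving from state i to state j.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

for row in P:
    assert all(p >= 0 for p in row)       # p_ij >= 0
    assert abs(sum(row) - 1.0) < 1e-12    # sum over j of p_ij equals 1
```

Each row is the full conditional distribution of the next state given the current one, which is why the rows (not the columns) must sum to one.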
In particular, we focus on the one-step transition matrix \(P(m,m+1)\). If \(P(m,m+1)\) does not depend on \(m\), the Markov chain is called time-homogeneous, and we write \(p_{ij}=p_{ij}(m,m+1)\).
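For a time-homogeneous chain the same one-step matrix applies at every step, so the \(n\)-step transition matrix is just the \(n\)-fold matrix product of the one-step matrix. A minimal sketch (the matrix below is an assumed example) compares the entry of the squared matrix against an empirical two-step frequency:

```python
import random

# Hypothetical one-step matrix of a time-homogeneous 3-state chain.
P = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

def mat_mul(A, B):
    """Plain-list matrix product, used to compose one-step matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def n_step(P, n):
    """n-step matrix P(m, m+n) = P^n for a time-homogeneous chain."""
    result = P
    for _ in range(n - 1):
        result = mat_mul(result, P)
    return result

def simulate(P, start, steps, rng):
    """Sample one trajectory: draw the next state from row P[state]."""
    state = start
    for _ in range(steps):
        state = rng.choices(range(len(P)), weights=P[state])[0]
    return state

rng = random.Random(0)
# Empirical two-step probability of reaching state 2 from state 0 ...
hits = sum(simulate(P, 0, 2, rng) == 2 for _ in range(20000))
estimate = hits / 20000
# ... versus the exact entry (0, 2) of P^2.
exact = n_step(P, 2)[0][2]
```

The empirical frequency and the matrix-power entry agree to within sampling error, which is exactly the statement that multi-step transition probabilities of a homogeneous chain come from powers of the one-step matrix.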