Markov Process & Chain

Definition of Markov Process

Assume \(\{X(t): t\in T\}\) is a stochastic process. If for every \(n\), every choice of \(t_i\in T\), \(i=0,1,\cdots,n\), with \(t_0<t_1<\cdots<t_n\), and every choice of states \(x_0,\cdots,x_{n-1}\in S\), the conditional distribution of \(X(t_n)\) satisfies

\[ \begin{align*} P&(X(t_n)<x_n\mid X(t_0)=x_0,\cdots X(t_{n-1})=x_{n-1})\\&=P(X(t_n)<x_n\mid X(t_{n-1})=x_{n-1}),\quad x_n\in \mathbb{R}, \end{align*} \]

then \(X(t)\) is called a Markov process.

When the state space \(S\) is discrete, the process above is called a Markov chain.

When both the state space \(S\) and the time set \(T\) are discrete, the process is called a discrete-time Markov chain.
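The defining property means the next state is sampled using only the current state, never the earlier history. As an illustration, here is a minimal simulation sketch; the two-state chain and its transition probabilities are made up for demonstration:

```python
import random

# Hypothetical one-step transition probabilities for a two-state
# chain on S = {0, 1}; row i gives the distribution of the next
# state given the current state i.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Sample the next state given only the current one
    (the Markov property: the past path is never consulted)."""
    u = rng.random()
    cumulative = 0.0
    for j, p in enumerate(P[state]):
        cumulative += p
        if u < cumulative:
            return j
    return len(P) - 1  # guard against floating-point round-off

def simulate(x0, n_steps, seed=0):
    """Generate a sample path X(0), X(1), ..., X(n_steps)."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

path = simulate(0, 10)
```

Note that `step` receives only `path[-1]`; no earlier entry of the path can influence the next draw, which is exactly the conditional-independence statement in the definition.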

Transition Probability Distribution

We call the following conditional distribution

\[ F(t_{n-1}, x_{n-1}; t_n, x_n)=P(X(t_n)<x_n\mid X(t_{n-1})=x_{n-1}) \]

the transition probability distribution.

For a discrete Markov chain, for all \(m,n \in T\) with \(m\leq n\) and \(i,j \in S\), we define the transition probability

\[ p_{ij}(m,n)=P(X(n)=j\mid X(m)=i) \]

which is the probability of moving from state \(i\) at time \(m\) to state \(j\) at time \(n\). Clearly \(p_{ij}(m,n)\geq 0\) and

\[ \sum_{j=1}^\infty p_{ij}(m,n)=1. \]
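Together, these two properties say that each row of the transition matrix defined below is a probability distribution over the next state. A quick check, using a made-up 3-state matrix:

```python
# Hypothetical 3-state transition matrix; each row should be a
# probability distribution over the next state.
P = [
    [0.2, 0.5, 0.3],
    [0.0, 0.7, 0.3],
    [0.1, 0.1, 0.8],
]

def is_stochastic(matrix, tol=1e-12):
    """Check p_ij >= 0 and sum_j p_ij = 1 for every row i."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in matrix
    )

print(is_stochastic(P))  # True
```

The tolerance `tol` absorbs floating-point round-off in the row sums; exact equality with `1.0` would be too strict for binary floats like `0.1` and `0.3`.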

Define the \(n\)-step transition matrix

\[ P(m,m+n)= [p_{ij}(m, m+n)]_{|S|\times |S|} \]

In particular, we focus on the one-step transition matrix. If \(P(m,m+1)\) does not depend on \(m\), the Markov chain is called time-homogeneous, and we write \(p_{ij}=p_{ij}(m,m+1)\).
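For a time-homogeneous chain, the \(n\)-step matrix is just the \(n\)-th power of the one-step matrix, \(P(m,m+n)=P^n\), since each step multiplies by the same \(P\). A sketch of this, reusing a hypothetical two-state one-step matrix:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """n-step transition matrix of a time-homogeneous chain: P^n."""
    size = len(P)
    # Start from the identity matrix (the 0-step "stay put" matrix).
    result = [[float(i == j) for j in range(size)] for i in range(size)]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# Hypothetical one-step matrix for a two-state chain.
P = [[0.9, 0.1],
     [0.5, 0.5]]
P2 = n_step(P, 2)  # two-step transition probabilities
```

Each row of `P2` still sums to 1, so the \(n\)-step matrix is again a valid transition matrix, as the definition requires.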