
Basic Processes

Introduction

Definition of Stochastic Process

Assume \((\Omega,\mathscr{F}, \mathbb{P})\) is a probability space and \(T\) is an index set of real parameters. If for every \(t\in T\) there is a random variable \(X_t(\omega)\) defined on \((\Omega,\mathscr{F}, \mathbb{P})\), then the collection of random variables \(\{X_t: t\in T\}\) is called a Stochastic Process.

For fixed \(t\), \(X_t(\omega)\) is an ordinary random variable; for fixed \(\omega\), \(X_t(\omega)\) is a function on \(T\), called a sample path. Unlike a random vector, a stochastic process may contain infinitely many random variables; they are nevertheless all defined on the same probability space.

According to whether the index set \(T\) and the range \(S\) of \(X_t\) are denumerable or intervals, we have the following categories.

|  | \(S\) denumerable | \(S\) an interval |
| --- | --- | --- |
| \(T\) denumerable | discrete time and state | discrete time, continuous state |
| \(T\) an interval | continuous time, discrete state | continuous time and state |

Mathematical Characteristics

Usually \(T\) is not countable, so we cannot use the joint distribution of all the \(X_t\) at once to describe the statistical law.

Definition of finite-dimensional distribution

Assume \(\{X_t: t\in T\}\) is a stochastic process. \(\forall n\geq 1\), \(\forall t_1,t_2,\cdots,t_n\in T\), the random vector \((X_{t_1},\cdots,X_{t_n})\) has joint distribution function

\[ F_{t_1\cdots t_n}(x_1,\cdots,x_n)=\mathbb{P}\{X_{t_1}<x_1,\cdots,X_{t_n}<x_n\} \]

Collecting all these functions, we form

\[ \mathscr{H}=\{F_{t_1\cdots t_n}(x_1,\cdots,x_n): t_1,t_2,\cdots,t_n\in T,\forall n\geq 1\} \]

which is called the collection of Finite-dimensional Distributions of \(\{X_t: t\in T\}\).

Properties of Finite-dimensional Distribution

(i) Transverse Consistency. For any permutation \((j_1,\cdots,j_n)\) of \((1,2,\cdots,n)\),

\[ F_{t_1\cdots t_n}(x_1,\cdots,x_n)=F_{t_{j_1}\cdots t_{j_n}}(x_{j_1},\cdots,x_{j_n}). \]

(ii) Longitudinal Consistency. \(\forall m<n\), \(m,n \in \mathbb{N}^+\),

\[ F_{t_1\cdots t_m\cdots t_n}(x_1,\cdots,x_m,+\infty,\cdots,+\infty)=F_{t_1\cdots t_m}(x_1,\cdots,x_m). \]

Kolmogorov Theorem

Assume we have a finite-dimensional distribution collection \(\mathscr{H}\). If it satisfies transverse and longitudinal consistency, then there exists a probability space \((\Omega,\mathscr{F}, \mathbb{P})\) and a stochastic process \(\{X_t\}\) defined on it, such that \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\),

\[ \mathbb{P}\{X_{t_1}<x_1,\cdots,X_{t_n}<x_n\}=F_{t_1\cdots t_n}(x_1,\cdots,x_n) \]

Mathematical Characteristics for One-dimensional Distributions

Assume \(X_t\) has distribution function \(F_t(x)\). \(\forall t\in T\), define the mean function

\[ \mu_X(t)=\mathbb{E}(X_t), \]

and variance function

\[ \sigma^2_X(t)=D(X_t). \]

This is similar to the definitions for a single random variable.
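The mean and variance functions can be estimated empirically by averaging over many sample paths at each fixed \(t\). Below is a minimal sketch, assuming a toy process \(X_t = t + \varepsilon_t\) with i.i.d. standard normal noise (this process and all parameters are illustrative, not from the text):

```python
import random
import statistics

random.seed(0)

# Toy process X_t = t + N(0, 1) noise; each inner list is one sample path
# (one realization omega), observed at times t = 0, ..., T-1.
N, T = 5000, 5
paths = [[t + random.gauss(0, 1) for t in range(T)] for _ in range(N)]

# Empirical mean function mu_X(t): average over realizations at fixed t.
mu = [statistics.fmean(p[t] for p in paths) for t in range(T)]

# Empirical variance function sigma^2_X(t), again at each fixed t.
var = [statistics.variance([p[t] for p in paths]) for t in range(T)]
```

For this toy process \(\mu_X(t)=t\) and \(\sigma^2_X(t)=1\), so the estimates `mu[t]` and `var[t]` should be close to \(t\) and \(1\) respectively.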

Mathematical Characteristics for Two-dimensional Distributions

Assume \((X_{t}, X_s)\) has joint distribution function \(F_{t,s}(x_t,x_s)\), \(\forall t,s\in T\). Define the autocorrelation function

\[ R_X(t,s)=\mathbb{E}(X_tX_s), \]

and covariance function

\[ C_X(t,s)=\text{Cov}(X_t,X_s). \]

This is similar to the definitions for random vectors.

The following theorem gives a necessary and sufficient condition for these mathematical characteristics to exist.

Theorem for existence of two-dimensional mathematical characteristics

A stochastic process \(\{X_t\}\) has two-dimensional mathematical characteristics if and only if

\[ \mathbb{E}X^2_t<\infty,\quad \forall t\in T. \]

If \(\{X_t\}\) satisfies the above condition, we call it Second-order Moment Process.

  • "\(\Rightarrow\)". Immediate: let \(t=s\) in the autocorrelation function, giving
\[ R_X(t,t)=\mathbb{E}(X_t^2)<\infty. \]
  • "\(\Leftarrow\)". Use the Cauchy–Schwarz inequality. If \(\mathbb{E}X_t^2<\infty\), then
\[ |\mu_X(t)| \leq \mathbb{E}|X_t\cdot 1| \leq \sqrt{\mathbb{E}X_t^2 \cdot \mathbb{E}1^2}<\infty, \]

and

\[ |R_X(t,s)|\leq \mathbb{E}|X_tX_s|\leq \sqrt{\mathbb{E}X_t^2 \cdot\mathbb{E}X_s^2}<\infty. \]

Then use the two bounds above to prove the existence of the remaining quantities, e.g. \(|C_X(t,s)|=|R_X(t,s)-\mu_X(t)\mu_X(s)|<\infty\) and \(\sigma^2_X(t)=\mathbb{E}(X_t^2)-(\mathbb{E}X_t)^2<\infty\).

Independent Process

Definition for independent process

Assume \(\{X_t\}\) is a stochastic process. If \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\) and \(x_1,\cdots,x_n\in \mathbb{R}\),

\[ \mathbb{P}(X_{t_1}<x_1,\cdots,X_{t_n}<x_n)=\prod_{i=1}^n \mathbb{P}(X_{t_i}<x_i), \]

or in terms of distribution function

\[ F_{t_1,\cdots,t_n}(x_1,\cdots,x_n)=\prod_{i=1}^n F_{t_i}(x_i), \]

then we call \(\{X_t\}\) an independent process.

It is really hard to find an independent process in nature, so we introduce the White Noise Process.

Definition for White Noise Process

Assume \(\{X_t\}\) is a stochastic process. If it satisfies

\[ \begin{cases} \mu_X(t)=0,\quad &\forall t\in T \\ C_X(t,s)=0,\quad &\forall t\neq s\in T, \end{cases} \]

then we call \(\{X_t\}\) White Noise Process.

Note that, given the first condition, the second condition can equivalently be written as \(R_X(t,s)=\mu_X(t)\mu_X(s)=0\), \(\forall t\neq s\in T\).
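A sequence of i.i.d. standard normal variables is the standard example of a white noise process. The sketch below (an illustrative construction, not from the text) checks both defining conditions empirically: the covariance vanishes at distinct times while the variance at equal times is \(1\):

```python
import random

random.seed(1)

# i.i.d. N(0, 1) variables at times t = 0, 1, 2 form a white noise process:
# mu_X(t) = 0 for all t, and C_X(t, s) = 0 whenever t != s.
N = 20000
X = [[random.gauss(0, 1) for _ in range(3)] for _ in range(N)]

def cov(i, j):
    """Empirical covariance C_X(i, j) estimated over the N realizations."""
    mi = sum(x[i] for x in X) / N
    mj = sum(x[j] for x in X) / N
    return sum((x[i] - mi) * (x[j] - mj) for x in X) / N
```

Here `cov(0, 1)` should be near \(0\) and `cov(2, 2)` near \(1\), matching \(C_X(t,s)=0\) for \(t\neq s\) and \(\sigma^2_X(t)=1\).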

Independent Increment Process

Definition for Independent Increment Process

Assume \(\{X_t\}\) is a stochastic process. If \(\forall n\geq 3\) and \(t_1<t_2<\cdots<t_n\in T\), the random variables

\[ X_{t_2}-X_{t_1},\cdots,X_{t_n}-X_{t_{n-1}} \]

are mutually independent, then we call \(\{X_t\}\) an Independent Increment Process.

Properties of independent increment process

(i) Normalization. Assume \(\{X_t: t\in [a,b]\}\) is an independent increment process; then \(Y_t:=X_t-X_a\) is still an independent increment process.

(ii) Convolution.

(i) \(\forall a\leq t_1\leq t_2\leq\cdots\leq t_n\leq b\),

\[ Y_{t_i}-Y_{t_{i-1}}=X_{t_i}-X_{t_{i-1}},\quad i=2,\cdots,n \]

are still mutually independent.

Stationary Increment Process

Stationary Increment Process

Assume \(\{X_t: t\in [a,b]\}\) is a stochastic process. If \(\forall t,s\in [a,b]\) and \(\forall h\) with \(t+h,s+h\in [a,b]\), the increments

\[ X_{t+h}-X_t,\quad X_{s+h}-X_s \]

have the same distribution, then we call \(\{X_t\}\) a Stationary Increment Process.

Note that the above condition means that \(\forall t>s\), the distribution of \(X_t-X_s\) depends only on \(t-s\), i.e. on the lag \(h\).

Usually we consider a process with both independent and stationary increments; such a process has good properties.
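A Gaussian random walk is a simple example with both independent and stationary increments. The sketch below (an illustrative simulation; the walk and lag are assumptions, not from the text) compares increments over the same lag \(h=3\) taken at different starting times; stationarity predicts both have the \(N(0,3)\) distribution:

```python
import random

random.seed(2)

def walk(n):
    """One sample path X_0, ..., X_n of a Gaussian random walk (X_0 = 0)."""
    x, path = 0.0, [0.0]
    for _ in range(n):
        x += random.gauss(0, 1)  # i.i.d. N(0, 1) increments
        path.append(x)
    return path

N = 10000
a = []  # increments X_5 - X_2 (lag h = 3)
b = []  # increments X_8 - X_5 (same lag h = 3, later start)
for _ in range(N):
    p = walk(8)
    a.append(p[5] - p[2])
    b.append(p[8] - p[5])

# Both empirical variances should be near 3 = h, as stationarity predicts.
var_a = sum(v * v for v in a) / N
var_b = sum(v * v for v in b) / N
```

Independence of the two increments also holds here, since they are built from disjoint sets of i.i.d. steps.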

Gaussian Process

Definition for Gaussian Process

Assume \(\{X_t\}\) is a stochastic process. If \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\), the random vector \((X_{t_1},\cdots,X_{t_n})\) follows a multivariate Gaussian distribution, then we call \(\{X_t\}\) a Gaussian Process.

Note that a Gaussian process is a second-order moment process, and its finite-dimensional distributions are completely determined by its mean function and covariance function.

Linear invariance of Gaussian Process

Assume \(\{X_t\}\) is a stochastic process. It is a Gaussian Process if and only if \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\), and \(\forall a_0,a_1,\cdots,a_n\in \mathbb{R}\) not all equal to \(0\),

\[ Y=\sum_{i=1}^na_i X_{t_i} +a_0 \]

is a normal random variable.
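The linear invariance can be illustrated with a Brownian-style Gaussian process built from cumulative sums of i.i.d. normal increments (an assumed example, not from the text). Any linear combination of its values is again normal, with mean and variance computable from the mean and covariance functions:

```python
import random

random.seed(3)

# Brownian-style Gaussian process: X_t = sum of t i.i.d. N(0, 1) increments,
# so (X_{t_1}, ..., X_{t_n}) is multivariate normal with Cov(X_s, X_t) = min(s, t).
N = 20000
ys = []
for _ in range(N):
    x, path = 0.0, [0.0]
    for _ in range(4):
        x += random.gauss(0, 1)
        path.append(x)
    # Linear combination Y = X_1 - 2*X_3 is again normal, with
    # Var(Y) = Var(X_1) + 4*Var(X_3) - 4*Cov(X_1, X_3) = 1 + 12 - 4 = 9.
    ys.append(path[1] - 2 * path[3])

mean_y = sum(ys) / N
var_y = sum(y * y for y in ys) / N - mean_y ** 2
```

The empirical mean and variance of `ys` should be close to \(0\) and \(9\), as predicted by the covariance function \(\min(s,t)\).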

Stationary Process

Linear index set

An index set \(T\) is a linear index set if \(\forall t_1,t_2\in T\), \(t_1+t_2\in T\).

This definition guarantees that the index set is closed under addition of times.

Strictly Stationary Process

Assume \(\{X_t: t\in T\}\) is a stochastic process and \(T\) is a linear index set. If \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n, h\in T\), the random vectors \((X_{t_1},\cdots, X_{t_n})\) and \((X_{t_1+h},\cdots,X_{t_n+h})\) share the same distribution, denoted by

\[ (X_{t_1},\cdots,X_{t_n})\overset{d}{=}(X_{t_1+h},\cdots,X_{t_n+h}), \quad \forall h\in T, \]

then we call \(\{X_t\}\) a Strictly Stationary Process. We usually say it has translation invariance.

Properties of Strictly Stationary Process

(i) \(\{X_t\}\) is an identically distributed process.

(ii) If \(\{X_t\}\) is a second-order moment process, then \(\forall t\in T\),

\[ \mu_X(t)=\mu, \quad \sigma^2_X(t)=\sigma^2, \]

and

\[ R_X(t,t+\tau)=R_X(0,\tau)=R(\tau). \]

(i) By definition, taking \(n=1\) gives

\[ F_t(x)=F_{t+h}(x),\quad \forall t,h\in T. \]

(ii) Immediate: since \(X_t\overset{d}{=}X_s\) for all \(t\neq s\in T\), their mathematical characteristics coincide, e.g. \(\mathbb{E}(X_t)=\mathbb{E}(X_s)\), so \(\mu_X(t)=\mu_X(s)=\mu\); the variance and autocorrelation functions follow in the same way.

It is demanding to require a process to keep its distribution invariant over time. In practice, we use the following looser process model, which constrains only the mathematical characteristics.

Wide-sense Stationary Process

Assume \(\{X_t: t\in T\}\) is a stochastic process, and \(T\) is a linear index set. If \(\{X_t\}\) is a second-order moment process, and \(\forall t,\tau\in T\),

\[ \mu_X(t)=\mu,\quad R_X(t,t+\tau)=R(\tau), \]

then we call \(\{X_t\}\) a Wide-sense Stationary Process, or weakly stationary process.
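A classic weakly stationary example (an illustrative assumption, not from the text) is the random phase process \(X_t=\cos(t+U)\) with \(U\sim\mathrm{Uniform}[0,2\pi)\): then \(\mu_X(t)=0\) and \(R_X(t,t+\tau)=\cos(\tau)/2\), which depends only on \(\tau\). A minimal simulation check:

```python
import math
import random

random.seed(4)

# Random phase process: X_t = cos(t + U), U ~ Uniform[0, 2*pi).
# Wide-sense stationary: mu_X(t) = 0 and R_X(t, t+tau) = cos(tau)/2 for all t.
N = 50000
U = [random.uniform(0, 2 * math.pi) for _ in range(N)]

def R(t, tau):
    """Empirical autocorrelation E[X_t X_{t+tau}], averaged over realizations."""
    return sum(math.cos(t + u) * math.cos(t + tau + u) for u in U) / N
```

Evaluating `R` at the same lag but different start times, e.g. `R(0.0, 1.0)` and `R(2.0, 1.0)`, should give nearly equal values close to \(\cos(1)/2\).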

Relationship of strictly and weakly stationary processes

(i) A strictly stationary process with second-order moment is a weakly stationary process.

(ii) For Gaussian process, strictly and weakly stationary processes are equivalent.

(i) Immediate from property (ii) of the strictly stationary process.

(ii) A Gaussian process \(\{X_t\}\) is automatically a second-order moment process, so if it is strictly stationary, then it must be weakly stationary.

Conversely, if a Gaussian process \(\{X_t\}\) is weakly stationary, then its mean and covariance functions are invariant under time shifts, so its finite-dimensional distributions are as well; hence it is strictly stationary.

Poisson Process