Basic Processes¶
Introduction¶
Definition of Stochastic Process
Assume \((\Omega,\mathscr{F}, \mathbb{P})\) is a probability space and \(T\) is an index set of real parameters. If \(\forall t\in T\) there exists a random variable \(X_t(\omega)\) defined on \((\Omega,\mathscr{F}, \mathbb{P})\), then we call the collection of random variables \(\{X_t: t\in T\}\) a Stochastic Process.
Fixing \(t\), \(X_t(\omega)\) is an ordinary random variable. Fixing \(\omega\) instead, \(X_t(\omega)\) is a function on \(T\), which is called a sample path. Different from a random vector, the random variables of a stochastic process are all defined on the same probability space.
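The following minimal sketch (my own illustration; the toy process \(X_t=\sin t+\varepsilon_t\) and all parameters are just assumptions) stores a simulated process as a matrix whose rows are sample paths (fixed \(\omega\)) and whose columns are random variables (fixed \(t\)):

```python
import numpy as np

rng = np.random.default_rng(0)

T = np.linspace(0.0, 2.0 * np.pi, 50)   # index set: a finite grid of t values
n_omega = 1000                          # number of simulated outcomes omega

# Toy process X_t(omega) = sin(t) + noise(omega, t); rows index omega, columns index t.
X = np.sin(T)[None, :] + rng.normal(scale=0.3, size=(n_omega, T.size))

sample_path = X[0, :]   # fix omega: a function of t (one sample path)
random_var  = X[:, 10]  # fix t: a sample from the random variable X_{t_10}

print(sample_path.shape, random_var.shape)  # (50,), (1000,)
```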
According to whether the index set \(T\) and the range \(S\) of \(X_t\) are countable or not, we have the following categories.
| | \(S\) denumerable | \(S\) are intervals |
|---|---|---|
| \(T\) denumerable | discrete time and state | discrete time, continuous state |
| \(T\) are intervals | continuous time, discrete state | continuous time and state |
Mathematical characteristics¶
Usually \(T\) is not countable, so we cannot use the joint distribution of all the random variables at once to describe the statistical law.
Definition of finite-dimensional distribution
Assume \(\{X_t: t\in T\}\) is a stochastic process. \(\forall n\geq 1\), \(\forall t_1,t_2,\cdots,t_n\in T\), the random vector \((X_{t_1},\cdots,X_{t_n})\) has its joint distribution function

\[F_{t_1,\cdots,t_n}(x_1,\cdots,x_n)=\mathbb{P}(X_{t_1}\leq x_1,\cdots,X_{t_n}\leq x_n).\]

Collect all these functions into

\[\mathscr{H}=\{F_{t_1,\cdots,t_n}(x_1,\cdots,x_n): n\geq 1,\ t_1,\cdots,t_n\in T\},\]

which is called the Finite-dimensional Distribution collection of \(\{X_t: t\in T\}\).
Properties of Finite-dimensional Distribution
(i) Transverse Consistency. For any permutation \(\{j_1,\cdots,j_n\}\) of the sequence \(\{1,2,\cdots,n\}\),

\[F_{t_{j_1},\cdots,t_{j_n}}(x_{j_1},\cdots,x_{j_n})=F_{t_1,\cdots,t_n}(x_1,\cdots,x_n).\]
(ii) Longitudinal Consistency. \(\forall m<n \in \mathbb{N}^+\),

\[\lim_{x_{m+1},\cdots,x_n\to+\infty}F_{t_1,\cdots,t_n}(x_1,\cdots,x_n)=F_{t_1,\cdots,t_m}(x_1,\cdots,x_m).\]
Kolmogorov Theorem
Assume we have a finite-dimensional distribution collection \(\mathscr{H}\). If it satisfies transverse and longitudinal consistency, then there exists a probability space \((\Omega,\mathscr{F}, \mathbb{P})\) and a stochastic process \(\{X_t\}\) defined on it, such that \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\),

\[\mathbb{P}(X_{t_1}\leq x_1,\cdots,X_{t_n}\leq x_n)=F_{t_1,\cdots,t_n}(x_1,\cdots,x_n)\in\mathscr{H}.\]
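As a quick worked example: take any one-dimensional distribution function \(F\) and define the candidate family

\[F_{t_1,\cdots,t_n}(x_1,\cdots,x_n)=\prod_{i=1}^nF(x_i).\]

Permuting the indices only permutes the factors, and letting \(x_{m+1},\cdots,x_n\to+\infty\) sends the corresponding factors to \(1\), so both consistency conditions hold. Kolmogorov's theorem then guarantees a process whose components are i.i.d. with marginal distribution \(F\).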
Mathematical Characteristics for one-dimensional distribution
Assume \(X_t\) has a distribution function \(F_t(x)\), \(\forall t\in T\). Define the mean function

\[\mu_X(t)=\mathbb{E}(X_t)\]

and the variance function

\[\sigma_X^2(t)=\mathrm{Var}(X_t)=\mathbb{E}\big[(X_t-\mu_X(t))^2\big].\]

These are similar to the corresponding definitions for random variables.
Mathematical Characteristics for two-dimensional distribution
Assume \((X_{t}, X_s)\) has a joint distribution function \(F_{t,s}(x_t,x_s)\), \(\forall t,s\in T\). Define the autocorrelation function

\[R_X(t,s)=\mathbb{E}(X_tX_s)\]

and the covariance function

\[C_X(t,s)=\mathrm{Cov}(X_t,X_s)=\mathbb{E}\big[(X_t-\mu_X(t))(X_s-\mu_X(s))\big]=R_X(t,s)-\mu_X(t)\mu_X(s).\]

These are similar to the definitions for a Random Vector.
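A rough Monte Carlo sketch of these characteristics (my own illustration; the toy process and sample sizes below are assumptions): averaging over many simulated sample paths estimates \(\mu_X(t)\), \(\sigma_X^2(t)\), \(R_X(t,s)\) and \(C_X(t,s)\).

```python
import numpy as np

rng = np.random.default_rng(1)

T = np.linspace(0.0, 1.0, 20)
n_omega = 5000

# Toy process: X_t = t * Z + noise, with Z shared along each path (assumed example).
Z = rng.normal(size=(n_omega, 1))
X = T[None, :] * Z + rng.normal(scale=0.1, size=(n_omega, T.size))

mu = X.mean(axis=0)                                # mean function  mu_X(t)
var = X.var(axis=0)                                # variance function sigma_X^2(t)
R = (X[:, :, None] * X[:, None, :]).mean(axis=0)   # autocorrelation R_X(t, s) = E[X_t X_s]
C = R - np.outer(mu, mu)                           # covariance C_X(t, s) = R_X(t, s) - mu_X(t) mu_X(s)

print(np.allclose(np.diag(C), var, atol=1e-8))     # diagonal of C is the variance function
```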
The following theorem gives a necessary and sufficient condition for these mathematical characteristics to exist.
Theorem for existence of two-dimensional mathematical characteristics
Stochastic process \(\{X_t\}\) has two-dimensional mathematical characteristics, iff

\[\mathbb{E}(X_t^2)<\infty,\quad\forall t\in T.\]

If \(\{X_t\}\) satisfies the above condition, we call it a Second-order Moment Process.
- "\(\Rightarrow\)". Easy to see. Let \(t=s\), and using autocorrelation function
- "\(\Leftarrow\)". Using Cauchy-Schwarz Inequation. If \(\mathbb{E}X_t^2<\infty\), then
and
then use the above function to prove the existence of other items. Like \(|C_X(t,s)|=|R_X(t,s)-\mu_X(t)\mu_X(s)|<\infty\), \(|\sigma_X(t)|=|\mathbb{E}(X_t^2)-(\mathbb{E}X_t)^2|<\infty\).
Independent Process¶
Definition for independent process
Assume \(\{X_t\}\) is a stochastic process. If \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\) and \(x_1,\cdots,x_n\in \mathbb{R}\),

\[\mathbb{P}(X_{t_1}\leq x_1,\cdots,X_{t_n}\leq x_n)=\prod_{i=1}^n\mathbb{P}(X_{t_i}\leq x_i),\]

or in terms of distribution functions,

\[F_{t_1,\cdots,t_n}(x_1,\cdots,x_n)=\prod_{i=1}^nF_{t_i}(x_i),\]

then we call \(\{X_t\}\) an independent process.
It is really hard to find an independent process in nature, so we introduce the White Noise Process.
Definition for White Noise Process
Assume \(\{X_t\}\) is a stochastic process. If it satisfies

\[\mu_X(t)=0,\quad\forall t\in T\qquad\text{and}\qquad C_X(t,s)=0,\quad\forall t\neq s\in T,\]

then we call \(\{X_t\}\) a White Noise Process.
Note that the second condition can be interpreted as \(R_X(t,s)=\mu_X(t)\mu_X(s)=0,\ \forall t\neq s\in T\), i.e. the variables are pairwise uncorrelated.
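A minimal sketch (my own; the choice of i.i.d. Gaussian noise and the parameters are assumptions) of one concrete white noise process: the ensemble mean is approximately \(0\) and the sample covariance between different time points is approximately \(0\).

```python
import numpy as np

rng = np.random.default_rng(2)

n_omega, n_t = 20000, 6
sigma = 1.5

# i.i.d. N(0, sigma^2) variables: an independent process, hence in particular white noise.
X = rng.normal(scale=sigma, size=(n_omega, n_t))

mu_hat = X.mean(axis=0)              # approx mu_X(t) = 0
C_hat = np.cov(X, rowvar=False)      # approx C_X(t, s): sigma^2 on the diagonal, 0 off it

print(np.round(mu_hat, 2))
print(np.round(C_hat, 2))
```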
Independent Increment Process¶
Definition for Independent Increment Process
Assume \(\{X_t\}\) is a stochastic process. If \(\forall n\geq 3\), \(t_1<t_2<\cdots<t_n\in T\), the random variables

\[X_{t_2}-X_{t_1},\ X_{t_3}-X_{t_2},\ \cdots,\ X_{t_n}-X_{t_{n-1}}\]

are mutually independent, then we call \(\{X_t\}\) an Independent Increment Process.
Properties of independent increment process
(i) Normalization. Assume \(\{X_t: t\in [a,b]\}\) is an independent increment process; then \(Y_t:=X_t-X_a\) is still an independent increment process, with \(Y_a=0\).
(ii) Convolution. \(\forall a\leq s<t\leq b\), since \(X_t-X_a=(X_s-X_a)+(X_t-X_s)\) with independent summands, the distribution of \(X_t-X_a\) is the convolution of the distributions of \(X_s-X_a\) and \(X_t-X_s\).
(i) \(\forall a\leq t_1\leq t_2\leq\cdots\leq t_n\leq b\), the increments

\[Y_{t_1}=X_{t_1}-X_a,\quad Y_{t_2}-Y_{t_1}=X_{t_2}-X_{t_1},\ \cdots,\ Y_{t_n}-Y_{t_{n-1}}=X_{t_n}-X_{t_{n-1}}\]

are increments of \(X\) over disjoint intervals, hence still mutually independent.
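A small simulation sketch (my own toy example, with assumed parameters): partial sums of i.i.d. steps form an independent increment process, and increments over disjoint index intervals have near-zero sample correlation.

```python
import numpy as np

rng = np.random.default_rng(3)

n_omega, n_t = 20000, 100
steps = rng.normal(size=(n_omega, n_t))
X = np.cumsum(steps, axis=1)   # X_{t_k} = sum of the first k i.i.d. steps

# Increments over the disjoint index intervals (10, 40] and (40, 90].
inc1 = X[:, 40] - X[:, 10]
inc2 = X[:, 90] - X[:, 40]

print(round(np.corrcoef(inc1, inc2)[0, 1], 3))  # approximately 0: the increments are independent
```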
Stationary Increment Process¶
Stationary Increment Process
Assume \(\{X_t: t\in [a,b]\}\) is a stochastic process. If \(\forall t,s,h\in [a,b]\) with \(t+h,s+h\in [a,b]\), the increments

\[X_{t+h}-X_{s+h}\quad\text{and}\quad X_t-X_s\]

have the same distribution, then we call \(\{X_t\}\) a Stationary Increment Process.
Note that the above condition also means that \(\forall t>s\), the distribution of \(X_t-X_s\) depends only on the difference \(t-s\), i.e. on \(h\).
Usually we consider a process with both independent and stationary increments; such a process has some good properties.
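As a rough sketch of such a process (again my own toy example with assumed parameters), partial sums of i.i.d. steps also have stationary increments: increments over intervals of equal length share the same distribution.

```python
import numpy as np

rng = np.random.default_rng(4)

n_omega, n_t = 20000, 100
X = np.cumsum(rng.normal(size=(n_omega, n_t)), axis=1)   # i.i.d. partial sums

# Two increments over intervals of equal length 30, shifted by h = 25.
inc_a = X[:, 50] - X[:, 20]
inc_b = X[:, 75] - X[:, 45]

# Same mean and variance (in fact the same N(0, 30) distribution).
print(round(inc_a.mean(), 2), round(inc_b.mean(), 2))
print(round(inc_a.var(), 2), round(inc_b.var(), 2))
```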
Gaussian Process¶
Definition for Gaussian Process
Assume \(\{X_t\}\) is a stochastic process. If \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\), the random vector \((X_{t_1},\cdots,X_{t_n})\) follows a Gaussian distribution, then we call \(\{X_t\}\) a Gaussian Process.
Note that a Gaussian process is a second-order moment process, and its finite-dimensional distributions are totally determined by its mean function and covariance function.
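A minimal sketch of this fact (my own illustration; the zero mean function and the squared-exponential covariance are assumed examples): on a finite grid, \((X_{t_1},\cdots,X_{t_n})\) is just a multivariate normal with mean \(\mu_X(t_i)\) and covariance \(C_X(t_i,t_j)\), so sample paths can be drawn directly from those two functions.

```python
import numpy as np

rng = np.random.default_rng(5)

t = np.linspace(0.0, 1.0, 30)

mu = np.zeros_like(t)                                        # assumed mean function mu_X(t) = 0
C = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1 ** 2) # assumed covariance C_X(t, s)
C += 1e-10 * np.eye(t.size)                                  # tiny jitter for numerical stability

# Finite-dimensional distribution is N(mu, C); draw a few sample paths on the grid.
paths = rng.multivariate_normal(mu, C, size=5)
print(paths.shape)  # (5, 30)
```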
Linear invariance of Gaussian Process
Assume \(\{X_t\}\) is a stochastic process. It is a Gaussian Process, iff \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n\in T\), \(\forall a_1,\cdots,a_n\in \mathbb{R}\) not all equal to \(0\) at the same time, the linear combination

\[\sum_{i=1}^n a_iX_{t_i}\]

is a normal random variable.
Stationary Process¶
Linear index set
Index set \(T\) is a linear index set, if \(\forall t_1,t_2\in T\), \(t_1+t_2\in T\).
The above definition guarantees that the index set is closed under addition, so time shifts stay inside \(T\).
Strictly Stationary Process
Assume \(\{X_t: t\in T\}\) is a stochastic process, and \(T\) is a linear index set. If \(\forall n\geq 1\), \(\forall t_1,\cdots,t_n, h\in T\), the random vectors \((X_{t_1},\cdots, X_{t_n})\) and \((X_{t_1+h},\cdots,X_{t_n+h})\) share the same distribution, denoted by

\[(X_{t_1},\cdots, X_{t_n})\overset{d}{=}(X_{t_1+h},\cdots,X_{t_n+h}),\]

then we call \(\{X_t\}\) a Strictly Stationary Process. We usually say it has translation invariance.
Properties of Strictly Stationary Process
(i) \(\{X_t\}\) is an Identically Distributed Process.
(ii) If \(\{X_t\}\) is a second-order moment process, then \(\forall t,\tau\in T\), the mean function is constant,

\[\mu_X(t)\equiv C,\]

and the autocorrelation function depends only on the lag,

\[R_X(t,t+\tau)=R_X(\tau).\]
(i) By definition, let \(n=1\); we have

\[X_t\overset{d}{=}X_{t+h},\quad\forall t,h\in T,\]

so all the \(X_t\) share the same distribution.
(ii) Easy to see. Since \(X_t\overset{d}{=}X_s\ (t\neq s\in T)\), their mathematical characteristics agree, \(\mathbb{E}(X_t)=\mathbb{E}(X_s)\), so \(\mu_X(t)=\mu_X(s)=C\). Similarly, \((X_t,X_{t+\tau})\overset{d}{=}(X_s,X_{s+\tau})\), so \(R_X(t,t+\tau)=R_X(s,s+\tau)\), i.e. \(R_X\) depends only on \(\tau\).
It is hard to require a process to have its whole distribution invariant in time. In practice, we use the following looser process model, which only constrains the mathematical characteristics.
Wide-sense Stationary Process
Assume \(\{X_t: t\in T\}\) is a stochastic process, and \(T\) is a linear index set. If \(\{X_t\}\) is a second-order moment process, and \(\forall t,\tau\in T\),

\[\mu_X(t)\equiv C\qquad\text{and}\qquad R_X(t,t+\tau)=R_X(\tau),\]

then we call \(\{X_t\}\) a Wide-sense Stationary Process, or weakly stationary process.
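A rough ensemble check (my own sketch, reusing Gaussian white noise as the stationary process; the sample sizes are assumptions): the estimated mean function is constant in \(t\), and the estimated \(R_X(t,t+\tau)\) depends only on the lag \(\tau\).

```python
import numpy as np

rng = np.random.default_rng(6)

n_omega, n_t = 50000, 8
X = rng.normal(size=(n_omega, n_t))     # Gaussian white noise: a weakly stationary process

mu_hat = X.mean(axis=0)                 # mean function: approximately constant (0)
print(np.round(mu_hat, 2))

for tau in (0, 2):
    # R_X(t, t + tau) estimated for every admissible t: the values depend on tau, not on t.
    R_lag = [(X[:, t] * X[:, t + tau]).mean() for t in range(n_t - tau)]
    print(tau, np.round(R_lag, 2))
```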
Relationship of strictly and weakly stationary processes
(i) A strictly stationary process with finite second-order moments is a weakly stationary process.
(ii) For Gaussian process, strictly and weakly stationary processes are equivalent.
(i) Easy to see: apply property (ii) of a strictly stationary process.
(ii) A Gaussian process \(\{X_t\}\) is always a second-order moment process, so if it is strictly stationary, then by (i) it is weakly stationary.
Conversely, if the Gaussian process \(\{X_t\}\) is weakly stationary, then its mean and autocorrelation (hence covariance) functions are invariant under time shifts; since the finite-dimensional distributions of a Gaussian process are determined by these characteristics, they are also shift-invariant, so \(\{X_t\}\) is strictly stationary.