
Brownian Motion

The following definition is similar to the definition of the Poisson process.

Brownian Motion (Wiener Process)

Assume \(B_t\) is a real-valued stochastic process. It is called a Brownian motion with parameter \(\sigma^2\) if

(i) \(B(0)=0\),

(ii) Independent increments. \(\forall n\geq 1,\ 0<t_1<\cdots<t_n\),

\[ B(t_1), B(t_2)-B(t_1),\cdots,B(t_n)-B(t_{n-1}) \]

are mutually independent.

(iii) Stationary increments. For \(s<t\), \(B(t)-B(s)\) and \(B(t-s)\) have the same distribution.

(iv) Normal distribution. \(\forall t\), \(B(t)\sim N(0,\sigma^2 t)\).

If \(\sigma^2=1\), we call the above process a standard Brownian motion.
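
As an illustration, here is a minimal NumPy sketch (the horizon and step count are arbitrary choices) that builds a standard Brownian path from independent, stationary Gaussian increments, matching conditions (i)–(iv):

```python
import numpy as np

rng = np.random.default_rng(0)

T, n = 1.0, 1000              # horizon and number of steps (arbitrary choices)
dt = T / n
t = np.linspace(0.0, T, n + 1)

# Independent, stationary, Gaussian increments: B(t_{k+1}) - B(t_k) ~ N(0, dt)
increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=n)

# B(0) = 0, and cumulative sums give the path at the grid points
B = np.concatenate(([0.0], np.cumsum(increments)))
print(B[:5])
```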

Properties of Brownian Motion

Now we assume \(\sigma^2=1\).

(i) Mean and variance. \(EB_t=0\), \(\mathrm{Var}(B_t)=\sigma^2t=t\).

(ii) Covariance function. If \(s<t\), we have

\[ r_B(s,t)=EB_s B_t=EB_s\{[B(t)-B(s)]+B(s)\}=EB_s[B(t)-B(s)]+EB_s^2=0+s=s. \]

So \(r_B(s,t)=s\wedge t\). This could also be used to define a Brownian motion.
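
This covariance can be checked with a small Monte Carlo sketch (illustrative values of \(s,t\); the pair \((B_s,B_t)\) is sampled exactly via the independent increment \(B_t-B_s\)):

```python
import numpy as np

rng = np.random.default_rng(1)
s, t, N = 0.4, 0.9, 200_000           # illustrative values with s < t

Z1 = rng.standard_normal(N)
Z2 = rng.standard_normal(N)
Bs = np.sqrt(s) * Z1                  # B(s) ~ N(0, s)
Bt = Bs + np.sqrt(t - s) * Z2         # independent increment B(t) - B(s) ~ N(0, t - s)

print(np.mean(Bs * Bt), min(s, t))    # empirical E[B_s B_t] vs. s ∧ t
```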

(iii) Distribution. Since \(B_t\sim N(0,t)\), we have density function

\[ p(t,x)=\frac{1}{\sqrt{2\pi t}}e^{-\frac{x^2}{2t}}. \]

In actual calculations, we usually take \(Z\sim N(0,1)\); then \(B_t\overset{d}{=}\sqrt{t}\,Z\) (equality in distribution).

(iv) Brownian motion is a Gaussian process. To elaborate, \(\forall n,\ 0<t_1<\cdots<t_n\), the increments

\[ B(t_1), B(t_2)-B(t_1),\cdots, B(t_n)-B(t_{n-1}) \]

are mutually independent Gaussian random variables, so their joint distribution is Gaussian. Since the vector

\[ B(t_1),\cdots,B(t_n) \]

is a linear transformation of these increments, its joint distribution is also Gaussian. Moreover, \((B(t_1),\cdots,B(t_n))\sim N(\pmb{0}, \pmb{\Sigma}_n)\), where

\[ \pmb{\Sigma}_n=(t_i\wedge t_j)_{n\times n}. \]
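
A short sketch (arbitrary time grid) that samples the finite-dimensional vector directly from \(N(\pmb{0},\pmb{\Sigma}_n)\) and compares the empirical covariance matrix with \(\pmb{\Sigma}_n\):

```python
import numpy as np

rng = np.random.default_rng(2)
times = np.array([0.2, 0.5, 0.7, 1.3])             # arbitrary 0 < t_1 < ... < t_n

# Sigma_n with entries t_i ∧ t_j
Sigma = np.minimum.outer(times, times)

# Draw samples of (B(t_1), ..., B(t_n)) ~ N(0, Sigma_n)
samples = rng.multivariate_normal(mean=np.zeros(len(times)), cov=Sigma, size=100_000)

print(np.round(np.cov(samples, rowvar=False), 3))  # should be close to Sigma
print(Sigma)
```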

Test Criterion

Equivalent Proposition for Brownian Motion

Assume \(X_t\) is a real-valued Gaussian process. If \(EX_t=0\) and \(r_X(s,t)=s\wedge t\), then \(X\) is a standard Brownian motion.

  • \(X_0=0\). This follows from \(X_0\sim N(0,0)\), a degenerate Gaussian distribution, so \(X_0=0\) a.s.

  • Independent increments. \(\forall n\), \(0<t_1<\cdots<t_n\), the increments \(X(t_1), X(t_2)-X(t_1),\cdots, X(t_n)-X(t_{n-1})\) are mutually independent because their joint distribution is Gaussian and any two of them have zero covariance: for \(i<j\) (with \(t_0=0\)), \(E[X(t_i)-X(t_{i-1})][X(t_j)-X(t_{j-1})]=t_i\wedge t_j-t_i\wedge t_{j-1}-t_{i-1}\wedge t_j+t_{i-1}\wedge t_{j-1}=t_i-t_i-t_{i-1}+t_{i-1}=0\).

  • Stationary increments. For \(s<t\), \(X(t)-X(s)\) is Gaussian with mean \(0\) and variance \(r_X(t,t)-2r_X(s,t)+r_X(s,s)=t-2s+s=t-s\), so \(X(t)-X(s)\sim N(0,t-s)\), which is exactly the distribution of \(X(t-s)\).

  • Normal distribution. \(X_t\sim N(0,t)\).
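
The criterion can also be sanity-checked numerically: the following sketch (arbitrary time grid) draws a zero-mean Gaussian vector with covariance \(s\wedge t\) and verifies that the resulting increments are uncorrelated with variances equal to the time gaps.

```python
import numpy as np

rng = np.random.default_rng(3)
times = np.array([0.3, 0.6, 1.0, 1.5])
Sigma = np.minimum.outer(times, times)            # r_X(s, t) = s ∧ t

X = rng.multivariate_normal(np.zeros(len(times)), Sigma, size=100_000)

# Increments X(t_1), X(t_2)-X(t_1), ..., X(t_n)-X(t_{n-1})
inc = np.diff(np.concatenate([np.zeros((X.shape[0], 1)), X], axis=1), axis=1)

print(np.round(np.cov(inc, rowvar=False), 3))     # ~ diag(t_1, t_2-t_1, ...), off-diagonals ~ 0
print(np.diff(np.concatenate([[0.0], times])))
```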

Corollary: some transformations of Brownian motion

Assume \(B_t\) is a standard Brownian motion; then each of the following transformations is also a standard Brownian motion.

(i) Given \(t_0>0\), a motion starting at \(t_0\)

\[ X_t=B(t+t_0)-B(t_0). \]

(ii) Self-Similarity. Given a constant \(c>0\),

\[ X_t=\frac{1}{c}B(c^2t). \]

(iii) Time inversion (symmetry between \(0\) and \(\infty\)).

\[ X_t=\tilde{B}_t=\begin{cases} tB(1/t),\quad &t>0,\\ 0,\quad &t=0. \end{cases} \]

Just apply the above test criterion. For (i), it is simply a time translation.

For (ii), it is a scaling transformation.

For (iii), we first prove that it is a Gaussian process. For all \(n\), \(0<t_1<\cdots<t_n\), since the joint distribution of

\[ B(1/t_1),B(1/t_2),\cdots, B(1/t_n) \]

is Gaussian by assumption, the linearly transformed vector

\[ t_1 B(1/t_1), t_2B(1/t_2)-t_1B(1/t_1),\cdots , t_nB(1/t_n)-t_{n-1}B(1/t_{n-1}) \]

is also jointly Gaussian, so \(X\) is a Gaussian process. It is easy to check its mean and covariance: \(EX_t=E[tB(1/t)]=0\), and for \(0<s<t\) (so that \(1/t<1/s\)),

\[ \begin{align*} r_X(s,t)=EX_tX_s&=st\,EB(1/t)B(1/s)\\ &=st\,E\{B(1/t)[B(1/s)-B(1/t)+B(1/t)]\}\\ &=st\,\{EB(1/t)[B(1/s)-B(1/t)]+EB(1/t)^2\}\\ &=st\cdot \frac{1}{t}=s, \end{align*} \]

that is, \(r_X(s,t)=s\wedge t\).
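
For a numerical check of the time-inversion covariance just computed (illustrative values of \(s<t\), so \(1/t<1/s\)), sample \((B(1/t),B(1/s))\) exactly and compare the empirical \(EX_sX_t\) with \(s\wedge t\):

```python
import numpy as np

rng = np.random.default_rng(4)
s, t, N = 0.5, 2.0, 200_000           # illustrative values with s < t, hence 1/t < 1/s

Z1 = rng.standard_normal(N)
Z2 = rng.standard_normal(N)
B_inv_t = np.sqrt(1.0 / t) * Z1                      # B(1/t) ~ N(0, 1/t)
B_inv_s = B_inv_t + np.sqrt(1.0 / s - 1.0 / t) * Z2  # independent increment on [1/t, 1/s]

Xs, Xt = s * B_inv_s, t * B_inv_t                    # X_u = u B(1/u)
print(np.mean(Xs * Xt), min(s, t))                   # empirical r_X(s, t) vs. s ∧ t
```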

Here we give some Related Processes derived from standard Brownian motion.

Related Processes

Assume \(B_t\) is a standard Brownian motion; then the following derived processes are commonly used.

(i) Brownian Bridge.

\[ B^0(t)=B(t)-B(1)t, \quad 0\leq t\leq 1. \]

(ii) Reflected Brownian motion.

\[ X_t=|B_t|,\quad t\geq 0. \]

(iii) Geometric Brownian motion. Given \(\alpha, \beta\in \mathbb{R}\),

\[ X_t=e^{\alpha t +\beta B_t},\quad t\geq 0. \]

(iv) Integrated Brownian motion.

\[ X_t=\int_0^t B_s\, ds,\quad t\geq 0. \]
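
All four processes above can be generated from a single simulated path; the following sketch uses arbitrary discretization and parameter choices, and approximates the integral in (iv) by a left Riemann sum.

```python
import numpy as np

rng = np.random.default_rng(5)
T, n = 1.0, 1000
dt = T / n
t = np.linspace(0.0, T, n + 1)

# One standard Brownian path on [0, 1]
B = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))))

bridge    = B - t * B[-1]                     # (i)   Brownian bridge B(t) - t B(1)
reflected = np.abs(B)                         # (ii)  reflected Brownian motion |B_t|
alpha, beta = 0.1, 0.3                        # arbitrary parameters for (iii)
geometric = np.exp(alpha * t + beta * B)      # (iii) geometric Brownian motion
integrated = np.concatenate(([0.0], np.cumsum(B[:-1] * dt)))  # (iv) left Riemann sum of ∫_0^t B_s ds

print(bridge[-1], reflected[-1], geometric[-1], integrated[-1])
```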

We now describe the distributions and moments of the above processes.

(i) The Brownian bridge \(X_t=B^0(t)\) is still a Gaussian process, with \(EX_t=EB(t)-t\,EB(1)=0\), and for \(0<s<t<1\)

\[ \begin{align*} r_X(s,t)&=E[B(s)-B(1)s][B(t)-B(1)t]\\ &=EB_sB_t-sEB(1)B(t)-tEB(1)B(s)+stEB_1^2\\ &=s-st-st+st=s-st \end{align*} \]

so \(r_X(s,t)=(s\wedge t)\,(1-(s\lor t))\). This could also be used to characterize the Brownian bridge as a Gaussian process.
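
A quick empirical check of this covariance (illustrative \(s,t\); the triple \((B_s,B_t,B_1)\) is sampled exactly through independent increments):

```python
import numpy as np

rng = np.random.default_rng(6)
s, t, N = 0.3, 0.7, 200_000

Z = rng.standard_normal((3, N))
Bs = np.sqrt(s) * Z[0]
Bt = Bs + np.sqrt(t - s) * Z[1]
B1 = Bt + np.sqrt(1.0 - t) * Z[2]

X_s, X_t = Bs - s * B1, Bt - t * B1              # Brownian bridge at s and t
print(np.mean(X_s * X_t), min(s, t) * (1 - max(s, t)))
```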

(ii) For the reflected Brownian motion, we can write down its distribution function

\[ F_{X_t}(x)=\begin{cases} 0,\quad &x\leq 0,\\ P(|B_t|\leq x),\quad &x>0, \end{cases} \]

where for \(x>0\),

\[ P(|B_t|\leq x)=P(-x\leq B_t\leq x)=2\Phi(x/\sqrt{t})-1. \]

with density function (obtained by differentiating the CDF above)

\[ p(t,x)=\begin{cases} 0,\quad &x\leq 0\\ \sqrt{\frac{2}{\pi t}}e^{-\frac{x^2}{2t}},\quad &x>0. \end{cases} \]

So its mean is

\[ EX_t=\int_\mathbb{R} |x|\frac{1}{\sqrt{2\pi t}}e^{-\frac{x^2}{2t}}dx=2t\int_0^\infty \frac{1}{\sqrt{2\pi t}}e^{-\frac{x^2}{2t}}\,d\!\left(\frac{x^2}{2t}\right)=\frac{2t}{\sqrt{2\pi t}}=\sqrt{2t/\pi}. \]
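
This mean can be checked directly against samples, writing \(B_t\overset{d}{=}\sqrt{t}\,Z\) (illustrative \(t\)):

```python
import numpy as np

rng = np.random.default_rng(7)
t, N = 2.0, 500_000

Bt = np.sqrt(t) * rng.standard_normal(N)            # B_t ~ N(0, t)
print(np.mean(np.abs(Bt)), np.sqrt(2 * t / np.pi))  # empirical E|B_t| vs. sqrt(2t/pi)
```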

(iii) The distribution is a little tedious to write out (it is lognormal, since \(\alpha t+\beta B_t\sim N(\alpha t,\beta^2 t)\)). But for the mean, with \(Z\sim N(0,1)\) we have

\[ Ee^{\alpha t+\beta Z}=e^{\alpha t}\int_\mathbb{R} e^{\beta x}\frac{1}{\sqrt{2\pi}}e^{-\frac{x^2}{2}}dx=e^{\alpha t+\beta^2/2} \]

So \(EX_t=Ee^{\alpha t+\beta B_t}=Ee^{\alpha t+(\beta\sqrt{t}) Z}=e^{\alpha t+\beta^2t/2}\).
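
Similarly, a Monte Carlo sketch for the geometric Brownian motion mean (arbitrary \(\alpha,\beta,t\)):

```python
import numpy as np

rng = np.random.default_rng(8)
alpha, beta, t, N = 0.1, 0.4, 1.5, 500_000

Bt = np.sqrt(t) * rng.standard_normal(N)          # B_t ~ N(0, t)
print(np.mean(np.exp(alpha * t + beta * Bt)),     # empirical E[e^{alpha t + beta B_t}]
      np.exp(alpha * t + beta**2 * t / 2))        # theoretical value
```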

Maximum & First hitting distribution