
Poisson Process

Definition of Poisson Process

Assume a stochastic process \(\{N_t: t\geq 0\}\) has state space \(S=\{0,1,2,\cdots\}\). If it satisfies

(i) \(N(0)=0\).

(ii) independent increment (additive).

(iii) \(\forall s,t\geq 0\), \(N(s+t)-N(s) \sim \pi(\lambda t),\lambda>0\),

then we call \(\{N_t\}\) a Poisson process.

Equivalent Definition of Poisson Process

A stochastic process \(\{N_t: t\geq 0\}\) is a Poisson process if and only if it satisfies

(i) \(N(0)=0\).

(ii) independent increment (additive).

(iii) \(\mathbb{P}(N(h+t)-N(t)=1)=\lambda h + o(h),\quad h>0\).

(iv) \(\mathbb{P}(N(h+t)-N(t)\geq 2)=o(h), \quad h>0\).

One direction is easy to see. Using \(e^{-\lambda h}=1-\lambda h+o(h)\), we have

\[ \mathbb{P}(N(h+t)-N(t)=1)=e^{-\lambda h} \frac{(\lambda h)^1}{1!}=\lambda h (1-\lambda h + o(h))=\lambda h +o(h), \]

and

\[ \begin{align*} \mathbb{P}(N(h+t)-N(t)\geq 2)&= 1-\mathbb{P}(N(h+t)-N(t)\leq 1)\\ &=1-e^{-\lambda h}(1+\lambda h)\\ &=1-(1-\lambda h + o(h))(1+\lambda h)=o(h). \end{align*} \]
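These small-\(h\) rates can be checked numerically from the exact Poisson probabilities (a sketch; the rate \(\lambda=2\) and the grid of \(h\) values are arbitrary choices):

```python
import math

lam = 2.0
for h in [0.1, 0.01, 0.001]:
    p1 = lam * h * math.exp(-lam * h)            # P(increment = 1), exact
    p2 = 1 - math.exp(-lam * h) * (1 + lam * h)  # P(increment >= 2), exact
    # Both (p1 - lam*h)/h and p2/h should tend to 0 as h -> 0.
    print(h, (p1 - lam * h) / h, p2 / h)
```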

The converse direction is a little harder, for we have to show that \(N(t+h)-N(t)\) follows a Poisson distribution.

First, notice the following fact about the counting process.

Probability at a point equals zero

\[ \begin{align*} P(N_{(t-h,t]}=0)&=P(N_{[0,t]}-N_{[0,t-h]}=0)\\ &=1-\lambda h +o(h)\rightarrow 1 \quad (h\rightarrow 0), \end{align*} \]

which means \(P(N_{[0,t)}=N_{[0,t]})=1\), i.e. with probability one no arrival occurs exactly at the point \(t\).

From now on, we count arrivals on intervals without worrying about whether the endpoints are included.

Mathematical Characteristic

Here we use the generating function to derive its moments. Recall that for a random variable \(\xi\) following the Poisson distribution with parameter \(\lambda\), we have

\[ \psi_\xi(s)=e^{\lambda (s-1)}. \]

Note that, for the moments, we have

\[ \begin{align*} E(s^\xi)&=\psi_\xi(s)=e^{\lambda (s-1)},\\ E(\xi s^{\xi-1})&=\psi'_\xi(s)=\lambda e^{\lambda (s-1)}\\ E[\xi (\xi-1) s^{\xi-2}]&=\psi''_\xi(s)=\lambda^2 e^{\lambda (s-1)}\\ \end{align*} \]

Letting \(s\nearrow 1\) (monotone convergence), we get \(E(\xi)=\lambda\), \(E(\xi^2-\xi)=\lambda^2\), so \(D(\xi)=\lambda\). As for the Poisson process, we have

Mathematical characteristic of Poisson Process

(i) \(EN_t=\lambda t\), \(DN_t=\lambda t\).

(ii) for \(s<t\),

\[ \begin{align*} r_N(t,s)&=E(N_t N_s)\\ &=E[(N_t-N_s)N_s]+E(N_s^2)\\ &=E(N_{t-s})E(N_s)+E(N_s^2)\\ &=\lambda^2 s(t-s)+\lambda^2 s^2+\lambda s\\ &=\lambda^2 ts+\lambda s, \end{align*} \]

so for arbitrary \(s,t\), \(r_N(t,s)=\lambda \min\{t,s\}+\lambda^2 ts\).

(iii) for \(s,t\)

\[ \begin{align*} C_N(t,s)&=E(N_tN_s)-E(N_t)E(N_s)\\ &=\lambda\min\{t,s\}+\lambda^2 ts - \lambda^2 ts\\ &=\lambda\min\{t,s\}. \end{align*} \]
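These moment formulas can be sanity-checked by Monte Carlo. A sketch: paths are generated from i.i.d. \(\mathrm{Exp}(\lambda)\) gaps (the construction justified by the interarrival-time theorem below); \(\lambda=1.5\), \(s=1\), \(t=2\) and the sample size are arbitrary choices.

```python
import random

rng = random.Random(0)
lam, s, t = 1.5, 1.0, 2.0

def counts(lam, s, t, rng):
    """Return (N_s, N_t) for one sample path on [0, t]."""
    clock, ns, nt = 0.0, 0, 0
    while True:
        clock += rng.expovariate(lam)   # next interarrival gap
        if clock > t:
            return ns, nt
        nt += 1
        if clock <= s:
            ns += 1

pairs = [counts(lam, s, t, rng) for _ in range(20000)]
m_s = sum(a for a, _ in pairs) / len(pairs)
m_t = sum(b for _, b in pairs) / len(pairs)
cov = sum((a - m_s) * (b - m_t) for a, b in pairs) / len(pairs)
# Theory: E(N_s) = 1.5, E(N_t) = 3.0, C_N(t,s) = lam*min(t,s) = 1.5.
print(m_s, m_t, cov)
```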

Let \(S_n\) denote the arrival time of the \(n\)th "customer", and \(T_n\) the interval between the arrival times of the \((n-1)\)th and \(n\)th customers.

Process of Interarrival Times

For \(T_n\), we have the following theorem.

Theorem of interarrival times

The sequence of interarrival times \(\{T_n\}\) of a Poisson process is independent and identically distributed; in particular, each \(T_n\) follows the exponential distribution with mean \(1/\lambda\).

  • For \(T_1\), there is a simple argument. For \(t\leq 0\), \(P(T_1\leq t)=0\); for \(t>0\),
\[ P(T_1\leq t)=P(N_{[0,t)}\geq 1)=P(N_{[0,t]}\geq 1)=1-P(N_{[0,t]}=0)=1-e^{-\lambda t}, \]

so \(f_{T_1}(t)=\lambda e^{-\lambda t}\) for \(t>0\), i.e. \(T_1\) follows an exponential distribution.

  • For \(T_n\), we have two methods. The first one, is using the conditional probability to show that
\[ P(T_n\leq t \mid T_1=\tau_1, \cdots, T_{n-1}=\tau_{n-1}) \]

does not depend on \(\tau_i\), \(i=1,\cdots,n-1\), so that they are mutually independent. Actually, for \(t, s>0\),

\[ \begin{align*} P(T_n>t \mid S_{n-1}=s)&=P(N_{[0,s+t]}\leq n-1\mid N_{[0,s]}=n-1,\ N_{[0,s)}= n-2)\\ &=P(N_{(s,s+t]}=0)\quad\text{(independent increments)}\\ &=P(N_{[s,s+t]}=0)=e^{-\lambda t}. \end{align*} \]
  • The second method is to use induction. The logic is similar.

\(\square\)

Actually, the converse of the above theorem also holds.

Converse of the theorem of interarrival times

Assume \(\{T_n: n\geq 1\}\) is a sequence of interarrival times which are mutually independent and follow the exponential distribution with mean \(1/\lambda\), and let \(N_t\) denote the number of customers arriving during \([0,t]\), i.e.

\[ N_t:=\max\left\{n\geq 0: \sum_{i=1}^n T_i\leq t\right\} \]

(with the empty sum equal to \(0\), so \(N_t=0\) when \(T_1>t\)).

Then \(N_t\) is a Poisson process with parameter \(\lambda\).

In the proof we will use the theorem of arrival times below. Write \(S_n=\sum_{i=1}^n T_i\).

For \(t>0\), we use total probability formula in continuous form,

\[ \begin{align*} P(N_t=k)&=P(S_k\leq t<S_{k+1})\\ &=P(S_{k}\leq t,\ t< T_{k+1}+S_k)\\ &=\int_{0}^t P(t<T_{k+1}+s \mid S_{k}=s)f_{S_k}(s)ds\quad \text{(total probability, } S_k\in[0,t])\\ &=\int_{0}^tP(t-s<T_{k+1})f_{S_k}(s)ds \quad \text{(independence)}\\ &=\int_{0}^t e^{-\lambda (t-s)}\frac{\lambda^k s^{k-1}}{(k-1)!}e^{-\lambda s}ds\\ &=e^{-\lambda t}\frac{\lambda^k t^{k}}{k!}, \end{align*} \]

which is exactly the Poisson distribution with parameter \(\lambda t\).

Then we prove that \(N_t\) has independent and stationary increments. For \(s<t\),

\[ \begin{align*} P(N_t-N_s=n)&=\sum_{k=1}^\infty P(N_t-N_s=n, N_s=k)\\ &=\sum_{k=1}^\infty P(N_t=n+k, N_s=k) \end{align*} \]

and a computation similar to the one above shows this still follows the Poisson distribution, with parameter \(\lambda(t-s)\).

\(\square\)
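The construction in this theorem is easy to simulate: build \(N_t\) from i.i.d. exponential gaps and compare the empirical pmf with the Poisson pmf (a sketch; the values \(\lambda=2\), \(t=1.5\) and the sample size are arbitrary choices):

```python
import math, random

rng = random.Random(1)
lam, t, trials = 2.0, 1.5, 20000

def n_t(lam, t, rng):
    """N_t built from i.i.d. Exp(lam) interarrival gaps."""
    s, n = 0.0, 0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return n
        n += 1

freq = {}
for _ in range(trials):
    k = n_t(lam, t, rng)
    freq[k] = freq.get(k, 0) + 1

# Largest deviation of the empirical pmf from the Poisson(lam*t) pmf.
err = max(abs(freq.get(k, 0) / trials
              - math.exp(-lam * t) * (lam * t)**k / math.factorial(k))
          for k in range(10))
print(err)
```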

Process of Arrival Times

Theorem of arrival times

The sequence of arrival times \(\{S_n\}\) of a Poisson process satisfies \(S_n\sim \Gamma (n,\lambda)\), i.e.

\[ f_{S_n}(t)=\begin{cases} \displaystyle\frac{\lambda^n t^{n-1}}{\Gamma(n)}e^{-\lambda t}=\frac{\lambda^n t^{n-1}}{(n-1)!}e^{-\lambda t},\quad &t>0,\\ 0,\quad &t\leq 0. \end{cases} \]

Since \(S_n=\sum_{i=1}^n T_i\), where \(\{T_i\}\) are mutually independent and follow the exponential distribution with mean \(1/\lambda\), the characteristic function gives

\[ \phi_{S_n}(u)=E(e^{juS_n})=E(e^{ju\sum_i^n T_i})=\prod_{i=1}^n E(e^{ju T_i})=\prod_{i=1}^n \left(1-\frac{ju}{\lambda}\right)^{-1}=\left(1-\frac{ju}{\lambda}\right)^{-n} \]

which is the characteristic function of the \(\Gamma(n,\lambda)\) distribution.

\(\square\)

Alternatively, for \(t>0\) we consider

\[ \begin{align*} P(S_n\leq t)&=1-P(S_n>t)\\ &=1-P(N_t\leq n-1)\\ &=1-\sum_{i=0}^{n-1}P(N_t=i)\\ &=1-\sum_{i=0}^{n-1}\frac{e^{-\lambda t}(\lambda t)^{i}}{i!} \end{align*} \]

taking the derivative, we have

\[ \begin{align*} f_{S_n}(t)&=-e^{-\lambda t}\sum_{i=0}^{n-1}\left(\frac{-\lambda^{i+1}t^i}{i!}+\frac{\lambda^it^{i-1}}{(i-1)!}\right)\quad\text{(the }i=0\text{ term of the second fraction is taken to be }0) \\ &=e^{-\lambda t}\sum_{i=0}^{n-1}\left(\frac{\lambda^{i+1}t^i}{i!}-\frac{\lambda^it^{i-1}}{(i-1)!}\right)\\ &=e^{-\lambda t}\frac{\lambda^n t^{n-1}}{(n-1)!}\quad\text{(telescoping)}. \end{align*} \]

\(\square\)
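The identity \(P(S_n\leq t)=1-P(N_t\leq n-1)\) used above can be verified numerically against the \(\Gamma(n,\lambda)\) density (a sketch; \(\lambda=2\), \(n=3\), \(t=1.7\) are arbitrary choices):

```python
import math

lam, n, t = 2.0, 3, 1.7

# CDF via the Poisson-tail identity P(S_n <= t) = 1 - P(N_t <= n-1).
cdf_tail = 1 - sum(math.exp(-lam * t) * (lam * t)**i / math.factorial(i)
                   for i in range(n))

# CDF by integrating the Gamma(n, lam) density with the midpoint rule.
f = lambda s: lam**n * s**(n - 1) / math.factorial(n - 1) * math.exp(-lam * s)
m = 20000
cdf_quad = sum(f(t * (k + 0.5) / m) for k in range(m)) * t / m

print(cdf_tail, cdf_quad)  # the two values agree
```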

We often use the following conditional form of \(S_n\).

Conditional probability properties of arrival times

Assume \(N_t\) is a Poisson process with parameter \(\lambda\) and \(\{S_n\}\) is its sequence of arrival times. Then

(i) Looking forward. \(\forall n\), \(0<t_1<\cdots<t_n\), and integers \(0\leq k_1\leq \cdots \leq k_n\), we have

\[ \begin{align*} P&(N_{t_1}=k_1,\cdots, N_{t_n}=k_n)\\ &=P(N_{t_1}=k_1,N_{t_2}-N_{t_1}=k_2-k_1,\cdots, N_{t_n}-N_{t_{n-1}}=k_n-k_{n-1})\\ &=P(N_{t_1}=k_1)P(N_{t_2-t_1}=k_2-k_1)\cdots P(N_{t_n-t_{n-1}}=k_n-k_{n-1})\\ &=e^{-\lambda t_1}\frac{(\lambda t_1)^{k_1}}{k_1 !}\cdot \prod_{i=2}^n e^{-\lambda(t_i-t_{i-1})}\frac{[\lambda (t_i-t_{i-1})]^{k_i-k_{i-1}}}{(k_i-k_{i-1})!}. \end{align*} \]

(ii) Looking backward. \(\forall 0<x<t\) and \(1\leq j\leq n\), we have

\[ P(S_j<x \mid N_{t}=n)=\sum_{k=j}^nC_n^k \left(\frac{x}{t}\right)^k\left(1-\frac{x}{t}\right)^{n-k}. \]

Use the formula of conditional probability and apply the result of (i).

\[ \begin{align*} P(S_j<x\mid N(t)=n)&=\frac{P(S_j\leq x, N(t)=n)}{P(N(t)=n)}\\ &=P(N(x)\geq j, N(t)=n)/P(N(t)=n)\\ &=\sum_{k=j}^n e^{-\lambda x}\frac{(\lambda x)^k}{k!}\, e^{-\lambda(t-x)}\frac{[\lambda(t-x)]^{n-k}}{(n-k)!}\Big/ e^{-\lambda t}\frac{(\lambda t)^n}{n!}\\ &=\sum_{k=j}^n C_n^k \left(\frac{x}{t}\right)^k\left(1-\frac{x}{t}\right)^{n-k}. \end{align*} \]
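The binomial-sum formula can be checked against a direct simulation of uniform order statistics, anticipating the result of the next section that, given \(N_t=n\), the arrivals are distributed as sorted \(U(0,t)\) variables (a sketch; the parameter values are arbitrary choices):

```python
import math, random

rng = random.Random(1)
t, n, j, x = 2.0, 5, 2, 0.8

# Closed form: P(S_j < x | N_t = n) = sum_{k=j}^n C(n,k) p^k (1-p)^{n-k}.
p = x / t
closed = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(j, n + 1))

# Monte Carlo: conditioned on N_t = n, arrivals are order statistics of U(0,t).
trials = 50000
hits = sum(sorted(rng.uniform(0, t) for _ in range(n))[j - 1] < x
           for _ in range(trials))
print(closed, hits / trials)  # the two values agree
```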

Order Statistics

Assume random variables \(X_1,\cdots,X_n\) are mutually independent with the same distribution and density \(f\). Sorting them in increasing order,

\[ X_{(1)}\leq\cdots\leq X_{(n)}, \]

then random vector \((X_{(1)},\cdots,X_{(n)})\) has a density function

\[ g(x_1,\cdots,x_n)=\begin{cases} n! f(x_1)\cdots f(x_n),\quad &x_1<x_2<\cdots<x_n,\\ 0,\quad & \text{otherwise}. \end{cases} \]

By definition of density function.

\(\forall x_1<\cdots<x_n\), choose \(\varepsilon_i>0\) \((i=1,\cdots,n-1)\) such that \(x_i+\varepsilon_i<x_{i+1}\), and choose any \(\varepsilon_n>0\). Then

\[ \begin{align*} P&(x_i<X_{(i)}\leq x_i+\varepsilon_i, \forall i\in [1,n])\\ &=\sum_{\tau \text{ is a permutation} \atop\text{of }\{1,\cdots,n\}}P(x_i<X_{\tau_i}\leq x_i+\varepsilon_i, \forall i\in [1,n])\\ &=n! P(x_i<X_{i}\leq x_i+\varepsilon_i, \forall i\in [1,n])\\ &=n!\prod_{i=1}^n P(x_i<X_{i}\leq x_i+\varepsilon_i)\quad \text{by independence}\\ \Rightarrow \quad g(x_1,\cdots,x_n)&=\lim_{\max\limits_{1\leq i\leq n}\{\varepsilon_i\}\rightarrow 0} \frac{n!\prod\limits_{i=1}^n P(x_i<X_{i}\leq x_i+\varepsilon_i)}{\varepsilon_1\cdots\varepsilon_n}\\ &=n! \prod_{i=1}^n f(x_i). \end{align*} \]

Conditional distribution of the arrival-time vector

Assume \(N_t\) is a Poisson process with arrival times \(S_n\). \(\forall t>0\), \(n\in \mathbb{N}^+\),

\[ (S_1,\cdots,S_n \mid N_t=n)\overset{d}{=}(U_{(1)},\cdots,U_{(n)}) \]

where \(U_{(1)},\cdots,U_{(n)}\) are order statistics which are mutually independent and follow the same uniform distribution \(U(0,t)\).

\(\forall n\in \mathbb{N}^+\), \(t_1<s_1<t_2<s_2<\cdots<t_n<s_n\), we have

\[ \begin{align*} P&(S_1\in (t_1,s_1],\cdots,S_n\in (t_n,s_n] \mid N_t=n)\\ &=\frac{P(S_i\in (t_i,s_i]\ (\forall i\in[1,n]),\ N_t=n)}{P(N_t=n)}\\ &=\frac{P[N_{(t_i,s_i]}=1\ (\forall i\in[1,n]),\ N_{t_1}=0,\ N_{(s_i,t_{i+1}]}=0\ (\forall i\in [1,n-1]),\ N_{(s_{n}, t]}=0]}{P(N_t=n)}\\ &=\frac{\left\{\prod\limits_{i=1}^n P(N_{(t_i,s_i]}=1)\right\} P(N_{t_1}=0) \left\{ \prod\limits_{i=1}^{n-1}P(N_{(s_i,t_{i+1}]}=0)\right\} P(N_{(s_{n}, t]}=0) }{P(N_t=n)}\\ &=\frac{e^{-\lambda t} \prod\limits_{i=1}^n\lambda (s_i-t_i)}{e^{-\lambda t} \frac{(\lambda t)^n}{n!}}\\ &=\frac{n!\prod\limits_{i=1}^n(s_i-t_i)}{t^n}. \end{align*} \]

So we have density function

\[ g(x_1,\cdots,x_n\mid N_t=n)=\lim_{\max\limits_{1\leq i\leq n}\{s_i-t_i\}\rightarrow 0}\frac{\dfrac{n!\prod\limits_{i=1}^n(s_i-t_i)}{t^n}}{\prod\limits_{i=1}^n(s_i-t_i)}=\frac{n!}{t^n}, \]

which is exactly the distribution of order statistics of uniform distribution.

In other words, if we do not distinguish \(S_1,\cdots,S_n\), the \(n\) customers arrive at mutually independent times, each uniformly distributed on \((0,t)\).

Repair Model

\(N_t\) can be interpreted as a component-replacement process: \(N_t\) is the number of components replaced during \([0,t]\), \(T_n\) is the lifespan of the \(n\)th component, and \(\lambda\) is the average number of replacements per unit time.

Definition of the ages of components

For all \(t>0\) with \(t\neq S_n\), \(n=0,1,\cdots\) (where we set \(S_0=0\)), define

\[ \alpha_t=t-S_{N_t}, \quad \beta_t=S_{N_{t}+1}-t \]

to be the age of the component at time \(t\), and the rest age of that at time \(t\), respectively.

Properties of age

(i) \(\alpha_t\) follows

\[ F_{\alpha_t}(x)=\begin{cases} 0,\quad &x\leq 0,\\ 1-e^{-\lambda x},\quad & 0<x\leq t,\\ 1,\quad & x>t. \end{cases} \]

(ii) the distribution of \(\beta_t\) does not depend on \(t\): \(\beta_t\) follows the exponential distribution with mean \(1/\lambda\).

(iii) \(\alpha_t\) and \(\beta_t\) are independent.

Note that the following proof is for a fixed \(t\).

(i) For \(x\in (0,t]\), we have

\[ P(\alpha_t<x)=P(t-S_{N_{t}}<x)=P(S_{N_t}>t-x)=P(N_t-N_{t-x}>0)=1-e^{-\lambda x}. \]

(ii) By definition.

\[ P(\beta_t<x)=P(S_{N_{t}+1}<t+x)=P(N_{t+x}-N_t\geq 1)=1-e^{-\lambda x}. \]
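Both age properties can be checked by simulating paths up to a fixed time \(t\) (a sketch; \(\lambda=1.5\), \(t=4\) and the sample size are arbitrary choices):

```python
import random

rng = random.Random(2)
lam, t = 1.5, 4.0

def ages(lam, t, rng):
    """Return (alpha_t, beta_t): age and residual life at time t."""
    s = 0.0
    while True:
        nxt = s + rng.expovariate(lam)
        if nxt > t:
            return t - s, nxt - t   # last arrival before t, first after t
        s = nxt

samples = [ages(lam, t, rng) for _ in range(20000)]
beta_mean = sum(b for _, b in samples) / len(samples)        # theory: 1/lam
alpha_cdf = sum(a <= 1.0 for a, _ in samples) / len(samples) # theory: 1-e^{-lam}
print(beta_mean, alpha_cdf)
```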

Superposition & Decomposition

Superposition of Poisson Processes

Assume \(N_1(t)\) and \(N_2(t)\) are mutually independent Poisson processes, with parameter \(\lambda_1\) and \(\lambda_2\) respectively. Then process \(N(t)=N_1(t)+N_2(t)\) is also a Poisson process with parameter \(\lambda_1+\lambda_2\).

(i) \(N(0)=0\).

(ii) Independent increment property. Here \(\forall n\),

\[ \begin{cases} \text{Group 1: }N_1(t_2)-N_1(t_1),\cdots,N_1(t_n)-N_1(t_{n-1}),\\ \text{Group 2: }N_2(t_2)-N_2(t_1),\cdots,N_2(t_n)-N_2(t_{n-1}),\\ \end{cases} \]

the independence within groups 1 and 2 is apparent. Since \(N_1(t)\) and \(N_2(t)\) are mutually independent Poisson processes, groups 1 and 2 are also independent of each other. Hence all the random variables above are mutually independent, and so are the increments of \(N(t)\).

(iii) The increments follow the Poisson distribution. We have two methods. The first uses a property of the Poisson distribution: since \(N_1(t)-N_1(s)\) and \(N_2(t)-N_2(s)\) follow the Poisson distributions \(\pi (\lambda_1 (t-s))\) and \(\pi(\lambda_2(t-s))\) respectively, and they are mutually independent, the additivity of the Poisson distribution gives

\[ N(t)-N(s)=[N_1(t)-N_1(s)]+[N_2(t)-N_2(s)]\sim \pi((\lambda_1+\lambda_2)(t-s)) \]

also follows Poisson distribution.

The second method is to compute the distribution of \(N(t)-N(s)\) directly and verify that it is Poisson.
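A quick Monte Carlo check of the superposition theorem: the merged count should have mean and variance \((\lambda_1+\lambda_2)t\) (a sketch; the parameter values are arbitrary choices):

```python
import random

rng = random.Random(3)

def count(lam, t, rng):
    """N_t for a rate-lam Poisson process, from Exp(lam) gaps."""
    s, n = 0.0, 0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return n
        n += 1

lam1, lam2, t = 1.0, 2.5, 2.0
samples = [count(lam1, t, rng) + count(lam2, t, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((x - mean)**2 for x in samples) / len(samples)
print(mean, var)  # both should be close to (lam1+lam2)*t = 7
```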

Decomposition of Poisson Process

Assume \(\{N(t)\}\) is a Poisson process with parameter \(\lambda\), and each customer independently belongs to one of \(n\) types, type \(i\) occurring with probability \(p_i\) \((i=1,\cdots,n,\ \sum_{i=1}^n p_i=1)\). Denote by \(N_i(t)\) the number of arriving customers of type \(i\). Then each \(N_i(t)\) is also a Poisson process, with parameter \(p_i\lambda\), and the processes are mutually independent.
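The decomposition (thinning) can be checked the same way: splitting one process by independent coin flips should give type counts with means \(p_i\lambda t\) and zero covariance (a sketch; the parameter values are arbitrary choices):

```python
import random

rng = random.Random(4)
lam, t, p = 3.0, 2.0, 0.3

def thinned_counts(lam, t, p, rng):
    """Split the arrivals of one Poisson process by independent coin flips."""
    s, n1, n2 = 0.0, 0, 0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return n1, n2
        if rng.random() < p:
            n1 += 1
        else:
            n2 += 1

pairs = [thinned_counts(lam, t, p, rng) for _ in range(20000)]
m1 = sum(a for a, _ in pairs) / len(pairs)   # theory: p*lam*t = 1.8
m2 = sum(b for _, b in pairs) / len(pairs)   # theory: (1-p)*lam*t = 4.2
cov = sum((a - m1) * (b - m2) for a, b in pairs) / len(pairs)  # theory: 0
print(m1, m2, cov)
```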

Nonhomogeneous Poisson Process

Definition of Nonhomogeneous Poisson Process

A counting process is called a nonhomogeneous Poisson process, with an intensity of \(\lambda(t)\), if

(i) \(N(0)=0\).

(ii) \(N(t)\) is an independent increment process.

(iii) \(P(N(t+h)-N(t)=1)=\lambda(t) h+o(h)\).

(iv) \(P(N(t+h)-N(t)\geq 2)=o(h)\).

Theorem of Nonhomogeneous Poisson Process

A counting process is a nonhomogeneous Poisson process with intensity \(\lambda(t)\), if and only if

(i), (ii) the same as in the definition.

(iii) \(\forall 0\leq s<t\),

\[ N_t-N_s\sim \pi\left(\int_s^t \lambda(\tau)d\tau\right). \]

Clearly, a nonhomogeneous Poisson process does not have stationary increments, but it still has independent increments.

Time change

If \(\{Y(t)\}\) is a homogeneous Poisson process with parameter \(\lambda=1\) and \(\Lambda(t)=\int_0^t \lambda(\tau)d\tau\), then

\[ X(t)=Y(\Lambda(t)),\quad t\geq 0, \]

is a nonhomogeneous Poisson process, with intensity function \(\lambda(t)\).

Note that for \(s<t\),

\[ \begin{align*} X(t)-X(s)&=Y(\Lambda(t))-Y(\Lambda(s))\\ &\sim \pi \left[1\cdot(\Lambda(t)-\Lambda(s))\right]\\ &=\pi\left(\int_s^t \lambda(\tau)d\tau\right). \end{align*} \]
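The time change is easy to simulate. A sketch with the illustrative choice \(\lambda(\tau)=\tau\), so \(\Lambda(t)=t^2/2\) and \(E(X(t))=\Lambda(t)\):

```python
import random

rng = random.Random(5)

def unit_count(T, rng):
    """N(T) for a rate-1 Poisson process."""
    s, n = 0.0, 0
    while True:
        s += rng.expovariate(1.0)
        if s > T:
            return n
        n += 1

t = 2.0
Lam = t**2 / 2   # Λ(t) = ∫_0^t τ dτ for intensity λ(τ) = τ
samples = [unit_count(Lam, rng) for _ in range(20000)]  # X(t) = Y(Λ(t))
mean = sum(samples) / len(samples)
print(mean)  # should be close to Λ(t) = 2
```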

Compound Poisson Process

Definition of Compound Poisson process

Assume \(N_t\) is a Poisson process with parameter \(\lambda\). We call the following process \(Y(t)\) a compound Poisson process:

\[ Y(t):=\sum_{n=1}^{N(t)}X_n \]

where the \(\{X_n\}\), usually called jump sizes, follow the same distribution and are mutually independent of each other and of \(N(t)\).

Distribution of compound Poisson process

For \(0\leq s\leq t\), the characteristic function of \(Y(t)-Y(s)\) is

\[ \phi_{Y(t)-Y(s)}(u)=e^{\lambda (t-s)[\phi_{X_1}(u)-1]}. \]
\[ \begin{align*} \phi_{Y(t)-Y(s)}(u)&=E(e^{iu[Y(t)-Y(s)]})\\ \displaystyle &=E\left(e^{iu\sum\limits_{i=N(s)+1}^{N(t)} X_i}\right)\\ &=\sum_{n=0}^\infty E\left(e^{iu\sum\limits_{i=1}^{n} X_i}\right) P(N(t)-N(s)=n)\\ &=\sum_{n=0}^\infty [\phi_{X_1}(u)]^n \frac{e^{-\lambda(t-s)}[\lambda(t-s)]^n}{n!}\\ &=e^{\lambda(t-s)[\phi_{X_1}(u)-1]}. \end{align*} \]

\(\square\)

Properties of Compound Poisson Process

\(Y(t)\) has stationary and independent increments.

  • Independent increments: easy to see by definition. \(\forall n\), \(0<t_1<t_2<\cdots<t_n\),
\[ Y(t_2)-Y(t_1),\quad \cdots, \quad Y(t_n)-Y(t_{n-1}) \]

i.e.

\[ \sum_{i=N(t_1)+1}^{N(t_2)}X_i,\quad \cdots, \quad\sum_{i=N(t_{n-1})+1}^{N(t_n)}X_i \]

are mutually independent, because the increments of \(N\) are independent and the sums involve disjoint sets of the \(X_n\).

Moments of Compound Poisson Process

If \(E(X_1^2)<\infty\), denote \(\mu=E(X_1)\), \(\sigma^2=D(X_1)\), then

\[ E(Y(t))=\lambda t E(X_1)=\mu \lambda t, \quad D(Y(t))=\lambda t E(X_1^2)=(\mu^2+\sigma^2)\lambda t, \]

and for \(t,s\),

\[ R_Y(t,s)=E(X_1^2)\lambda \min(t,s)+E^2(X_1)\lambda^2ts,\quad C_Y(t,s)=E(X_1^2)\lambda \min(t,s). \]
\[ \begin{align*} E(Y(t))&=E\left(\sum_{n=1}^{N(t)}X_n\right)\\ &=E\left(E\left(\sum_{n=1}^{N(t)}X_n \mid N(t)\right)\right)\\ &=\sum_{n=0}^\infty E\left(\sum_{i=1}^{n}X_i \mid N(t)=n\right)P(N(t)=n)\\ &=\sum_{n=0}^\infty nE(X_1)\frac{e^{-\lambda t}(\lambda t)^n}{n!}\\ &=E(X_1)e^{-\lambda t} \lambda t \sum_{n=1}^\infty \frac{(\lambda t)^{n-1}}{(n-1)!}\\ &=E(X_1)\lambda t. \end{align*} \]

Similarly, \(E(Y^2(t))\) can be computed by conditioning on \(N(t)\); alternatively, we can use the conditional variance formula:

\[ \begin{align*} D(Y(t))&=E(D(Y(t)\mid N(t)))+D(E(Y(t)\mid N(t)))\\ &=\sum_{n=0}^\infty D\left(\sum_{i=1}^{n}X_i \mid N(t)=n\right)P(N(t)=n)+D(N(t)E(X_1))\\ &=\sum_{n=0}^\infty n\sigma^2P(N(t)=n) + E^2(X_1)\lambda t\\ &=\sigma^2\lambda t+\mu^2 \lambda t. \end{align*} \]

So

\[ \begin{align*} R_Y(t,s)&=E(Y(t)Y(s))\quad (s<t)\\ &=E\{[Y(t)-Y(s)]Y(s)\}+E(Y^2(s))\\ &=E(Y(t)-Y(s))E(Y(s))+D(Y(s))+E^2(Y(s))\\ &=\mu\lambda(t-s)\cdot\mu\lambda s+E(X_1^2)\lambda s+\mu^2\lambda^2 s^2\\ &=E(X_1^2)\lambda s+\mu^2\lambda^2 ts, \end{align*} \]

and \(C_Y(t,s)=R_Y(t,s)-E(Y(t))E(Y(s))=E(X_1^2)\lambda\min(t,s)\).
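These moment formulas can be sanity-checked by simulating a compound Poisson process; here the jump sizes are taken Gaussian purely for illustration:

```python
import random

rng = random.Random(6)
lam, t, mu, sigma = 2.0, 3.0, 1.0, 0.5

def compound(lam, t, mu, sigma, rng):
    """One sample of Y(t) with N(t) Poisson(lam*t) and X_n ~ N(mu, sigma^2)."""
    s, y = 0.0, 0.0
    while True:
        s += rng.expovariate(lam)
        if s > t:
            return y
        y += rng.gauss(mu, sigma)   # jump size

samples = [compound(lam, t, mu, sigma, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((y - mean)**2 for y in samples) / len(samples)
# Theory: E(Y(t)) = mu*lam*t = 6, D(Y(t)) = (mu^2+sigma^2)*lam*t = 7.5.
print(mean, var)
```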

Filtered Poisson Process

Definition of filtered Poisson process

We call a stochastic process \(\{Y(t): t\geq 0\}\) a filtered Poisson process if

\[ Y(t)=\sum_{n=1}^{N(t)}W(t, S_n, X_n), \quad t\geq 0. \]

where \(\{N(t): t\geq 0\}\) is a Poisson process with parameter \(\lambda\), \(S_n\) is the arrival time of the \(n\)th event, \(\{X_n: n\geq 1\}\) are mutually independent random variables that are also independent of \(N(t)\), and \(W(t, \tau, x)\) is a three-variable function, called the response function.

Usually \(X_n\) is the magnitude of a signal associated with the \(n\)th event, and \(W\) represents the response at time \(t\) to the signal of magnitude \(X_n\) that starts at \(S_n\).

Common form of response function

Writing \(W(t,\tau,x)=W_0(t-\tau,x)\), common choices include:

(i)

\[ W_0(s,x)=\begin{cases} 1,\quad &0<s<x,\\ 0,\quad &\text{others}. \end{cases} \]

(ii)

\[ W_0(s,x)=\begin{cases} x-s,\quad &0<s<x,\\ 0,\quad &\text{others}. \end{cases} \]

(iii)

\[ W_0(s,x)=\begin{cases} xW_1(x),\quad &0\leq s,\\ 0,\quad &s<0. \end{cases} \]
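With response (i), \(Y(t)\) counts the events whose effect has not yet expired at time \(t\). A simulation sketch with exponential marks \(X_n\sim\mathrm{Exp}(\mu)\) (an illustrative choice); by the standard Campbell-type formula for filtered Poisson processes, \(E(Y(t))=\lambda\int_0^t P(X>s)\,ds=\lambda(1-e^{-\mu t})/\mu\):

```python
import math, random

rng = random.Random(7)
lam, mu, t = 2.0, 1.0, 3.0

def filtered(lam, mu, t, rng):
    """Y(t) = sum over arrivals of W_0(t - S_n, X_n), response (i)."""
    s, y = 0.0, 0
    while True:
        s += rng.expovariate(lam)      # arrival time S_n
        if s > t:
            return y
        x = rng.expovariate(mu)        # mark X_n
        y += 0 < t - s < x             # indicator response W_0

samples = [filtered(lam, mu, t, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)
print(mean, lam * (1 - math.exp(-mu * t)) / mu)  # the two values agree
```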