Poisson Process¶
Definition of Poisson Process
Assume a stochastic process \(\{N_t: t\geq 0\}\) has state space \(S=\mathbb{N}\). If it satisfies
(i) \(N(0)=0\).
(ii) It has independent increments.
(iii) \(\forall s,t\geq 0\), \(N(s+t)-N(s) \sim \pi(\lambda t),\lambda>0\),
then we call \(\{N_t\}\) a Poisson process.
Equivalent Definition of Poisson Process
A stochastic process \(\{N_t: t\geq 0\}\) is a Poisson process iff it satisfies
(i) \(N(0)=0\).
(ii) It has independent increments.
(iii) \(\mathbb{P}(N(h+t)-N(t)=1)=\lambda h + o(h),\quad h>0\).
(iv) \(\mathbb{P}(N(h+t)-N(t)\geq 2)=o(h), \quad h>0\).
One direction is straightforward. Using \(e^{-\lambda h}=1-\lambda h+o(h)\), we have
\[ P(N(t+h)-N(t)=1)=\lambda h\,e^{-\lambda h}=\lambda h+o(h), \]
and
\[ P(N(t+h)-N(t)\geq 2)=1-e^{-\lambda h}-\lambda h\,e^{-\lambda h}=o(h). \]
The converse is a little harder to prove, for we have to show that \(N(t+h)-N(t)\) follows a Poisson distribution.
Notice the following theorem for continuous random variables.
Probability at a point equals zero
which means \(P(N_{[0,t)}=N_{[0,t]})=1\).
From now on, we focus on the counts over intervals without worrying about whether the endpoints are included.
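As a numerical sanity check of the equivalent definition, the sketch below simulates paths from i.i.d. exponential interarrival times and estimates \(P(N(t+h)-N(t)=1)\) for a small \(h\); the values of \(\lambda\), \(t\), \(h\) and the sample size are illustrative choices, not part of the notes.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, h, n_sim = 2.0, 1.0, 0.01, 200_000

# Build each path from i.i.d. Exp(mean 1/lam) interarrival times and
# count the events that land in (t, t+h].
gaps = rng.exponential(1 / lam, size=(n_sim, 50))
arrivals = np.cumsum(gaps, axis=1)               # S_1, S_2, ... per path
counts = ((arrivals > t) & (arrivals <= t + h)).sum(axis=1)

p_one = (counts == 1).mean()                     # estimate of P(exactly one event)
p_two_plus = (counts >= 2).mean()                # estimate of P(two or more events)
```

With \(\lambda h=0.02\) the estimate `p_one` should be close to \(\lambda h e^{-\lambda h}\approx \lambda h\), while two or more events in an interval of length \(h\) are rare, matching conditions (iii) and (iv).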
Mathematical Characteristics¶
Here we use the generating function to derive these characteristics. Recall that for a random variable \(\xi\) following a Poisson distribution with parameter \(\lambda\), we have
\[ g(s)=E(s^\xi)=\sum_{k=0}^{\infty}s^k\,\frac{\lambda^k}{k!}e^{-\lambda}=e^{\lambda(s-1)},\qquad |s|\leq 1. \]
Note that for the moments, we have
\[ g'(s)=E\big(\xi s^{\xi-1}\big)=\lambda e^{\lambda(s-1)},\qquad g''(s)=E\big(\xi(\xi-1)s^{\xi-2}\big)=\lambda^2 e^{\lambda(s-1)}. \]
Letting \(s\nearrow 1\) (Abel's theorem), we get \(E(\xi)=\lambda\), \(E(\xi^2-\xi)=\lambda^2\), so \(D(\xi)=\lambda\). As for the Poisson process, we have
Mathematical characteristics of the Poisson process
(i) \(EN_t=\lambda t\), \(DN_t=\lambda t\).
(ii) For \(s<t\),
\[ r_N(t,s)=E(N_sN_t)=E\big(N_s(N_t-N_s)\big)+E(N_s^2)=\lambda s\cdot \lambda(t-s)+\big(\lambda s+\lambda^2 s^2\big)=\lambda s+\lambda^2 st, \]
so for arbitrary \(s,t\), \(r_N(t,s)=\lambda \min\{t,s\}+\lambda^2 ts\).
(iii) for \(s,t\)
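The correlation formula \(r_N(t,s)=\lambda\min\{t,s\}+\lambda^2 ts\) can be checked by a quick Monte Carlo sketch; the parameters below are illustrative, and the increments are sampled directly as independent Poisson variables.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, s, t, n_sim = 1.5, 1.0, 2.0, 400_000

# N_s and N_t - N_s are independent Poisson increments.
n_s = rng.poisson(lam * s, n_sim)
n_t = n_s + rng.poisson(lam * (t - s), n_sim)

r_hat = np.mean(n_s * n_t)                       # empirical E(N_s N_t)
r_theory = lam * min(s, t) + lam**2 * s * t      # lambda*min + lambda^2*s*t
```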
Let \(S_n\) denote the arrival time of the \(n\)th "customer", and let \(T_n=S_n-S_{n-1}\) denote the time between the arrivals of the \((n-1)\)th and the \(n\)th customers.
Process of Interarrival Times¶
For \(T_n\), we have the following theorem.
Theorem of interarrival times
The interarrival times \(\{T_n\}\) of a Poisson process form an i.i.d. sequence; in particular, each \(T_n\) follows an exponential distribution with mean \(1/\lambda\).
- For \(T_1\) there is a simple argument. For \(t\leq 0\), \(P(T_1\leq t)=0\); for \(t>0\),
\[ P(T_1>t)=P(N_t=0)=e^{-\lambda t}, \]
so \(f_{T_1}(t)=\lambda e^{-\lambda t}\) for \(t>0\), i.e. \(T_1\) follows an exponential distribution.
- For \(T_n\), there are two methods. The first uses conditional probability to show that
\[ P(T_n>t\mid T_1=\tau_1,\cdots,T_{n-1}=\tau_{n-1})=e^{-\lambda t} \]
does not depend on \(\tau_i\), \(i=1,\cdots,n-1\), so that the \(T_n\) are mutually independent and identically distributed. Indeed, for \(t,s>0\),
- The second method is to use induction. The logic is similar.
\(\square\)
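A minimal simulation consistent with the theorem: approximate the process by Bernoulli trials on a fine grid, in the spirit of conditions (iii)-(iv) of the equivalent definition, and look at the resulting interarrival times. The grid step `dt` and horizon `T` are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, dt, T = 1.0, 1e-3, 2000.0

# An event occurs in each slot of length dt with probability lam*dt.
hits = rng.random(int(T / dt)) < lam * dt
times = np.flatnonzero(hits) * dt     # approximate arrival times
gaps = np.diff(times)                 # approximate interarrival times

mean_gap = gaps.mean()                            # should be near 1/lam
corr = np.corrcoef(gaps[:-1], gaps[1:])[0, 1]     # consecutive gaps ~ uncorrelated
```

The gap mean approaches \(1/\lambda\) and consecutive gaps show no correlation, consistent with an i.i.d. exponential sequence.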
The converse of the above theorem also holds.
Converse of the theorem of interarrival times
Assume \(\{T_n: n\geq 1\}\) is a sequence of interarrival times that are mutually independent and follow an exponential distribution with mean \(1/\lambda\), and let \(N_t\) denote the number of customers arriving during \([0,t]\), i.e.
\[ N_t=\sup\{n\geq 0: S_n\leq t\},\qquad S_n=\sum_{i=1}^n T_i,\ S_0=0; \]
then \(N_t\) is a Poisson process with parameter \(\lambda\).
We use the Theorem of arrival times, with \(S_n=\sum_{i=1}^n T_i\).
For \(t>0\), by the law of total probability in continuous form,
\[ P(N_t=n)=P(S_n\leq t,\ S_n+T_{n+1}>t)=\int_0^t \frac{\lambda^n s^{n-1}}{(n-1)!}e^{-\lambda s}\,e^{-\lambda(t-s)}\,ds=\frac{(\lambda t)^n}{n!}e^{-\lambda t}, \]
which is exactly the Poisson distribution with parameter \(\lambda t\).
Then we prove that \(N_t\) has independent and stationary increments. For \(t,s\),
still follows a Poisson distribution.
\(\square\)
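The converse can also be sketched numerically: build \(N_t\) from i.i.d. exponential interarrival times and compare the empirical law of \(N_t\) with the Poisson(\(\lambda t\)) pmf. All parameters are illustrative.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
lam, t, n_sim = 2.0, 3.0, 100_000

gaps = rng.exponential(1 / lam, size=(n_sim, 40))
s = np.cumsum(gaps, axis=1)
n_t = (s <= t).sum(axis=1)            # N_t = #{n : S_n <= t}

k = np.arange(15)
pmf_emp = np.array([(n_t == i).mean() for i in k])
pmf_th = np.array([exp(-lam * t) * (lam * t) ** i / factorial(i) for i in k])
max_err = np.abs(pmf_emp - pmf_th).max()
```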
Process of Arrival Times¶
Theorem of arrival times
The sequence of arrival times \(\{S_n\}\) of a Poisson process satisfies \(S_n\sim \Gamma(n,\lambda)\), i.e.
\[ f_{S_n}(t)=\frac{\lambda^n t^{n-1}}{(n-1)!}e^{-\lambda t},\qquad t\geq 0. \]
Since \(S_n=\sum_{i=1}^n T_i\) and the \(\{T_i\}\) are mutually independent and exponentially distributed with mean \(1/\lambda\), using characteristic functions we get
\[ \varphi_{S_n}(u)=\big(\varphi_{T_1}(u)\big)^n=\left(\frac{\lambda}{\lambda-iu}\right)^n, \]
which is the characteristic function of a \(\Gamma(n,\lambda)\) distribution.
\(\square\)
For \(t>0\), we consider
\[ P(S_n\leq t)=P(N_t\geq n)=\sum_{k=n}^\infty \frac{(\lambda t)^k}{k!}e^{-\lambda t}; \]
taking the derivative, we have
\[ f_{S_n}(t)=\sum_{k=n}^\infty\left(\frac{\lambda^k t^{k-1}}{(k-1)!}-\frac{\lambda^{k+1}t^{k}}{k!}\right)e^{-\lambda t}=\frac{\lambda^n t^{n-1}}{(n-1)!}e^{-\lambda t}. \]
\(\square\)
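A quick check of \(S_n\sim\Gamma(n,\lambda)\) via its moments \(E(S_n)=n/\lambda\) and \(D(S_n)=n/\lambda^2\), with illustrative \(n\) and \(\lambda\):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n, n_sim = 2.0, 5, 200_000

# S_n as the sum of n i.i.d. Exp(mean 1/lam) interarrival times.
s_n = rng.exponential(1 / lam, size=(n_sim, n)).sum(axis=1)

mean_hat, var_hat = s_n.mean(), s_n.var()   # should be near n/lam, n/lam^2
```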
We usually use the following conditional form of \(S_n\).
Properties of the conditional probability of arrival times
Assume \(N_t\) is a Poisson process with parameter \(\lambda\) and \(\{S_n\}\) is its sequence of arrival times; then
(i) Looking forward. \(\forall n\), \(0<t_1<\cdots<t_n\), and integers \(0\leq k_1\leq \cdots \leq k_n\), we have
(ii) Looking backward. \(\forall 0<x<t\) and \(1\leq j\leq n\), we have
Use the formula of conditional probability and apply the result of (i).
Order Statistics
Assume random variables \(X_1,\cdots,X_n\) are mutually independent and identically distributed with density \(f\). Sorting them in increasing order, we obtain
\[ X_{(1)}\leq X_{(2)}\leq \cdots\leq X_{(n)}; \]
then the random vector \((X_{(1)},\cdots,X_{(n)})\) has density function
\[ g(x_1,\cdots,x_n)=n!\prod_{i=1}^n f(x_i),\qquad x_1<x_2<\cdots<x_n, \]
and \(g=0\) otherwise.
By definition of density function.
\(\forall x_1<\cdots<x_n\), \(\exists \varepsilon_i>0\ (i=1,\cdots,n-1)\) such that \(x_i+\varepsilon_i<x_{i+1}\), and \(\exists \varepsilon_n>0\). Then
Conditional distribution of the arrival vector
Assume \(N_t\) is a Poisson process with arrival times \(S_n\). \(\forall t>0\), \(n\in \mathbb{N}^+\),
where \((U_{(1)},\cdots,U_{(n)})\) are the order statistics of \(n\) mutually independent random variables with the common uniform distribution \(U(0,t)\).
\(\forall n\in \mathbb{N}^+\), \(t_1<s_1<t_2<s_2<\cdots<t_n<s_n\), we have
So we have density function
which is exactly the density of the order statistics of the uniform distribution.
In other words, if we do not distinguish \(S_1,\cdots,S_n\), the \(n\) customers arrive at mutually independent times, each uniformly distributed on \((0,t)\).
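The conditional-uniform property can be probed by simulation: conditioning on \(N_t=n\), the \(j\)th arrival time should have the mean of the \(j\)th order statistic of \(n\) i.i.d. \(U(0,t)\) variables, namely \(jt/(n+1)\). The parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t, n, n_sim = 1.0, 4.0, 4, 300_000

gaps = rng.exponential(1 / lam, size=(n_sim, 30))
s = np.cumsum(gaps, axis=1)

# Keep only the paths with exactly n arrivals in [0, t]: S_n <= t < S_{n+1}.
mask = (s[:, n - 1] <= t) & (s[:, n] > t)
s1_mean = s[mask, 0].mean()      # E(S_1 | N_t = n), target t/(n+1)
s2_mean = s[mask, 1].mean()      # E(S_2 | N_t = n), target 2t/(n+1)
```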
Repair Model¶
\(N_t\) can be interpreted as a component-replacement counting process: \(N_t\) is the number of components replaced during \([0,t]\), \(T_n\) is the lifespan of the \(n\)th component, and \(\lambda\) is the average number of replacements per unit time.
Definition of the age of a component
For all \(t>0\) with \(t\neq S_n\), \(n=0,1,\cdots\) (where \(S_0=0\)), define
\[ \alpha_t=t-S_{N_t},\qquad \beta_t=S_{N_t+1}-t \]
to be the age of the component in service at time \(t\) and its residual life at time \(t\), respectively.
Properties of age
(i) \(\alpha_t\) follows a truncated exponential distribution:
\[ P(\alpha_t\leq x)=1-e^{-\lambda x},\ 0\leq x<t,\qquad P(\alpha_t=t)=e^{-\lambda t}. \]
(ii) The distribution of \(\beta_t\) does not depend on \(t\); it is exponential with mean \(1/\lambda\).
(iii) \(\alpha_t\) and \(\beta_t\) are independent.
Note that the following proof is for a fixed \(t\).
(i) For \(x\in [0,t)\), we have
\[ P(\alpha_t>x)=P\big(N(t)-N(t-x)=0\big)=e^{-\lambda x}, \]
while \(P(\alpha_t=t)=P(N(t)=0)=e^{-\lambda t}\).
(ii) By definition.
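A sketch of property (ii): the residual life \(\beta_t=S_{N_t+1}-t\) should have mean \(1/\lambda\) at every fixed \(t\), by memorylessness. Two illustrative values of \(t\) are checked.

```python
import numpy as np

rng = np.random.default_rng(6)
lam, n_sim = 2.0, 100_000
gaps = rng.exponential(1 / lam, size=(n_sim, 60))
s = np.cumsum(gaps, axis=1)             # arrival times S_1, S_2, ... per path

def residual(t):
    idx = (s <= t).sum(axis=1)          # N_t per path
    return s[np.arange(n_sim), idx] - t # s[:, N_t] is S_{N_t + 1}

beta_a = residual(1.0).mean()           # both means should be near 1/lam
beta_b = residual(5.0).mean()
```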
Superposition & Decomposition¶
Superposition of Poisson Processes
Assume \(N_1(t)\) and \(N_2(t)\) are mutually independent Poisson processes with parameters \(\lambda_1\) and \(\lambda_2\), respectively. Then \(N(t)=N_1(t)+N_2(t)\) is also a Poisson process, with parameter \(\lambda_1+\lambda_2\).
(i) \(N(0)=0\).
(ii) Independent increment property. Here \(\forall n\),
the independence within groups 1 and 2 is apparent; since \(N_1(t)\) and \(N_2(t)\) are mutually independent Poisson processes, groups 1 and 2 are also independent of each other. Hence all the above random variables are mutually independent.
(iii) Poisson increments. We have two methods. The first uses properties of the Poisson distribution: \(N_1(t)-N_1(s)\) and \(N_2(t)-N_2(s)\) follow \(\pi(\lambda_1(t-s))\) and \(\pi(\lambda_2(t-s))\) respectively and are mutually independent, so by the additivity of the Poisson distribution,
\[ N(t)-N(s)=\big(N_1(t)-N_1(s)\big)+\big(N_2(t)-N_2(s)\big)\sim \pi\big((\lambda_1+\lambda_2)(t-s)\big) \]
also follows a Poisson distribution.
The second method is to use the distribution function directly to show that \(N(t)-N(s)\) follows a Poisson distribution.
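Superposition can be illustrated by merging two simulated streams of arrival times: the interarrival times of the merged stream should have mean \(1/(\lambda_1+\lambda_2)\). The parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
lam1, lam2 = 1.0, 2.5

# Two independent streams of arrival times.
t1 = np.cumsum(rng.exponential(1 / lam1, 5000))
t2 = np.cumsum(rng.exponential(1 / lam2, 5000))
merged = np.sort(np.concatenate([t1, t2]))

# Restrict to the window where both streams are still producing events.
horizon = min(t1[-1], t2[-1])
gaps = np.diff(merged[merged <= horizon])
mean_gap = gaps.mean()                  # should be near 1/(lam1 + lam2)
```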
Decomposition of Poisson Processes
Assume \(\{N(t)\}\) is a Poisson process with parameter \(\lambda\), and each arriving customer is independently classified as one of \(n\) types, type \(i\) occurring with probability \(p_i\) (\(i=1,\cdots,n\), \(\sum_{i=1}^n p_i=1\)). Let \(N_i(t)\) denote the number of arrivals of type \(i\). Then each \(N_i(t)\) is also a Poisson process, with parameter \(p_i\lambda\), and the processes \(N_1(t),\cdots,N_n(t)\) are mutually independent.
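Decomposition (thinning) in a sketch: each event of a rate-\(\lambda\) process is labelled type 1 with probability \(p\), else type 2; the two counts over \([0,t]\) should have Poisson moments \(p\lambda t\) and \((1-p)\lambda t\) and be uncorrelated (indeed independent). The parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)
lam, p, t, n_sim = 3.0, 0.3, 1.0, 300_000

n = rng.poisson(lam * t, n_sim)          # total arrivals in [0, t]
n1 = rng.binomial(n, p)                  # type-1 events among the n arrivals
n2 = n - n1                              # type-2 events
corr = np.corrcoef(n1, n2)[0, 1]         # should be near 0
```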
Nonhomogeneous Poisson Process¶
Definition of Nonhomogeneous Poisson Process
A counting process is called a nonhomogeneous Poisson process with intensity \(\lambda(t)\) if
(i) \(N(0)=0\).
(ii) \(N(t)\) is an independent increment process.
(iii) \(P(N(t+h)-N(t)=1)=\lambda(t) h+o(h)\).
(iv) \(P(N(t+h)-N(t)\geq 2)=o(h)\).
Theorem of Nonhomogeneous Poisson Process
A counting process is a nonhomogeneous Poisson process with intensity \(\lambda(t)\) iff
(i), (ii) are the same as in the definition.
(iii) \(\forall 0\leq s<t\),
\[ N(t)-N(s)\sim \pi\left(\int_s^t \lambda(\tau)\,d\tau\right). \]
Clearly, a nonhomogeneous Poisson process does not have stationary increments, but it still has independent increments.
Time transformation
If \(\{Y(t)\}\) is a homogeneous Poisson process with parameter \(\lambda=1\) and \(\Lambda(t)=\int_0^t \lambda(\tau)\,d\tau\), then
\[ N(t)=Y(\Lambda(t)) \]
is a nonhomogeneous Poisson process with intensity function \(\lambda(t)\).
Note that for \(t,s\geq 0\),
\[ N(t+s)-N(t)=Y(\Lambda(t+s))-Y(\Lambda(t))\sim \pi\big(\Lambda(t+s)-\Lambda(t)\big)=\pi\left(\int_t^{t+s}\lambda(\tau)\,d\tau\right). \]
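The time transformation can be sketched with the illustrative intensity \(\lambda(t)=2t\), so that \(\Lambda(t)=t^2\): counting the arrivals of a unit-rate process up to \(\Lambda(t)\) gives \(E(N(t))=\Lambda(t)\), and increments of equal length no longer have equal means.

```python
import numpy as np

rng = np.random.default_rng(9)
t, n_sim = 1.5, 200_000
Lam = t ** 2                               # Lambda(t) = t^2 for lambda(t) = 2t

gaps = rng.exponential(1.0, size=(n_sim, 30))
s = np.cumsum(gaps, axis=1)                # arrivals of the unit-rate process Y
n_t = (s <= Lam).sum(axis=1)               # N(t) = Y(Lambda(t))
n_1 = (s <= 1.0).sum(axis=1)               # N(1), since Lambda(1) = 1

mean_t, mean_1 = n_t.mean(), n_1.mean()    # targets: t^2 and 1
```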
Compound Poisson Process¶
Definition of Compound Poisson Process
Assume \(N_t\) is a Poisson process with parameter \(\lambda\). We call \(Y(t)\) a compound Poisson process if
\[ Y(t)=\sum_{n=1}^{N(t)}X_n, \]
where the \(\{X_n\}\) are identically distributed, mutually independent, and independent of \(N(t)\); they are usually called the jump sizes.
Distribution of the compound Poisson process
For \(t\geq s\geq 0\), the characteristic function of \(Y(t)-Y(s)\) is
\[ \varphi_{Y(t)-Y(s)}(u)=\exp\big(\lambda (t-s)\big(\varphi_{X_1}(u)-1\big)\big). \]
\(\square\)
Properties of the Compound Poisson Process
\(Y(t)\) has stationary and independent increments.
- Independent increments. Easy to see from the definition: \(\forall n\), \(0<t_1<t_2<\cdots<t_n\),
i.e.
are mutually independent, because no two of the above random variables share a common \(X_n\).
- Stationary increments. Use the characteristic function: from the Distribution of the compound Poisson process above, the characteristic function of \(Y(t)-Y(s)\) depends only on \(t-s\).
Moments of the Compound Poisson Process
If \(E(X_1^2)<\infty\), denote \(\mu=E(X_1)\), \(\sigma^2=D(X_1)\); then
\[ E(Y(t))=\lambda\mu t,\qquad D(Y(t))=\lambda t\,E(X_1^2), \]
and for \(t\geq s\), conditioning on \(N(t)-N(s)\),
\[ E(Y(t)-Y(s))=E\big(N(t)-N(s)\big)E(X_1)=\lambda\mu(t-s). \]
Similarly,
\[ E\big((Y(t)-Y(s))^2\big)=\lambda(t-s)E(X_1^2)+\lambda^2(t-s)^2\mu^2, \]
or use the conditional variance formula,
\[ D(Y(t)-Y(s))=E\big(D(Y(t)-Y(s)\mid N)\big)+D\big(E(Y(t)-Y(s)\mid N)\big)=\lambda(t-s)\sigma^2+\lambda(t-s)\mu^2. \]
So
\[ D(Y(t)-Y(s))=\lambda(t-s)\big(\sigma^2+\mu^2\big)=\lambda(t-s)E(X_1^2). \]
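A numerical sketch of the compound Poisson moments \(E(Y(t))=\lambda\mu t\) and \(D(Y(t))=\lambda t\,E(X_1^2)\), with illustrative normally distributed jump sizes:

```python
import numpy as np

rng = np.random.default_rng(10)
lam, t, mu, sigma, n_sim = 2.0, 1.5, 1.0, 0.5, 200_000

n = rng.poisson(lam * t, n_sim)          # N(t) per path
# Sum the n[i] jumps of each path, vectorised with a cumulative-sum trick.
jumps = rng.normal(mu, sigma, n.sum())
ends = np.cumsum(n)
csum = np.concatenate([[0.0], np.cumsum(jumps)])
y = csum[ends] - csum[ends - n]          # Y(t) per path

mean_y, var_y = y.mean(), y.var()        # targets: lam*mu*t, lam*t*(mu^2+sigma^2)
```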
Filtered Poisson Process¶
Definition of filtered Poisson process
A stochastic process \(\{Y(t): t\geq 0\}\) is called a filtered Poisson process if
\[ Y(t)=\sum_{n=1}^{N(t)}W(t,S_n,X_n), \]
where \(\{N(t): t\geq 0\}\) is a Poisson process with parameter \(\lambda\), \(S_n\) is the arrival time of the \(n\)th event, \(\{X_n: n\geq 1\}\) are mutually independent random variables that are also independent of \(N(t)\), and \(W(t,\tau,x)\) is a three-variable function called the response function.
Usually \(X_n\) is the magnitude of a signal associated with the \(n\)th event, and \(W\) represents the response at time \(t\) to the signal of magnitude \(X_n\) that starts at \(S_n\).
Common form of response function
We denote \(W(t,\tau,x)=W_0(t-\tau,x)\), then
(i)
(ii)
(iii)
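As a closing sketch, take the illustrative response \(W_0(t-\tau,x)=x\,e^{-(t-\tau)}\) (an exponentially decaying shot-noise model, assumed here for illustration and not necessarily one of the forms above) with exponential magnitudes. Campbell's theorem then gives \(E(Y(t))=\lambda E(X_1)(1-e^{-t})\) and \(D(Y(t))=\tfrac{\lambda}{2}E(X_1^2)(1-e^{-2t})\).

```python
import numpy as np

rng = np.random.default_rng(11)
lam, t, mu, n_sim = 2.0, 3.0, 1.0, 100_000

s = np.cumsum(rng.exponential(1 / lam, size=(n_sim, 30)), axis=1)  # arrival times
x = rng.exponential(mu, size=(n_sim, 30))                          # magnitudes X_n
resp = x * np.exp(-(t - s)) * (s <= t)   # W_0(t - S_n, X_n), zero for S_n > t
y = resp.sum(axis=1)                     # Y(t) per path

mean_y = y.mean()
target = lam * mu * (1 - np.exp(-t))                      # Campbell mean
var_target = lam * (2 * mu ** 2) * (1 - np.exp(-2 * t)) / 2  # E(X_1^2) = 2*mu^2
```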