Processes in continuous time¶
The Poisson process¶
Definition (Poisson process) A Poisson process \(N_t\) is a stochastic process with a continuous index set \(\mathcal{I} = [0, \infty)\) which
Takes values in the range \(\{0, 1, 2, ... \}\),
Is initially zero \(N_0 = 0\),
Is nondecreasing \(s \leq t \implies N_s \leq N_t\),
Satisfies the independence relation \(0 \leq s < t \implies N_s \perp \!\!\! \perp N_t - N_s\).
For some number \(\lambda > 0\), called the arrival rate, it satisfies
\[\mathbb{P}(N_{t + h} - N_t = n) = \begin{cases} \lambda h + o(h) & \text{if } n = 1, \\ o(h) & \text{if } n > 1, \end{cases}\]
as \(h \downarrow 0\).
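The infinitesimal description above suggests a direct, if crude, simulation: split \([0, t]\) into many short intervals of length \(h\) and record an arrival in each with probability \(\lambda h\). A minimal sketch (function name and parameter values are illustrative, not from the text):

```python
import random

def simulate_poisson_path(lam, t_max, h=1e-3, seed=0):
    """Approximate a Poisson process path on [0, t_max] by flipping a
    Bernoulli(lam * h) coin in each interval of length h."""
    rng = random.Random(seed)
    count, path = 0, []
    for _ in range(int(t_max / h)):
        if rng.random() < lam * h:   # one arrival: probability lam*h + o(h)
            count += 1
        path.append(count)
    return path

path = simulate_poisson_path(lam=2.0, t_max=5.0)
print(path[-1])   # total arrivals by t = 5; the theoretical mean is lam*t = 10
```

The path is nondecreasing, starts at zero, and has independent increments by construction, matching the defining properties.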
Theorem (Marginal distribution of a Poisson process) Let \(N_t\) be a Poisson process with rate \(\lambda\). Then for any \(t > 0\)
\[\mathbb{P}(N_t = n) = \frac{(\lambda t)^n}{n!} e^{-\lambda t}, \quad n = 0, 1, 2, \dots\]
Below is the textbook proof for the above theorem.
Proof (a): Marginal distribution of a Poisson process
From the definition of a Poisson process, we have
\[\mathbb{P}(N_{t + h} = n) = \mathbb{P}(N_t = n)(1 - \lambda h) + \mathbb{P}(N_t = n - 1)\lambda h + o(h).\]
Using the notation \(p_k(t) = \mathbb{P}(N_t = k)\), we obtain the differential equation
\[p_n'(t) = \lambda p_{n - 1}(t) - \lambda p_n(t)\]
for \(n \geq 1\), while for \(n = 0\) we obtain
\[p_0'(t) = -\lambda p_0(t).\]
Considering the generating function of \(N_t\)
\[G(s, t) = \mathbb{E}\left(s^{N_t}\right) = \sum_{n = 0}^\infty s^n p_n(t),\]
multiplying the differential equation by \(s^n\) and taking sums over both sides for \(n \geq 1\), we obtain
\[\frac{\partial G(s, t)}{\partial t} - p_0'(t) = \lambda s G(s, t) - \lambda \left(G(s, t) - p_0(t)\right).\]
Adding each side of the equation \(p_0'(t) = - \lambda p_0(t)\) to each side of the above equation, we obtain
\[\frac{\partial G(s, t)}{\partial t} = \lambda s G(s, t) - \lambda G(s, t)\]
and arrive at the differential equation
\[\frac{\partial G(s, t)}{\partial t} = \lambda (s - 1) G(s, t).\]
Integrating the differential equation with respect to \(t\) we obtain
\[\log G(s, t) = \lambda (s - 1) t + A(s),\]
where \(A(s)\) is an arbitrary function of \(s\). The boundary condition
\[G(s, 0) = \mathbb{E}\left(s^{N_0}\right) = 1\]
implies \(\log G(s, 0) = 0 = A(s)\), so
\[G(s, t) = e^{\lambda t (s - 1)} = e^{-\lambda t} \sum_{n = 0}^\infty \frac{(\lambda t)^n}{n!} s^n.\]
This, together with the uniqueness theorem for probability generating functions, implies that
\[\mathbb{P}(N_t = n) = \frac{(\lambda t)^n}{n!} e^{-\lambda t}.\]
An alternative proof that avoids differential equations is given below.
Proof (b): Marginal distribution of a Poisson process
Consider the sequence of random variables \(Z_1(t), Z_2(t), \dots\) defined by
\[Z_N(t) = \sum_{n = 1}^N B_{N, n},\]
where \(B_{N, n}\) are i.i.d. Bernoulli-distributed random variables with parameter \(p = \lambda \frac{t}{N}\). The first part of this proof shows that the sequence \(Z_N(t)\) converges in distribution to \(N_t\), and the second shows that the limiting distribution is Poisson with parameter \(\lambda t\).
First, we show that the sequence \(Z_N(t)\) converges in distribution to \(N_t\). By the definition of \(Z_N(t)\), we have
\[\mathbb{P}(Z_N(t) = k) = \sum_{\mathcal{B}_N} \mathbb{P}\left(B_{N, 1} = b_{N, 1}, B_{N, 2} = b_{N, 2}, \dots, B_{N, N} = b_{N, N}\right),\]
where the sum is over \(\mathcal{B}_N\), the set of all binary \(N\)-tuples \(u_N = (b_{N, 1}, b_{N, 2}, \dots, b_{N, N})\) such that \(b_{N, n} \in \{0, 1\}\) and \(\sum_{n = 1}^N b_{N, n} = k\). Then from the definition of \(N_t\)
\[\mathbb{P}(N_t = k) = \lim_{N \to \infty} \sum_{\mathcal{C}_N} \mathbb{P}\left(N_{t/N} = c_{N, 1}, N_{2t/N} = c_{N, 2}, \dots, N_t = c_{N, N}\right),\]
where the sum is over \(\mathcal{C}_N\), the set of all \(N\)-tuples \(v_N = (c_{N, 1}, c_{N, 2}, \dots, c_{N, N})\) such that \(c_{N, 1} \in \{0, 1\}\), \(c_{N, N} = k\) and \(c_{N, n + 1} - c_{N, n} \in \{0, 1\}\) for \(1 \leq n \leq N - 1\). Noting the correspondence between the sets \(\mathcal{B}_N\) and \(\mathcal{C}_N\) and their elements, and taking the limit \(N \to \infty\), we obtain
\[\lim_{N \to \infty} \mathbb{P}(Z_N(t) = k) = \mathbb{P}(N_t = k).\]
We now show that \(Z(t) = \lim_{N \to \infty} Z_N(t)\) is Poisson-distributed. The characteristic function of \(B_{N, n}\) is
\[\phi_{B_{N, n}}(u) = \mathbb{E}\left(e^{iuB_{N, n}}\right) = 1 + \frac{\lambda t}{N}\left(e^{iu} - 1\right).\]
Using the independence of the \(B_{N, n}\) variables and the factorisation property of the characteristic function, we obtain
\[\phi_{Z_N(t)}(u) = \left(1 + \frac{\lambda t}{N}\left(e^{iu} - 1\right)\right)^N.\]
As \(N \to \infty\), the sequence of characteristic functions converges to the limit
\[\lim_{N \to \infty} \phi_{Z_N(t)}(u) = e^{\lambda t \left(e^{iu} - 1\right)},\]
which is the characteristic function of a random variable \(Z(t)\) that is Poisson-distributed with parameter \(\lambda t\), so \(N_t\) is also Poisson-distributed with parameter \(\lambda t\).
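The Bernoulli-sum construction in proof (b) can be checked numerically: \(Z_N(t)\) is binomially distributed with parameters \(N\) and \(\lambda t / N\), and its pmf approaches the Poisson(\(\lambda t\)) pmf as \(N\) grows. A small sketch (the rate and time are arbitrary illustrative values):

```python
from math import comb, exp, factorial

def binom_pmf(N, p, k):
    """P(Z_N(t) = k): Z_N(t) is a sum of N independent Bernoulli(p) variables."""
    return comb(N, k) * p**k * (1 - p)**(N - k)

def poisson_pmf(mu, k):
    """Probability mass function of the limiting Poisson(mu) distribution."""
    return exp(-mu) * mu**k / factorial(k)

lam, t = 1.5, 2.0        # illustrative rate and time, so mu = lam * t = 3
mu = lam * t
for N in (10, 100, 1000):
    err = max(abs(binom_pmf(N, mu / N, k) - poisson_pmf(mu, k))
              for k in range(min(N, 40) + 1))
    print(N, err)        # worst-case pmf discrepancy shrinks as N grows
```

The decreasing discrepancy illustrates the convergence in distribution established by the characteristic-function argument.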
Arrival and inter-arrival times¶
Definition (Arrival and inter-arrival times of the Poisson process) Let \(N_t\) be a Poisson process with rate \(\lambda\). The arrival times \(T_0, T_1, \dots\) of \(N_t\) are defined as \(T_0 = 0\) and
\[T_n = \inf\{t : N_t = n\}, \quad n = 1, 2, \dots\]
The inter-arrival times \(X_1, X_2, \dots\) of the process are defined as
\[X_n = T_n - T_{n - 1}, \quad n = 1, 2, \dots\]
Theorem (Inter-arrival times of the Poisson process) Let \(N_t\) be a Poisson process with rate \(\lambda\). The inter-arrival times \(X_1, X_2, ...\) are independent, each having the exponential distribution with parameter \(\lambda\).
Below is an adapted version of the proof in Grimmett and Stirzaker, Probability and Random Processes.[GS01]
Proof: Inter-arrival times of the Poisson process
Let \(N_t\) be a Poisson process with parameter \(\lambda\) and inter-arrival times \(X_1, X_2, \dots\). The event \(\{X_1 > t\}\) occurs if and only if there are no arrivals in \([0, t]\), so it has probability
\[\mathbb{P}(X_1 > t) = \mathbb{P}(N_t = 0) = e^{-\lambda t},\]
and \(X_1\) is therefore exponentially distributed with parameter \(\lambda\). Now considering \(X_2\), we have
\[\begin{aligned} \mathbb{P}(X_2 > t \mid X_1 = t_1) &= \mathbb{P}(\text{no arrivals in } (t_1, t_1 + t] \mid X_1 = t_1) \\ &= \mathbb{P}(\text{no arrivals in } (t_1, t_1 + t]) \\ &= e^{-\lambda t}, \end{aligned}\]
where to go from the first to the second line we used the following fact. The event \(\{X_1 = t_1\}\) is determined by the number of arrivals occurring during \([0, t_1]\), whereas the event \(\{\text{no arrivals in } (t_1, t_1 + t]\}\) is determined by the number of arrivals during \((t_1, t_1 + t]\). By the independence property of the Poisson process, the numbers of arrivals during these two intervals are independent, and therefore the two events are also independent. Hence \(X_1\) and \(X_2\) are independent, and \(X_2\) is also exponentially distributed with parameter \(\lambda\). Proceeding recursively yields the result.
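Conversely, the theorem suggests the standard way to generate a Poisson process: draw i.i.d. \(\text{Exp}(\lambda)\) gaps and count how many arrivals land in \([0, t]\). A sketch using inverse-transform sampling (function name and parameter values are illustrative):

```python
import random
from math import log

def count_arrivals(lam, t, rng):
    """Count arrivals in [0, t] when inter-arrival times are i.i.d. Exp(lam)."""
    arrivals, clock = 0, 0.0
    while True:
        # inverse-transform sample of an Exp(lam) inter-arrival time;
        # 1 - rng.random() lies in (0, 1], so the log is always finite
        clock += -log(1.0 - rng.random()) / lam
        if clock > t:
            return arrivals
        arrivals += 1

rng = random.Random(1)
counts = [count_arrivals(lam=0.5, t=10.0, rng=rng) for _ in range(10000)]
print(sum(counts) / len(counts))   # should be close to lam * t = 5
```

By the theorem, the counts produced this way have the Poisson(\(\lambda t\)) marginal distribution, so their sample mean should be close to \(\lambda t\).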
Lack of memory property¶
Definition (Lack-of-memory property) A positive random variable \(X\) is said to have the lack-of-memory property if
\[\mathbb{P}(X > s + t \mid X > s) = \mathbb{P}(X > t) \quad \text{for all } s, t \geq 0.\]
Theorem (Lack-of-memory \(\iff\) exponentially distributed) The continuous random variable \(X\) has the lack-of-memory property if and only if it is exponentially distributed.
Proof: Lack-of-memory \(\iff\) exponentially distributed
To see that the exponential distribution has the lack-of-memory property, let \(X\) be exponentially distributed with parameter \(\lambda\) and consider
\[\mathbb{P}(X > s + t \mid X > s) = \frac{\mathbb{P}(X > s + t)}{\mathbb{P}(X > s)} = \frac{e^{-\lambda (s + t)}}{e^{-\lambda s}} = e^{-\lambda t} = \mathbb{P}(X > t).\]
To show that a continuous random variable \(X\) which has the lack-of-memory property must be exponentially distributed, consider
\[\mathbb{P}(X > s + t) = \mathbb{P}(X > s + t \mid X > s)\,\mathbb{P}(X > s) = \mathbb{P}(X > t)\,\mathbb{P}(X > s).\]
Therefore, the continuous function \(G(\cdot) = \mathbb{P}(X > \cdot)\) satisfies
\[G(s + t) = G(s) G(t).\]
It follows that for any positive integer \(n\)
\[G(n) = G(1)^n,\]
and similarly, for any rational \(u = \frac{a}{b}\) where \(a, b\) are positive integers, we have
\[G(u)^b = G\left(\tfrac{a}{b}\right)^b = G(a) = G(1)^a,\]
from which it follows that \(G(u) = G(1)^u\). This relation holds for all rational \(u\), and by the continuity of \(G\) it also holds for all real numbers. Finally, since \(G\) is monotonically decreasing, it must be the case that \(0 < G(1) < 1\), so defining \(\lambda = - \log G(1)\) we arrive at
\[G(u) = \mathbb{P}(X > u) = e^{-\lambda u},\]
so \(X\) must be exponentially distributed with some parameter \(\lambda > 0\).
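The forward direction of the proof can be sanity-checked numerically using the exponential survival function (the rate and the times \(s, t\) below are arbitrary illustrative values):

```python
from math import exp

lam = 0.7                      # illustrative rate

def survival(x):
    """G(x) = P(X > x) for X ~ Exp(lam)."""
    return exp(-lam * x)

s, t = 2.0, 3.0
conditional = survival(s + t) / survival(s)   # P(X > s + t | X > s)
print(conditional, survival(t))               # the two agree: no memory of s
```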
Simple birth process¶
Definition (Simple birth process) A simple birth process, \(M_t\), is a stochastic process describing the evolution of a population, such that
The initial population is an integer \(I\), \(M_0 = I\),
There exists a number \(\lambda > 0\), called the birth rate, such that during the interval \((t, t + h]\) each organism of the population has probability \(\lambda h + o(h)\) of dividing into two organisms, and probability \(1 - \lambda h + o(h)\) of not dividing,
For each organism at time \(t\), its future divisions are independent of its past divisions, as well as the divisions of all other organisms present at time \(t\).
Theorem (Population distribution of the simple birth process) Let \(M_t\) be a simple birth process with rate \(\lambda\), and initial population \(M_0 = I\). Then
\[\mathbb{P}(M_t = k) = \binom{k - 1}{I - 1} e^{-I \lambda t} \left(1 - e^{-\lambda t}\right)^{k - I}, \quad k = I, I + 1, \dots\]
Proof: Population distribution of the simple birth process
Let \(p_k(t) = \mathbb{P}(M_t = k)\) and consider
\[p_k(t + h) = p_{k - 1}(t)(k - 1)\lambda h + p_k(t)(1 - k \lambda h) + o(h).\]
Subtracting \(p_k(t)\) from both sides, dividing by \(h\) and taking the \(h \to 0\) limit, we obtain
\[p_k'(t) = (k - 1)\lambda p_{k - 1}(t) - k \lambda p_k(t),\]
with \(p_{I - 1}(t) = 0\) and boundary conditions
\[p_I(0) = 1, \qquad p_k(0) = 0 \text{ for } k > I.\]
Therefore for \(k = I\)
\[p_I'(t) = -I \lambda p_I(t) \implies p_I(t) = e^{-I \lambda t},\]
and solving the differential equations successively for \(k = I + 1, I + 2, \dots\) yields the result.
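Since the minimum of \(k\) independent \(\text{Exp}(\lambda)\) clocks is \(\text{Exp}(k\lambda)\)-distributed, the simple birth process can be simulated by drawing successive division times, and the sample mean of \(M_t\) compared with the theoretical mean \(I e^{\lambda t}\) implied by the distribution above. A minimal sketch (function name and parameter values are illustrative):

```python
import random
from math import exp, log

def birth_population(lam, t, I, rng):
    """Population of a simple birth process at time t, started from I organisms."""
    pop, clock = I, 0.0
    while True:
        # with pop organisms, the time to the next division is Exp(lam * pop)
        clock += -log(1.0 - rng.random()) / (lam * pop)
        if clock > t:
            return pop
        pop += 1

rng = random.Random(2)
lam, t, I = 0.5, 2.0, 3
samples = [birth_population(lam, t, I, rng) for _ in range(5000)]
print(sum(samples) / len(samples), I * exp(lam * t))   # sample mean vs I*e^(lam*t)
```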
Birth and death process¶
Definition (Birth and death process) A birth and death process, \(L_t\), is a stochastic process describing the evolution of a population, such that
The initial population is an integer \(I\), \(L_0 = I\),
There exists a number \(\lambda > 0\), called the birth rate, such that during the interval \((t, t + h]\) each organism of the population has probability \(\lambda h + o(h)\) of dividing once into two organisms, and probability \(o(h)\) of dividing more than once,
There exists a number \(\mu > 0\) called the death rate, such that during the interval \((t, t + h]\) each organism has a probability \(\mu h + o(h)\) of dying,
For each organism at time \(t\), its future activity (divisions or death) is independent of its past activity, as well as the activities of all other organisms present at time \(t\).
Theorem (PGF of birth and death process) Let \(L_t\) be a birth and death process with birth rate \(\lambda\) and death rate \(\mu > 0\), and initial population \(L_0 = I\). Then for \(\lambda \neq \mu\) its pgf is
\[G(s, t) = \mathbb{E}\left(s^{L_t}\right) = \left(\frac{\mu(1 - s) - (\mu - \lambda s)e^{-(\lambda - \mu)t}}{\lambda(1 - s) - (\mu - \lambda s)e^{-(\lambda - \mu)t}}\right)^I.\]
Theorem (Extinction of birth and death process) Let \(L_t\) be a birth and death process with birth rate \(\lambda\) and death rate \(\mu > 0\), and initial population \(L_0 = I\). The probability \(e(t) = \mathbb{P}(L_t = 0)\) that the population is extinct by time \(t\) satisfies
\[\lim_{t \to \infty} e(t) = \begin{cases} 1 & \text{if } \lambda \leq \mu, \\ \left(\frac{\mu}{\lambda}\right)^I & \text{if } \lambda > \mu. \end{cases}\]
Proof: Extinction of birth and death process
Let \(L_t\) be a birth and death process with birth rate \(\lambda\) and death rate \(\mu > 0\), and initial population \(L_0 = I\). If the pgf at time \(t\) is \(G(s, t)\), then \(G(0, t) = \mathbb{P}(L_t = 0)\) and, for \(\lambda \neq \mu\),
\[e(t) = G(0, t) = \left(\frac{\mu - \mu e^{-(\lambda - \mu)t}}{\lambda - \mu e^{-(\lambda - \mu)t}}\right)^I.\]
Letting \(t \to \infty\) we arrive at the result.
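The limiting extinction probability depends only on the embedded jump chain: each event is a birth with probability \(\lambda / (\lambda + \mu)\) and a death otherwise, and populations that grow large essentially never die out, so reaching a high ceiling can be treated as survival. A sketch under these assumptions (function name, ceiling and parameter values are illustrative):

```python
import random

def goes_extinct(lam, mu, I, rng, ceiling=500):
    """Simulate one birth-death jump chain; True if the population hits zero.
    Reaching the ceiling is treated as permanent survival, since the
    extinction probability from a large population is negligible."""
    pop = I
    while 0 < pop < ceiling:
        # next event is a birth w.p. lam/(lam+mu), otherwise a death
        if rng.random() < lam / (lam + mu):
            pop += 1
        else:
            pop -= 1
    return pop == 0

rng = random.Random(3)
lam, mu, I = 1.0, 0.5, 2
trials = 2000
estimate = sum(goes_extinct(lam, mu, I, rng) for _ in range(trials)) / trials
print(estimate, (mu / lam) ** I)   # estimate vs theoretical (mu/lam)^I = 0.25
```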
First come, first served queue¶
Definition (First come, first served queue) A first come, first served queue \(Q_t\), is a stochastic process describing the evolution of a system where customers arrive, are queued and served, such that
The initial number of customers in the queue is an integer \(I\), \(Q_0 = I\),
Customers arrive in the manner of a Poisson process with rate \(\lambda > 0\).
The time taken to serve each customer is exponentially distributed with parameter \(\mu > 0\), and service times are independent of each other.
The inter-arrival times and service times are all independent variables.
Theorem (Steady state of the first come, first served queue) Let \(Q_t\) be a first come, first served queue with arrival rate \(\lambda\) and service rate \(\mu > 0\). If \(\lambda < \mu\), the queue has a unique steady-state distribution given by
\[\pi_k = (1 - \rho)\rho^k, \quad \text{where } \rho = \frac{\lambda}{\mu},\]
independent of \(Q_0\). If \(\lambda \geq \mu\), there is no steady-state distribution.
Proof: Steady state of the first come, first served queue
Let \(Q_t\) be a first come, first served queue with arrival rate \(\lambda\) and service rate \(\mu > 0\), and write \(p_k(t) = \mathbb{P}(Q_t = k)\). Then
\[p_k'(t) = \lambda p_{k - 1}(t) + \mu p_{k + 1}(t) - (\lambda + \mu) p_k(t) \quad \text{for } k \geq 1,\]
\[p_0'(t) = \mu p_1(t) - \lambda p_0(t).\]
Now assume there exists a steady state
\[\pi_k = \lim_{t \to \infty} p_k(t);\]
it will satisfy \(p_k'(t) = 0\), so
\[0 = \lambda \pi_{k - 1} + \mu \pi_{k + 1} - (\lambda + \mu) \pi_k \quad \text{for } k \geq 1,\]
\[0 = \mu \pi_1 - \lambda \pi_0.\]
Therefore \(\pi_1 = \rho \pi_0\) where \(\rho = \frac{\lambda}{\mu}\). Continuing recursively, the balance equations give
\[\mu \pi_{k + 1} - \lambda \pi_k = \mu \pi_k - \lambda \pi_{k - 1} = \dots = \mu \pi_1 - \lambda \pi_0 = 0,\]
arriving at \(\pi_k = \rho^k \pi_0\). If \(\rho < 1\) then normalisation requires
\[\sum_{k = 0}^\infty \pi_k = \pi_0 \sum_{k = 0}^\infty \rho^k = \frac{\pi_0}{1 - \rho} = 1,\]
arriving at the result
\[\pi_k = (1 - \rho)\rho^k.\]
On the other hand, if \(\rho \geq 1\) then \(\sum_k \rho^k\) diverges, so \(\pi_k = \rho^k \pi_0\) cannot be a normalised distribution, contradicting our assumption that a steady state exists.
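The steady state can be checked by simulating the queue for a long time and recording the fraction of time spent in each state, which should approach \((1 - \rho)\rho^k\). A minimal sketch (function name, rates and time horizon are illustrative):

```python
import random
from math import log

def mm1_state_fractions(lam, mu, t_max, rng, k_max=30):
    """Fraction of time the queue spends in each state over [0, t_max]."""
    q, clock = 0, 0.0
    time_in = [0.0] * (k_max + 1)
    while clock < t_max:
        rate = lam + (mu if q > 0 else 0.0)        # total event rate in state q
        hold = -log(1.0 - rng.random()) / rate     # Exp(rate) holding time
        time_in[min(q, k_max)] += min(hold, t_max - clock)
        clock += hold
        if q == 0 or rng.random() < lam / rate:
            q += 1                                 # arrival
        else:
            q -= 1                                 # service completion
    return [x / t_max for x in time_in]

rng = random.Random(4)
lam, mu = 0.5, 1.0
rho = lam / mu
fractions = mm1_state_fractions(lam, mu, t_max=100000.0, rng=rng)
print(fractions[0], 1 - rho)   # time fraction in state 0 vs theoretical 1 - rho
```

The long-run time fractions approximate \(\pi_k\) because the queue, viewed as a continuous-time Markov chain with \(\rho < 1\), is ergodic.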