Number of events in a fixed time period

In a homogeneous Poisson process with events that occur at a rate of \(\lambda\) per unit time, we now define a discrete random variable, \(X\), to be the number of events that occur in one unit of time.

If the unit of time is split into infinitesimally small periods, each of length \(\delta t\), the requirements of a homogeneous Poisson process mean that (see the simulation sketch after this list)

  1. No more than one event can occur in each period.
  2. The occurrences of events in different small periods are independent.
  3. The probability of an event in any such period is \(\lambda \times \delta t\).
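These three conditions also suggest a direct way to approximate the process numerically. The following is a minimal sketch (assuming Python with NumPy is available; the function name `simulate_unit_time` is purely illustrative) that splits unit time into many small periods and generates an independent Bernoulli event in each.

```python
import numpy as np

def simulate_unit_time(lam, n_periods=10_000, rng=None):
    """Approximate the number of events in one unit of time for a
    homogeneous Poisson process with rate `lam` per unit time."""
    rng = np.random.default_rng() if rng is None else rng
    dt = 1.0 / n_periods                      # length of each small period
    # Condition 3: an event occurs in each period with probability lam * dt.
    # Conditions 1 and 2: at most one event per period, and the periods
    # are generated independently of each other.
    events = rng.random(n_periods) < lam * dt
    return int(events.sum())

counts = [simulate_unit_time(lam=3) for _ in range(5_000)]
print(sum(counts) / len(counts))   # close to 3, the rate per unit time
```

With \(\lambda = 3\), the average count over many simulated unit-time periods should be close to 3.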

Binomial approximation

To derive the probability function of \(X\), we start with a situation in which the above three conditions hold, but with periods of non-negligible length. If unit time is split into \(n\) intervals, each of length \(\frac 1 n\), the three conditions mean that the number of events is the number of successes in a series of \(n\) independent Bernoulli trials, each with probability \(\pi = \frac {\lambda} n\) of success, so

\[ X \;\sim\; \BinomDistn \left(n, \pi = \frac {\lambda} n\right) \]

The probability function for the number of events in a homogeneous Poisson process can be found as the limit of this, as \(n \to \infty\).

Process with λ = 3 events per unit time

The diagram below initially splits the unit time period into 5 periods of length \(\frac 1 5\), allowing at most a single event in each, with probability \(\frac 3 5\) of an event in each period. The total number of events therefore has a \(\BinomDistn(n=5, \pi=\frac 3 5) \) distribution, as shown in the bar chart at the bottom. The expected value of this variable is 3, corresponding to a rate of 3 events per unit time.

Use the pop-up menu to split unit time into 10 periods. Halving the length of each period means that its probability of an event is also halved to \(\frac 3 {10}\), giving a \(\BinomDistn(n=10, \pi=\frac 3 {10}) \) distribution for the total number of events.

With a finer splitting of unit time into 160 periods, the probability of an event in each of these periods would be \(\frac 3 {160}\), giving a \(\BinomDistn(n=160, \pi=\frac 3 {160}) \) distribution for the total number of events. The expected number of events remains at 3, corresponding to a rate of \(\lambda = 3\) events per unit time.

The distribution of \(X\), the number of events in a Poisson process, is the limit of this as the unit time period is split into finer and finer intervals, allowing events to occur arbitrarily close in time, but not simultaneously.

\[ X \;\; \sim \; \; \lim_{k \to \infty} \BinomDistn\left(k,\; \frac 3 k\right) \]
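To see this convergence numerically, here is a short sketch (assuming Python with SciPy is available) that tabulates the \(\BinomDistn\left(k, \frac 3 k\right)\) probabilities for the three splits above, alongside the probabilities of the limiting distribution derived in the next subsection.

```python
from scipy import stats

lam = 3
for k in (5, 10, 160):
    binom = stats.binom(k, lam / k)
    print(f"k = {k:3d}:", [round(binom.pmf(x), 4) for x in range(7)])

# Probabilities from the limiting distribution derived below
poisson = stats.poisson(lam)
print("limit: ", [round(poisson.pmf(x), 4) for x in range(7)])
```

The binomial probabilities for \(k = 160\) are already very close to the limiting values.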

We now find the probability function for the number of events in a Poisson process, \(X\), as the limit of the probability functions of these binomial distributions.

Poisson probability function

The number of events in unit time in a Poisson process with rate \(\lambda\) per unit time has probability function

\[ p(x) \;\;=\;\; \frac {\lambda^x e^{-\lambda}} {x!} \quad\quad \text{ for } x=0, 1, \dots \]
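As a quick numerical illustration of this formula (a sketch assuming Python; it is not part of the derivation that follows), the probabilities for \(\lambda = 3\) can be evaluated directly, and their mean recovers the rate of the process.

```python
from math import exp, factorial

def poisson_prob(x, lam):
    """p(x) = lam^x * exp(-lam) / x! : probability of x events in unit time."""
    return lam ** x * exp(-lam) / factorial(x)

lam = 3
for x in range(7):
    print(x, round(poisson_prob(x, lam), 4))

# The mean of the distribution matches the rate of the process
mean = sum(x * poisson_prob(x, lam) for x in range(100))
print("mean =", round(mean, 6))    # 3.0
```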

To derive this probability function, we use the fact that

\[ X \;\; \sim \; \; \lim_{k \to \infty} \BinomDistn\left(k,\; \frac {\lambda} k\right) \]

The probability function is therefore the limit of these binomial probability functions,

\[ p(x) \;\;=\;\; \lim_{k \to \infty} p_k(x) \]

where

\[ \begin{align} p_k(x) \;\;&=\;\; {k \choose x} \left(\frac {\lambda} k \right)^x \left(1 - \frac {\lambda} k \right)^{k-x} \\ &=\;\; \frac{k!} {(k-x)!\;x!} \times \frac {\lambda^x} {k^x} \left(1-\frac {\lambda} k \right)^{-x} \left(1-\frac {\lambda} k \right)^{k} \\[0.4em] &=\;\; \frac{k(k-1)\dots(k-x+1)} {k^x} \times \left(1-\frac {\lambda} k \right)^{-x} \frac {\lambda^x} {x!} \left(1-\frac {\lambda} k \right)^{k} \\[0.3em] &=\;\; \left(1 - \frac 1 k\right) \left(1 - \frac 2 k\right) \dots \left(1 - \frac {x-1} k\right) \times \left(1-\frac {\lambda} k \right)^{-x} \frac {\lambda^x} {x!} \left(1-\frac {\lambda} k \right)^{k} \end{align} \]

Since each of the terms \(\left(1 - \frac 1 k\right), \left(1 - \frac 2 k\right), \dots, \left(1 - \frac {x-1} k\right)\) and \(\left(1-\frac {\lambda} k \right)^{-x}\) has limit one as \(k \to \infty\),

\[ p(x) \;\;=\;\; \frac {\lambda^x} {x!} \times \lim_{k \to \infty} {\left(1-\frac {\lambda} k \right)^{k}} \]

You may recognise from mathematics that the limit on the right is \(e^{-\lambda}\). Otherwise, we can find the limit using the fact that the probability function of any discrete random variable sums to one, so

\[ \begin{align} \sum_{x=0}^{\infty}{ p(x)} \;\;&=\;\; \sum_{x=0}^{\infty} {\frac {\lambda^x} {x!}} \times \lim_{k \to \infty} {\left(1-\frac {\lambda} k \right)^{k}} \\ &=\;\; e^{\lambda} \times \lim_{k \to \infty} {\left(1-\frac {\lambda} k \right)^{k}} \quad\quad \text{(sum of exponential series)} \\ &=\;\;1 \end{align} \]

The limit must therefore be \(e^{-\lambda}\), which confirms the probability function stated above. A distribution of this form is called a Poisson distribution.
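Both facts used above are easy to check numerically. In this sketch (again assuming Python), \(\left(1-\frac{\lambda}{k}\right)^k\) approaches \(e^{-\lambda}\) as \(k\) increases, and the Poisson probabilities sum to one.

```python
from math import exp, factorial

lam = 3
for k in (10, 100, 1_000, 10_000):
    print(k, (1 - lam / k) ** k)          # approaches exp(-lam)
print("e^(-lam) =", exp(-lam))

# The Poisson probabilities sum to one (truncating the infinite sum)
total = sum(lam ** x * exp(-lam) / factorial(x) for x in range(100))
print("sum of p(x) =", total)
```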