Sum of independent exponential variables
In a homogeneous Poisson process, the time until the \(k\)th event is
\[ X \;\; = \; \; \sum_{i=1}^k {Y_i} \]where \(Y_1\) is the time to the first event, \(Y_2\) is the time from the first event to the second, and so on. From the memoryless property of a homogeneous Poisson process, the \(\{Y_i\}\) are independent and they all have \(\ExponDistn(\lambda)\) distributions.
Since \(X\) is the sum of a random sample of size \(k\) from an exponential distribution, we can use general results about the sum of a random sample to find its mean and variance.
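As an illustrative sanity check (not part of the original text; the values of \(\lambda\), \(k\) and the sample size are chosen arbitrarily), the following simulation builds each waiting time as a sum of \(k\) independent exponential gaps and compares the empirical mean and variance with the Erlang values \(k/\lambda\) and \(k/\lambda^2\):

```python
import random

random.seed(1)
lam = 2.0    # event rate, lambda (illustrative value)
k = 5        # wait for the 5th event
n = 100_000  # number of simulated waiting times

# Time to the k'th event = sum of k independent Exponential(lambda) gaps
samples = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(f"simulated mean {mean:.3f}  vs  k/lambda   = {k/lam:.3f}")
print(f"simulated var  {var:.3f}  vs  k/lambda^2 = {k/lam**2:.3f}")
```

Note that `random.expovariate` takes the rate \(\lambda\) directly, so each gap has mean \(1/\lambda\).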
Mean and variance of Erlang distribution
If a random variable, \(X\), has an Erlang distribution with probability density function
\[ f(x) \;\;=\;\; \begin{cases} \dfrac{\lambda^k}{(k-1)!} x^{k-1} e^{-\lambda x} & x \gt 0 \\[0.3em] 0 & \text{otherwise} \end{cases} \]its mean and variance are
\[ E[X] \;=\; \frac k{\lambda}\spaced{and} \Var(X) \;=\; \frac k{\lambda^2} \](Proved in full version)
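Although the full proof is omitted here, these values follow immediately from writing \(X\) as the sum of \(k\) independent \(\ExponDistn(\lambda)\) variables, each with mean \(1/\lambda\) and variance \(1/\lambda^2\):

\[ E[X] \;=\; \sum_{i=1}^k E[Y_i] \;=\; \frac k{\lambda} \spaced{and} \Var(X) \;=\; \sum_{i=1}^k \Var(Y_i) \;=\; \frac k{\lambda^2} \]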
A final useful property of Erlang distributions is that adding together two independent Erlang variables (with the same rate \(\lambda\)) results in a variable that also has an Erlang distribution.
Additive property of Erlang distributions
If \(X_1 \sim \ErlangDistn(k_1,\; \lambda)\) and \(X_2 \sim \ErlangDistn(k_2,\; \lambda)\) are independent, then
\[ X_1 + X_2 \;\;\sim\;\; \ErlangDistn(k_1 + k_2,\; \lambda) \](Proved in full version)
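The additive property can also be checked by simulation (again an illustrative sketch; \(\lambda\), \(k_1\), \(k_2\) and the sample size are arbitrary). Since \(\ErlangDistn(k,\; \lambda)\) is a Gamma distribution with shape \(k\) and scale \(1/\lambda\), we can draw the two variables with the standard library's `random.gammavariate` and compare the sum's moments against \(\ErlangDistn(k_1 + k_2,\; \lambda)\):

```python
import random

random.seed(2)
lam = 1.5        # common rate parameter (illustrative value)
k1, k2 = 3, 4    # shape parameters of the two independent Erlang variables
n = 100_000

# Erlang(k, lambda) = Gamma(shape=k, scale=1/lambda)
x = [random.gammavariate(k1, 1 / lam) + random.gammavariate(k2, 1 / lam)
     for _ in range(n)]

mean = sum(x) / n
var = sum((v - mean) ** 2 for v in x) / n

k = k1 + k2
print(f"simulated mean {mean:.3f}  vs  (k1+k2)/lambda   = {k/lam:.3f}")
print(f"simulated var  {var:.3f}  vs  (k1+k2)/lambda^2 = {k/lam**2:.3f}")
```

The empirical mean and variance of the sum match \((k_1+k_2)/\lambda\) and \((k_1+k_2)/\lambda^2\), as the additive property predicts.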