We now give formulae for the mean and variance of the Gamma distribution.

Mean and variance

If a random variable \(X\) has a Gamma distribution with probability density function

\[ f(x) \;\;=\;\; \begin{cases} \dfrac {\beta^\alpha }{\Gamma(\alpha)} x^{\alpha - 1} e^{-\beta x}& \text{if }x > 0 \\ 0 & \text{otherwise} \end{cases} \]

then its mean and variance are

\[ E[X] \;=\; \frac{\alpha}{\beta} \spaced{and} \Var(X) \;=\; \frac{\alpha}{\beta^2} \]

(Proved in full version)
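A sketch of the omitted argument, using the identity \(\Gamma(\alpha + 1) = \alpha\,\Gamma(\alpha)\) and the integral \(\int_0^\infty x^{a-1} e^{-\beta x}\,dx = \Gamma(a)/\beta^a\):

\[ E[X] \;=\; \int_0^\infty x \; \frac{\beta^\alpha}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\beta x} \; dx \;=\; \frac{\beta^\alpha}{\Gamma(\alpha)} \times \frac{\Gamma(\alpha + 1)}{\beta^{\alpha + 1}} \;=\; \frac{\alpha}{\beta} \]

Similarly \(E[X^2] = \dfrac{\Gamma(\alpha + 2)}{\beta^2\,\Gamma(\alpha)} = \dfrac{\alpha(\alpha + 1)}{\beta^2}\), so \(\Var(X) = E[X^2] - E[X]^2 = \dfrac{\alpha}{\beta^2}\).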

The sum of independent \(\ErlangDistn(k_1,\; \lambda)\) and \(\ErlangDistn(k_2,\; \lambda)\) random variables has an \(\ErlangDistn(k_1 + k_2,\; \lambda)\) distribution. The same additive property holds for Gamma random variables, provided they share the same rate parameter \(\beta\).

Additive property of Gamma distributions

If \(X_1 \sim \GammaDistn(\alpha_1,\; \beta)\) and \(X_2 \sim \GammaDistn(\alpha_2,\; \beta)\) are independent, then

\[ X_1 + X_2 \;\;\sim\;\; \GammaDistn(\alpha_1 + \alpha_2,\; \beta) \]

(Not proved)
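A quick Monte Carlo sanity check of the additive property, using only the Python standard library. The parameter values \(\alpha_1 = 2\), \(\alpha_2 = 5\), \(\beta = 3\) are illustrative choices, not taken from the text; note that `random.gammavariate` is parameterised by shape and *scale*, so the scale is \(1/\beta\) for rate \(\beta\):

```python
import random
import statistics

random.seed(1)
alpha1, alpha2, beta = 2.0, 5.0, 3.0   # illustrative parameters
n = 100_000

# random.gammavariate(shape, scale); scale = 1/beta for rate beta
x1 = [random.gammavariate(alpha1, 1 / beta) for _ in range(n)]
x2 = [random.gammavariate(alpha2, 1 / beta) for _ in range(n)]
s = [a + b for a, b in zip(x1, x2)]

# Gamma(alpha1 + alpha2, beta) has mean (alpha1 + alpha2) / beta = 7/3
# and variance (alpha1 + alpha2) / beta**2 = 7/9
print(statistics.fmean(s))      # close to 2.333
print(statistics.pvariance(s))  # close to 0.778
```

Matching the first two moments does not of course prove the distributional result; it only checks that the simulated sum is consistent with a \(\GammaDistn(7,\; 3)\) distribution.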

When \(\alpha\) is a positive integer, a \(\GammaDistn(\alpha,\; \beta)\) random variable can be written as a sum of \(\alpha\) independent exponential random variables, so the Central Limit Theorem gives a normal approximation to the Gamma distribution when \(\alpha\) is large.

Asymptotic normal distribution

The shape of the \(\GammaDistn(\alpha,\; \beta)\) distribution approaches that of a normal distribution (with matching mean \(\frac{\alpha}{\beta}\) and variance \(\frac{\alpha}{\beta^2}\)) as \(\alpha \to \infty\).

(Proved in full version)
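An informal illustration of the limit, again with only the standard library. The values \(\alpha = 400\), \(\beta = 2\) are arbitrary illustrative choices; the samples are standardised by the Gamma mean \(\alpha/\beta\) and standard deviation \(\sqrt{\alpha}/\beta\), and the fraction below one standard deviation is compared with the standard normal value \(\Phi(1) \approx 0.8413\):

```python
import random

random.seed(2)
alpha, beta = 400.0, 2.0      # large shape parameter, illustrative values
n = 200_000

mean, sd = alpha / beta, alpha ** 0.5 / beta
# scale = 1/beta because random.gammavariate takes shape and scale
z = [(random.gammavariate(alpha, 1 / beta) - mean) / sd for _ in range(n)]

# For a standard normal, P(Z <= 1) is about 0.8413
frac = sum(1 for v in z if v <= 1.0) / n
print(frac)
```

For smaller \(\alpha\) the Gamma distribution is noticeably right-skewed (its skewness is \(2/\sqrt{\alpha}\)), so the agreement deteriorates as \(\alpha\) decreases.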