We now give formulae for the mean and variance of the Gamma distribution.

Mean and variance

If a random variable \(X\) has a Gamma distribution with probability density function

\[ f(x) \;\;=\;\; \begin{cases} \dfrac {\beta^\alpha }{\Gamma(\alpha)} x^{\alpha - 1} e^{-x\beta}& \text{if }x \gt 0 \\ 0 & \text{otherwise} \end{cases} \]

then its mean and variance are

\[ E[X] \;=\; \frac{\alpha}{\beta} \spaced{and} \Var(X) \;=\; \frac{\alpha}{\beta^2} \]
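As an aside, these formulae can be checked numerically. The sketch below is illustrative only and not part of the derivation; it assumes Python with numpy and scipy, and uses the fact that scipy.stats.gamma is parameterised by a shape \(a = \alpha\) and a scale, so the scale must be set to \(1/\beta\).

```python
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

alpha, beta = 2.5, 1.5   # arbitrary example parameters

# Density as written above (rate parameterisation)
def f(x):
    return beta**alpha / gamma_fn(alpha) * x**(alpha - 1) * np.exp(-beta * x)

# scipy uses a shape/scale parameterisation, so scale = 1/beta
dist = stats.gamma(a=alpha, scale=1/beta)

x = np.linspace(0.01, 10, 5)
print(np.allclose(f(x), dist.pdf(x)))   # True: same density
print(dist.mean(), alpha / beta)        # both 1.666...
print(dist.var(), alpha / beta**2)      # both 1.111...
```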

The mean is

\[ \begin{align} E[X] \;&=\; \int_0^{\infty} {x \frac {\beta^\alpha }{\Gamma(\alpha)} x^{\alpha - 1} e^{-x\beta}} \;dx \\[0.4em] &=\; \frac {\beta^\alpha}{\Gamma(\alpha)} \int_0^{\infty} {x^{\alpha} e^{-x\beta}} \;dx \end{align} \]

We evaluate this integral with the change of variable

\[ y = \beta x \spaced{and} dy = \beta \;dx\]

so

\[ \begin{align} E[X] \;&=\; \frac 1{\beta\; \Gamma(\alpha)} \int_0^{\infty} {y^{\alpha} e^{-y}} \;dy \\[0.4em] &=\; \frac {\Gamma(\alpha + 1)}{\beta\; \Gamma(\alpha)} \;=\; \frac {\alpha}{\beta} \end{align} \]

The final step uses the recursive property of the gamma function, \(\Gamma(\alpha + 1) = \alpha \, \Gamma(\alpha)\).
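For illustration (a numerical aside, assuming scipy is available), the integral obtained after the change of variable can be evaluated directly and compared with \(\Gamma(\alpha + 1) = \alpha \, \Gamma(\alpha)\).

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

alpha = 2.5   # arbitrary example value

# Integral appearing after the substitution y = beta * x
integral, _ = quad(lambda y: y**alpha * np.exp(-y), 0, np.inf)

print(integral)                  # approx 3.323
print(gamma_fn(alpha + 1))       # Gamma(alpha + 1): same value
print(alpha * gamma_fn(alpha))   # recursion Gamma(alpha + 1) = alpha * Gamma(alpha)
```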

In a similar way,

\[ E[X^2] \;=\; \frac {\Gamma(\alpha + 2)}{\beta^2 \; \Gamma(\alpha)} \;=\; \frac {\alpha(\alpha + 1)}{\beta^2} \]

so

\[ \Var(X) \;=\; E[X^2] - \left(E[X]\right)^2 \;=\; \frac {\alpha(\alpha + 1)}{\beta^2} - \frac{\alpha^2}{\beta^2} \;=\; \frac{\alpha}{\beta^2} \]
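The same results can be checked by simulation. The sketch below (illustrative only, assuming numpy) draws a large sample from the \(\GammaDistn(\alpha,\; \beta)\) distribution and compares the sample mean and variance with \(\alpha/\beta\) and \(\alpha/\beta^2\).

```python
import numpy as np

alpha, beta = 2.5, 1.5            # arbitrary example parameters
rng = np.random.default_rng(0)

# numpy's gamma sampler uses a shape/scale parameterisation, so scale = 1/beta
x = rng.gamma(shape=alpha, scale=1/beta, size=1_000_000)

print(x.mean(), alpha / beta)     # both close to 1.667
print(x.var(),  alpha / beta**2)  # both close to 1.111
```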

We explained earlier that the sum of independent \(\ErlangDistn(k_1,\; \lambda)\) and \(\ErlangDistn(k_2,\; \lambda)\) random variables has an \(\ErlangDistn(k_1 + k_2,\; \lambda)\) distribution. The same holds for the sum of Gamma random variables, provided their second parameters are equal.

Additive property of Gamma distributions

If \(X_1 \sim \GammaDistn(\alpha_1,\; \beta)\) and \(X_2 \sim \GammaDistn(\alpha_2,\; \beta)\) are independent, then

\[ X_1 + X_2 \;\;\sim\;\; \GammaDistn(\alpha_1 + \alpha_2,\; \beta) \]

We justified this earlier for integer values of \(\alpha\) in the context of the Erlang distribution, but cannot give the general proof here.
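Although the general proof is omitted, the additive property can be illustrated by simulation. The sketch below (illustrative only, assuming numpy and scipy) adds independent \(\GammaDistn(\alpha_1,\; \beta)\) and \(\GammaDistn(\alpha_2,\; \beta)\) samples and compares the result with the claimed \(\GammaDistn(\alpha_1 + \alpha_2,\; \beta)\) distribution using a Kolmogorov–Smirnov statistic.

```python
import numpy as np
from scipy import stats

alpha1, alpha2, beta = 1.7, 3.2, 0.8   # arbitrary example parameters
rng = np.random.default_rng(1)
n = 200_000

x1 = rng.gamma(shape=alpha1, scale=1/beta, size=n)
x2 = rng.gamma(shape=alpha2, scale=1/beta, size=n)

# Compare the simulated sum with the Gamma(alpha1 + alpha2, beta) distribution
target = stats.gamma(a=alpha1 + alpha2, scale=1/beta)
result = stats.kstest(x1 + x2, target.cdf)
print(result.statistic, result.pvalue)
```

A small KS statistic (and a p-value that is not small) is consistent with the sum having the stated Gamma distribution.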

The following result gives a normal approximation to the Gamma distribution when \(\alpha\) is large.

Asymptotic normal distribution

The shape of the \(\GammaDistn(\alpha,\; \beta)\) distribution approaches that of a normal distribution as \(\alpha \to \infty\).

If \(X_1, X_2, \dots\) are independent random variables, each with a \(\GammaDistn(1,\; \beta)\) distribution, then \(\sum_{i=1}^{n}X_i \sim \GammaDistn(n,\; \beta)\) by the additive property above. From the Central Limit Theorem, the distribution of this sum approaches a normal distribution as \(n \to \infty\).
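This limiting behaviour can also be seen numerically. The sketch below (illustrative only, assuming numpy and scipy) compares the \(\GammaDistn(\alpha,\; \beta)\) cumulative distribution function with that of a normal distribution having the same mean \(\alpha/\beta\) and variance \(\alpha/\beta^2\); the maximum difference shrinks as \(\alpha\) increases.

```python
import numpy as np
from scipy import stats

beta = 2.0
for alpha in (2, 10, 100, 1000):
    g = stats.gamma(a=alpha, scale=1/beta)
    z = stats.norm(loc=alpha/beta, scale=np.sqrt(alpha)/beta)  # matching mean and sd

    x = np.linspace(g.ppf(0.001), g.ppf(0.999), 2000)
    # Maximum discrepancy between the two cdfs over the bulk of the distribution
    print(alpha, np.max(np.abs(g.cdf(x) - z.cdf(x))))
```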