The mean of a \(\ChiSqrDistn(k\;\text{df})\) random variable can be derived from its definition as the sum of \(k\) squared independent standard normal variables,

\[ E[Y] \;=\; E\left[\sum_{i=1}^k {Z_i^2}\right] \;=\; \sum_{i=1}^k E\left[Z_i^2\right] \;=\; \sum_{i=1}^k \Var(Z_i) \;=\; k \]

since \(E[Z_i] = 0\), so \(E\left[Z_i^2\right] = \Var(Z_i) = 1\).

However, it is easier to find the distribution's mean and variance from its equivalence to a Gamma distribution.

Mean and variance

If \(Y \sim \ChiSqrDistn(k\;\text{df})\), its mean and variance are

\[ E[Y] = k \spaced{and} \Var(Y) = 2k\]

The mean and variance of a \(\GammaDistn(\alpha,\; \beta)\) distribution are

\[ E[Y] \;=\; \frac{\alpha}{\beta} \spaced{and} \Var(Y) \;=\; \frac{\alpha}{\beta^2} \]

A \(\ChiSqrDistn(k\;\text{df})\) distribution is equivalent to a \(\GammaDistn(\frac k 2,\; \frac 1 2)\) distribution, so its mean and variance can be found by substituting \(\alpha = \frac k 2\) and \(\beta = \frac 1 2\), giving \(E[Y] = \frac{k/2}{1/2} = k\) and \(\Var(Y) = \frac{k/2}{(1/2)^2} = 2k\).
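These values can be checked by simulation, using the definition of the distribution as a sum of squared standard normals. The following is a sketch using only Python's standard library; the choice of \(k = 5\) and the sample size are arbitrary.

```python
import random
import statistics

random.seed(1)

k = 5          # degrees of freedom
n = 200_000    # number of simulated chi-squared values

# Each chi-squared(k df) value is a sum of k squared standard normal draws
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(n)]

mean = statistics.fmean(samples)      # should be close to k = 5
var = statistics.variance(samples)    # should be close to 2k = 10
print(round(mean, 2), round(var, 2))
```

The simulated mean and variance should be near \(k\) and \(2k\), in agreement with the Gamma-based derivation above.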

Shape of the distribution

The \(\ChiSqrDistn(k\;\text{df})\) distribution is skewed, with a longer tail to the right. However, since we defined the distribution to be that of

\[ Y = \sum_{i=1}^k {Z_i^2} \]

where the \(\{Z_i^2\}\) are independent and identically distributed with \(\ChiSqrDistn(1\;\text{df})\) distributions, the Central Limit Theorem shows that the distribution approaches the shape of a Normal distribution as \(k \to \infty\).
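This approach to Normality can also be seen in simulation: the sample skewness of simulated chi-squared values shrinks toward zero as \(k\) grows. A sketch, again using only the standard library (the values of \(k\) and the sample sizes are arbitrary):

```python
import random
import statistics


def chi2_sample(k, n):
    """Simulate n draws from chi-squared(k df) as sums of k squared normals."""
    return [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(n)]


def skewness(xs):
    """Sample skewness: third central moment divided by sd cubed."""
    m = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    return statistics.fmean([(x - m) ** 3 for x in xs]) / sd ** 3


random.seed(1)
for k in (2, 10, 50):
    # Skewness decreases toward 0 (the Normal value) as k increases
    print(k, round(skewness(chi2_sample(k, 50_000)), 2))
```

For a chi-squared distribution the theoretical skewness is \(\sqrt{8/k}\), so the printed values should decrease as \(k\) increases, consistent with the Central Limit Theorem argument above.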


The diagram below shows the chi-squared distribution for different values of its degrees of freedom, \(k\).

Note the following properties of the distribution.