Two independent normal variables

For any two independent random variables, \(X\) and \(Y\), with means \(\mu_X\) and \(\mu_Y\) and variances \(\sigma_X^2\) and \(\sigma_Y^2\),

\[ \begin {align} E[aX + bY] & = a\mu_X + b\mu_Y \\[0.5em] \Var(aX + bY) & = a^2\sigma_X^2 + b^2\sigma_Y^2 \end {align} \]
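These mean and variance formulas hold for any independent variables, normal or not. As a quick numerical check, the following simulation sketch (all parameter values are illustrative, not from the text) deliberately uses two non-normal distributions with the stated moments:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative parameters (not from the text)
mu_x, mu_y = 3.0, -1.0
sigma_x, sigma_y = 2.0, 0.5
a, b = 2.0, 3.0

# Independence is all that is needed, so use non-normal distributions:
# a shifted exponential with mean mu_x and variance sigma_x^2, and a
# uniform with mean mu_y and variance sigma_y^2.
x = rng.exponential(scale=sigma_x, size=n) + (mu_x - sigma_x)
half_width = np.sqrt(3) * sigma_y
y = rng.uniform(mu_y - half_width, mu_y + half_width, size=n)

z = a * x + b * y

# Theory: E[aX + bY] = a*mu_x + b*mu_y = 3.0
#         Var(aX + bY) = a^2*sigma_x^2 + b^2*sigma_y^2 = 18.25
print(z.mean(), z.var())
```

With a million simulated values, the sample mean and variance of \(aX + bY\) land very close to the theoretical values.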

When \(X\) and \(Y\) have normal distributions, we can be more precise about the shape of the distribution of \(aX + bY\).

Linear function of independent normal variables

If \(X\) and \(Y\) are independent random variables,

\[ \begin {align} X \;&\sim\; \NormalDistn(\mu_X,\; \sigma_X^2) \\ Y \;&\sim\; \NormalDistn(\mu_Y,\; \sigma_Y^2) \end {align} \]

then

\[ aX + bY \;\sim\; \NormalDistn(a\mu_X + b\mu_Y,\; a^2\sigma_X^2 + b^2\sigma_Y^2) \]
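The result says more than that the mean and variance are \(a\mu_X + b\mu_Y\) and \(a^2\sigma_X^2 + b^2\sigma_Y^2\): the whole distribution of \(aX + bY\) is normal. A simulation sketch can check this by comparing empirical quantiles of \(aX + bY\) against the quantiles of the claimed normal distribution (parameters here are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Illustrative parameters (not from the text)
mu_x, sigma_x = 10.0, 3.0
mu_y, sigma_y = 5.0, 4.0
a, b = 1.0, -2.0

x = rng.normal(mu_x, sigma_x, n)
y = rng.normal(mu_y, sigma_y, n)
z = a * x + b * y

# Claimed distribution: NormalDistn(a*mu_x + b*mu_y, a^2*sigma_x^2 + b^2*sigma_y^2)
mu_z = a * mu_x + b * mu_y                        # 0.0
sd_z = np.sqrt(a**2 * sigma_x**2 + b**2 * sigma_y**2)  # sqrt(73)

# Standard normal quantiles at p = 0.025, 0.25, 0.5, 0.75, 0.975
std_q = np.array([-1.959964, -0.674490, 0.0, 0.674490, 1.959964])
theory_q = mu_z + sd_z * std_q
empirical_q = np.quantile(z, [0.025, 0.25, 0.5, 0.75, 0.975])

print(theory_q)
print(empirical_q)
```

If \(aX + bY\) is indeed normal, the two sets of quantiles should agree throughout the distribution, not just near the centre.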

Random sample

This can be extended to the sum of values in a normal random sample.

Sum of a random sample

If \(\{X_1, X_2, ..., X_n\}\) is a random sample of \(n\) values from a \(\NormalDistn(\mu,\; \sigma^2)\) distribution, then

\[ \sum_{i=1}^n {X_i} \;\sim\; \NormalDistn(n\mu,\; n\sigma^2) \]

(Proved in full version)
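This result can be checked by simulating many random samples and examining the sums (sample size and parameters below are illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(2)
reps, n = 200_000, 10
mu, sigma = 50.0, 8.0  # illustrative parameters (not from the text)

# Each row is one random sample of n values; sum along each row
samples = rng.normal(mu, sigma, size=(reps, n))
sums = samples.sum(axis=1)

# Theory: sum ~ NormalDistn(n*mu, n*sigma^2)
#   mean = n*mu = 500,  sd = sqrt(n)*sigma = sqrt(10)*8 ≈ 25.3
print(sums.mean(), sums.std())
```

Note that the standard deviation of the sum grows like \(\sqrt{n}\,\sigma\), not \(n\sigma\), because the variances (not the standard deviations) add.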

A similar result holds for the mean of a random sample from a normal distribution.

Mean of a random sample

If \(\{X_1, X_2, ..., X_n\}\) is a random sample of \(n\) values from a \(\NormalDistn(\mu,\; \sigma^2)\) distribution, then

\[ \overline{X} \;\sim\; \NormalDistn\left(\mu,\; \frac {\sigma^2}{n}\right) \]
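Since \(\overline{X} = \frac{1}{n}\sum_{i=1}^n X_i\), this follows from the previous result with \(a = \frac{1}{n}\): the mean is \(\frac{1}{n}(n\mu) = \mu\) and the variance is \(\frac{1}{n^2}(n\sigma^2) = \frac{\sigma^2}{n}\). A simulation sketch with illustrative parameters (not from the text) confirms it:

```python
import numpy as np

rng = np.random.default_rng(3)
reps, n = 200_000, 25
mu, sigma = 50.0, 8.0  # illustrative parameters (not from the text)

# Each row is one random sample of n values; average along each row
means = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

# Theory: X-bar ~ NormalDistn(mu, sigma^2 / n)
#   mean = 50,  sd = sigma / sqrt(n) = 8 / 5 = 1.6
print(means.mean(), means.std())
```

The sample means cluster around \(\mu\) with standard deviation \(\sigma/\sqrt{n}\), which is why larger samples give more precise estimates of \(\mu\).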