Before finding the mean and variance of a general normal distribution, we derive those of the standard normal distribution.

Mean and variance of the standard normal distribution

If \(Z \sim \NormalDistn(0,\; 1)\), its mean and variance are

\[ E[Z] \;=\; 0 \spaced{and} \Var(Z) \;=\; 1 \]

The standard normal distribution's mean is

\[ \begin{align} E[Z] \;&=\; \int_{-\infty}^{\infty} z \times \frac 1{\sqrt{2\pi}} e^{- \frac{\large z^2}{\large 2}} dz \\ &=\; \frac 1{\sqrt{2\pi}} \left(\int_0^{\infty} z \;e^{- \frac{\large z^2}{\large 2}} dz \;+ \int_{-\infty}^0 z \;e^{- \frac{\large z^2}{\large 2}} dz \right) \end{align} \]

With a change of variable \(w = -z\) and \(dw = -dz\) in the second integral,

\[ E[Z] \;=\; \frac 1{\sqrt{2\pi}} \left(\int_0^{\infty} z \;e^{- \frac{\large z^2}{\large 2}} dz \;- \int_0^{\infty} w \;e^{- \frac{\large w^2}{\large 2}} dw \right) \]

The two integrals in the brackets are identical, but before cancelling them we must check that they are finite. A change of variable \(v = \frac {z^2}2\) lets us evaluate the left-hand integral. Since \(z = \sqrt{2v}\) and \(dv = z\,dz\),

\[ \int_0^{\infty} z \;e^{- \frac{\large z^2}{\large 2}} dz \;=\; \int_0^{\infty} e^{-v} dv \;=\; 1 \]

Both integrals are therefore finite, so they cancel and \(E[Z] = 0\).
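As a quick numerical sanity check of these two results (not part of the proof), the integrals can be evaluated with SciPy's `quad`; this is a minimal sketch assuming NumPy and SciPy are available.

```python
import numpy as np
from scipy import integrate

# Half-line integral of z * exp(-z^2/2): evaluated above as 1
half_line, _ = integrate.quad(lambda z: z * np.exp(-z**2 / 2), 0, np.inf)

# Mean of the standard normal: integral of z times its pdf over the real line
mean_z, _ = integrate.quad(
    lambda z: z * np.exp(-z**2 / 2) / np.sqrt(2 * np.pi), -np.inf, np.inf
)

print(half_line)  # approximately 1.0
print(mean_z)     # approximately 0.0
```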

To find the variance of \(Z\),

\[ \begin{align} \Var(Z) \;=\; E[Z^2] - \left(E[Z]\right)^2 \;&=\; E[Z^2] \\[0.4em] &=\; \int_{-\infty}^{\infty} z^2 \times \frac 1{\sqrt{2\pi}}\;e^{- \frac{\large z^2}{\large 2}} dz \\ &=\; \frac 2{\sqrt{2\pi}} \int_0^{\infty} z^2 \;e^{- \frac{\large z^2}{\large 2}} dz \end{align} \]

Applying the same change of variable \(v = \frac {z^2}2\) as above,

\[ \begin{align} \Var(Z) \;&=\; \frac 2{\sqrt{2\pi}} \int_0^{\infty} \sqrt{2} \times v^{0.5} \;e^{-v} dv \\ &=\; \frac 2{\sqrt{2\pi}} \times \sqrt{2} \times \Gamma(1.5) \\ &=\; \frac 2{\sqrt{\pi}} \times 0.5 \times \Gamma(0.5) \\ &=\; \frac 1{\sqrt{\pi}} \times \sqrt{\pi} \;=\; 1 \end{align} \]
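The gamma-function steps can also be checked numerically; a minimal sketch assuming SciPy's `gamma` and `quad` are available:

```python
import numpy as np
from scipy import integrate, special

# Gamma(1.5) = 0.5 * Gamma(0.5) = 0.5 * sqrt(pi)
print(special.gamma(1.5), 0.5 * np.sqrt(np.pi))   # both approximately 0.8862

# Var(Z) via the half-line integral of z^2 * exp(-z^2/2)
half_line, _ = integrate.quad(lambda z: z**2 * np.exp(-z**2 / 2), 0, np.inf)
print(2 / np.sqrt(2 * np.pi) * half_line)          # approximately 1.0
```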

The mean and variance of a general normal distribution can be found from this result.

Mean and variance of a general normal distribution

If \(X \sim \NormalDistn(\mu,\; \sigma^2)\), its mean and variance are

\[ E[X] \;=\; \mu \spaced{and} \Var(X) \;=\; \sigma^2 \]

The normal distribution's mean is

\[ E[X] \;=\; \int_{-\infty}^{\infty} x \times \frac 1{\sqrt{2\pi}\;\sigma} e^{- \frac{\large (x-\mu)^2}{\large 2 \sigma^2}} dx \]

With a change of variable \(z = \frac{x-\mu}{\sigma}\) so \(x = \mu + z\sigma\) and \(dz = \frac{dx}{\sigma}\), this is

\[ \begin{align} E[X] \;&=\; \int_{-\infty}^{\infty} (\mu + z\sigma) \times \frac 1{\sqrt{2\pi}} e^{- \frac{\large z^2}{\large 2}} dz \\ &=\; \mu \int_{-\infty}^{\infty} \frac 1{\sqrt{2\pi}} e^{- \frac{\large z^2}{\large 2}} dz \;+\; \sigma \int_{-\infty}^{\infty} z \frac 1{\sqrt{2\pi}} e^{- \frac{\large z^2}{\large 2}} dz \end{align} \]

The first integral is the integral of the standard normal distribution's pdf over the whole real line and therefore equals one. The second integral is the standard normal distribution's mean, which was shown above to be zero, so \(E[X] = \mu\).
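A short numerical illustration of \(E[X] = \mu\), using arbitrary illustrative values \(\mu = 3\) and \(\sigma = 2\) (not from the text) and assuming SciPy is available:

```python
import numpy as np
from scipy import integrate

mu, sigma = 3.0, 2.0  # arbitrary values chosen only for this illustration

def pdf(x):
    # Normal(mu, sigma^2) probability density function
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

mean_x, _ = integrate.quad(lambda x: x * pdf(x), -np.inf, np.inf)
print(mean_x)  # approximately 3.0, i.e. mu
```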

The proof for the variance is similar, using the same change of variable \(z = \frac{x-\mu}{\sigma}\).

\[ \begin{align} \Var(X) = E\left[(X-\mu)^2\right] \;&=\; \int_{-\infty}^{\infty} (x-\mu)^2 \times \frac 1{\sqrt{2\pi}\;\sigma} e^{- \frac{\large (x-\mu)^2}{\large 2 \sigma^2}} dx \\ &=\; \int_{-\infty}^{\infty} (z^2\sigma^2) \times \frac 1{\sqrt{2\pi}} e^{- \frac{\large z^2}{\large 2}} dz \\ &=\; \sigma^2 \int_{-\infty}^{\infty} z^2 \frac 1{\sqrt{2\pi}} e^{- \frac{\large z^2}{\large 2}} dz \end{align} \]

This integral is the variance of the standard normal distribution, which was shown above to be one, so \(\Var(X) = \sigma^2\), completing the proof.
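The corresponding numerical check for \(\Var(X) = \sigma^2\), with the same illustrative values as above:

```python
import numpy as np
from scipy import integrate

mu, sigma = 3.0, 2.0  # same illustrative values as above

def pdf(x):
    # Normal(mu, sigma^2) probability density function
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

var_x, _ = integrate.quad(lambda x: (x - mu)**2 * pdf(x), -np.inf, np.inf)
print(var_x, sigma**2)  # both approximately 4.0
```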

These results explain why the symbols "\(\mu\)" and "\(\sigma^2\)" are used for the normal distribution's two parameters.