We now give an example.

Example: Normal distribution

Consider a random sample, \(\{X_1, \dots, X_n\}\), from a \(\NormalDistn(\mu, \sigma^2)\) distribution. The distribution's probability density function is

\[ f(x) \;\;=\;\; \frac 1{\sqrt{2\pi}\;\sigma} e^{-\frac{(x-\mu)^2}{2 \sigma^2}} \]

and its logarithm is

\[ \log f(x) \;\;=\;\; -\frac 1 2 \log(\sigma^2) - \frac{(x-\mu)^2}{2 \sigma^2} - \frac 1 2 \log(2\pi) \]

The log-likelihood function is therefore

\[ \ell(\mu, \sigma^2) \;\;=\;\; \sum_{i=1}^n {\log f(x_i)} \;\;=\;\; -\frac n 2 \log(\sigma^2) - \frac{\sum_{i=1}^n {(x_i-\mu)^2}}{2 \sigma^2} - \frac n 2 \log(2\pi) \]
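
As a quick numerical check, the sketch below evaluates this formula directly and confirms that it matches the sum of log-densities. It assumes NumPy and SciPy are available; the simulated sample, its parameters, and the name `log_likelihood` are illustrative choices, not part of the derivation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=50)  # illustrative sample (mu=5, sigma=2)

def log_likelihood(mu, sigma2, x):
    """Log-likelihood of a N(mu, sigma2) sample, from the formula above."""
    n = len(x)
    return (-n / 2 * np.log(sigma2)
            - np.sum((x - mu) ** 2) / (2 * sigma2)
            - n / 2 * np.log(2 * np.pi))

# Should agree with summing the log-density term by term
mu, sigma2 = 4.5, 3.0
print(log_likelihood(mu, sigma2, x))
print(stats.norm.logpdf(x, loc=mu, scale=np.sqrt(sigma2)).sum())
```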

To find the maximum likelihood estimates, we set both partial derivatives of the log-likelihood to zero:

\[ \frac{\partial \ell(\mu, \sigma^2)}{\partial \mu} \;\;=\;\; \frac{\sum_{i=1}^n {(x_i - \mu)}}{\sigma^2} \;\;=\;\; 0 \]

and

\[ \frac{\partial \ell(\mu, \sigma^2)}{\partial \sigma^2} \;\;=\;\; -\frac n {2 \sigma^2} + \frac{\sum_{i=1}^n {(x_i-\mu)^2}}{2 \sigma^4} \;\;=\;\; 0 \]

Solving these two equations simultaneously gives

\[ \begin{align} \hat{\mu} \;&=\; \overline{x} \\[0.2em] \hat{\sigma}^2 \;&=\; \frac {\sum_{i=1}^n {(x_i-\overline{x})^2}} n \end{align} \]
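
The sketch below maximises the log-likelihood numerically, by minimising its negative over \((\mu, \log \sigma^2)\), and confirms that the optimiser lands on the closed-form estimates above. Again, NumPy and SciPy are assumed, and the simulated sample is illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
x = rng.normal(loc=5.0, scale=2.0, size=100)

def neg_log_likelihood(theta, x):
    mu, log_sigma2 = theta          # optimise log(sigma^2) to keep sigma^2 > 0
    sigma2 = np.exp(log_sigma2)
    n = len(x)
    return (n / 2 * np.log(sigma2)
            + np.sum((x - mu) ** 2) / (2 * sigma2)
            + n / 2 * np.log(2 * np.pi))

res = minimize(neg_log_likelihood, x0=[0.0, 0.0], args=(x,))
mu_hat, sigma2_hat = res.x[0], np.exp(res.x[1])

# Compare with the closed-form MLEs from the derivation above
print(mu_hat, x.mean())                          # both approx. the sample mean
print(sigma2_hat, ((x - x.mean()) ** 2).mean())  # both approx. sum((x_i - xbar)^2)/n
```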

Note that the MLE of \(\sigma^2\) is biased: its expected value is \(\frac{n-1}{n}\sigma^2\), slightly smaller than \(\sigma^2\). The sample variance, \(S^2\), which divides by \((n-1)\) instead of \(n\), is unbiased and is usually preferred.
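
A short simulation makes the bias visible. The sketch assumes NumPy, and the values of \(n\), \(\sigma^2\), and the number of replicates are illustrative: with \(n = 10\) and \(\sigma^2 = 4\), the MLE averages near \(\frac{n-1}{n}\sigma^2 = 3.6\), while \(S^2\) averages near \(4.0\).

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma2, reps = 10, 4.0, 100_000

# Many independent samples of size n from N(0, sigma2)
samples = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(reps, n))
xbar = samples.mean(axis=1, keepdims=True)
ss = ((samples - xbar) ** 2).sum(axis=1)   # sum of squared deviations per sample

print(ss.mean() / n)        # MLE: averages near (n-1)/n * sigma2 = 3.6
print(ss.mean() / (n - 1))  # S^2: averages near sigma2 = 4.0
```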