Finding a good parameter estimate
The previous section described the characteristics of a good estimator of an unknown parameter, but did not provide a general method for finding such an estimator.
We now introduce a simple method for estimating a parameter \(\theta\), based on a random sample from a distribution whose shape depends on \(\theta\). If the distribution's mean depends on \(\theta\), we choose the value of \(\theta\) that makes the distribution's mean equal to the mean of the random sample, \(\overline{X}\).
Definition
If \(\{X_1, X_2, \dots, X_n\}\) is a random sample from a distribution whose mean, \(\mu(\theta)\), depends on an unknown parameter, \(\theta\), the method of moments estimator of \(\theta\) is the solution to the equation
\[ \mu(\theta) = \overline{X} \]
We now illustrate this with a simple example.
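In simple cases the equation \(\mu(\theta) = \overline{X}\) can be solved algebraically, but it can also be solved numerically. The sketch below (an illustration, not part of the text) assumes an exponential distribution with rate \(\theta\), whose mean is \(\mu(\theta) = 1/\theta\), and finds the method of moments estimate by bisection; the sample values are made up for the example.

```python
# Method of moments: numerically solve mu(theta) = xbar for theta.
# Illustrative assumption: X ~ Exponential(rate = theta), so mu(theta) = 1/theta.

def mu(theta):
    return 1.0 / theta  # mean of an Exponential(rate=theta) distribution


def method_of_moments(xbar, lo=1e-6, hi=1e6, tol=1e-10):
    """Solve mu(theta) = xbar by bisection.

    Assumes mu is monotone on [lo, hi] and the root lies inside that interval.
    """
    f = lambda t: mu(t) - xbar
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:   # root is in the left half
            hi = mid
        else:                     # root is in the right half
            lo = mid
    return 0.5 * (lo + hi)


# Hypothetical data with sample mean xbar = 1.0
sample = [0.8, 1.9, 0.4, 1.2, 0.7]
xbar = sum(sample) / len(sample)

theta_hat = method_of_moments(xbar)
print(theta_hat)    # numerically close to the algebraic solution 1/xbar
```

For this distribution the algebraic solution is \(\hat{\theta} = 1/\overline{x}\), so the numeric answer can be checked directly.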
Estimating a normal distribution's mean (\(\sigma\) known)
The histogram below shows a random sample from a normal distribution whose standard deviation, \(\sigma\), is a known value.
\[ X \;\; \sim \; \; \NormalDistn(\mu,\; \sigma=0.6) \]
The method of moments estimator of \(\mu\) is the value that makes the normal distribution's mean equal to the mean of the data set. Since the distribution's mean is \(\mu\) itself, the method of moments estimate is simply
\[ \hat{\mu} = \overline{x} \]
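This can be checked by simulation. The sketch below (values assumed for illustration, not taken from the histogram) draws a sample from a normal distribution with known \(\sigma = 0.6\) and an assumed true mean of 2.0, then computes the method of moments estimate as the sample mean.

```python
import random

# Assumed values for illustration: true mean 2.0, known sigma 0.6.
random.seed(1)
mu_true, sigma = 2.0, 0.6
sample = [random.gauss(mu_true, sigma) for _ in range(200)]

# Since the distribution's mean is mu itself, the method of moments
# estimate is just the sample mean.
mu_hat = sum(sample) / len(sample)
print(mu_hat)  # should be close to mu_true = 2.0
```

With 200 observations the estimate typically lands within a few hundredths of the true mean, since the standard deviation of \(\overline{X}\) is \(\sigma/\sqrt{n} \approx 0.042\).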