A simple way to estimate a single unknown parameter from a random sample is the method of moments. For both discrete and continuous distributions, the method of moments estimate is the parameter value that makes the distribution's mean equal to the mean of the random sample, so it is the solution to the equation

\[ E[X] \;\; = \; \; \overline{X} \]
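When this equation cannot be rearranged by hand, it can be solved numerically. The sketch below is only an illustration, not part of the example that follows: it assumes an exponential distribution with rate \(\lambda\), for which \(E[X] = 1/\lambda\), and finds the \(\lambda\) that matches the sample mean (here the closed-form answer would be \(1/\overline{X}\), which provides a check). The sample values are made up.

```python
import numpy as np
from scipy.optimize import brentq

# Hedged sketch: generic method of moments for a one-parameter distribution,
# solving E[X] = x-bar numerically.  As an illustrative assumption, the
# distribution here is exponential with rate lam, so E[X] = 1 / lam.
def dist_mean(lam):
    return 1.0 / lam

def method_of_moments(sample, lo=1e-9, hi=1e9):
    xbar = np.mean(sample)
    # Find the root of dist_mean(lam) - x-bar on the bracket [lo, hi].
    return brentq(lambda lam: dist_mean(lam) - xbar, lo, hi)

sample = [0.8, 2.1, 0.4, 1.5, 3.2]      # arbitrary made-up data
print(method_of_moments(sample))         # approximately 1 / mean(sample) = 0.625
```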

German tank problem

Consider a rectangular distribution,

\[ X \;\; \sim \; \; \RectDistn(0, \beta) \]

where the upper limit, \(\beta\), is an unknown parameter. The distribution's mean and variance are

\[ E[X] \;\; = \; \; \frac {\beta} 2 \spaced{and} \Var(X) = \frac {\beta^2} {12} \]

so, setting \(\beta/2\) equal to \(\overline{X}\), the method of moments estimator is

\[ \hat{\beta} \;\;=\;\; 2\overline{X}\]
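As a quick illustration (a minimal Python/NumPy sketch; the true value \(\beta = 100\), the sample size and the random seed are assumptions made purely for illustration), the estimate is simply twice the sample mean:

```python
import numpy as np

# Hedged sketch: one simulated sample from Rect(0, beta); the true beta,
# the sample size and the seed are arbitrary assumptions for illustration.
rng = np.random.default_rng(seed=1)
beta, n = 100.0, 30

sample = rng.uniform(0.0, beta, size=n)
beta_hat = 2 * sample.mean()       # method of moments estimate

print(f"true beta = {beta}, estimate = {beta_hat:.2f}")
```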

This estimator is unbiased, since \(E[2\overline{X}] = 2E[X] = \beta\), and its standard error is

\[ \se(\hat{\beta}) \;\;=\;\; \sqrt{\Var(2\overline{X})} \;\;=\;\; \sqrt{ \frac {4\Var(X)} n } \;\;=\;\; \frac {\beta} {\sqrt{3n}} \]
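Both properties can be checked by simulation. The sketch below repeatedly draws samples from the rectangular distribution (the values of \(\beta\), \(n\) and the number of replications are arbitrary assumptions) and compares the mean and standard deviation of the resulting estimates with \(\beta\) and \(\beta/\sqrt{3n}\):

```python
import numpy as np

# Hedged simulation check: the true beta, sample size n and number of
# replications are arbitrary assumptions.  Repeatedly draw samples from
# Rect(0, beta), compute 2 * x-bar for each, and compare the mean and
# standard deviation of the estimates with beta and beta / sqrt(3 n).
rng = np.random.default_rng(seed=2)
beta, n, n_rep = 100.0, 30, 100_000

samples = rng.uniform(0.0, beta, size=(n_rep, n))
estimates = 2 * samples.mean(axis=1)

print("mean of estimates:", estimates.mean())        # close to beta (unbiased)
print("sd of estimates:  ", estimates.std(ddof=1))   # close to the theoretical s.e.
print("beta / sqrt(3n):  ", beta / np.sqrt(3 * n))
```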

Despite being unbiased, this estimator has one major problem. From the random sample {12, 17, 42, 97}, the resulting estimate of \(\beta\) would be

\[ \hat{\beta} \;\;=\;\; 2\overline{X} \;\;=\;\; 84\]

yet the upper limit of the distribution, \(\beta\), cannot be 84 since one of the observed values, 97, is greater than this.
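The same calculation in code (using only the four values quoted above) makes the inconsistency explicit:

```python
import numpy as np

# The sample quoted above: the method of moments estimate falls below
# the largest observed value, so it cannot be the upper limit beta.
sample = np.array([12, 17, 42, 97])

beta_hat = 2 * sample.mean()
print(beta_hat)        # 84.0
print(sample.max())    # 97, which already exceeds the estimate
```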

The method of moments usually gives reasonable parameter estimates, but can sometimes result in estimates that are not feasible.