Models with two unknown parameters
If a family of distributions has a single unknown parameter, \(\theta\), the method of moments estimate of \(\theta\) makes the distribution's mean equal to the mean of a random sample; it is found by solving
\[ \mu(\theta) \;\;=\;\; \overline{x} \]
However, many families of standard distributions involve two unknown parameters, and the method of moments can be extended to such models.
Definition
If a distribution has two unknown parameters, \(\theta\) and \(\phi\), the method of moments estimates of the parameters are found by solving
\[ \mu(\theta, \phi) \;=\; \overline{x} \spaced{and} \sigma^2(\theta, \phi) \;=\; s^2 \]
where \(\mu(\theta, \phi)\) and \(\sigma^2(\theta, \phi)\) are the mean and variance of the distribution and \(\overline{x}\) and \(s^2\) are the mean and variance of a random sample from it.
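In code, the recipe amounts to computing the sample mean and variance and then solving the two moment equations for the parameters. The sketch below is a minimal illustration, not part of the original text: the names \(\texttt{method\_of\_moments}\), \(\texttt{model\_mean}\) and \(\texttt{model\_variance}\) are hypothetical placeholders for whatever formulas \(\mu(\theta, \phi)\) and \(\sigma^2(\theta, \phi)\) take in a particular model, the sample variance is taken with divisor \(n-1\), and a numerical root-finder is used in case the equations cannot be solved algebraically.

```python
import numpy as np
from scipy.optimize import fsolve

def method_of_moments(data, model_mean, model_variance, start):
    """Solve mu(theta, phi) = x-bar and sigma^2(theta, phi) = s^2 numerically.

    model_mean and model_variance are user-supplied functions giving the
    distribution's mean and variance in terms of (theta, phi); start is an
    initial guess for the root-finder.
    """
    x_bar = np.mean(data)
    s2 = np.var(data, ddof=1)          # sample variance with divisor n - 1

    def moment_equations(params):
        theta, phi = params
        return [model_mean(theta, phi) - x_bar,
                model_variance(theta, phi) - s2]

    theta_hat, phi_hat = fsolve(moment_equations, start)
    return theta_hat, phi_hat
```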
Our first example is almost trivially simple.
Normal distribution
If \(X\) has a normal distribution,
\[ X \;\;\sim\;\; \NormalDistn(\mu, \sigma^2) \]
then the method of moments estimates of its parameters are
\[ \hat{\mu} = \overline{x} \spaced{and} \hat{\sigma}^2 = s^2 \]
The next example is a little harder.
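Before that, here is a quick numerical check of the normal case; the simulated sample below is purely illustrative and not from the original text. No equations need to be solved, since the sample mean and variance are used directly.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
data = rng.normal(loc=10, scale=2, size=50)  # simulated sample, purely illustrative

mu_hat = data.mean()           # mu-hat = x-bar
sigma2_hat = data.var(ddof=1)  # sigma^2-hat = s^2 (sample variance, divisor n - 1)
```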
Question: Negative binomial distribution
If \(X\) has a generalised negative binomial distribution,
\[ X \;\;\sim\;\; \NegBinDistn(\kappa, \pi) \]
what are the method of moments estimates of \(\kappa\) and \(\pi\) from a random sample?
(Solved in full version)
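The full solution is not reproduced here, but as a sketch of how the two moment equations might be tackled numerically: assuming the parameterization in which the distribution's mean is \(\kappa(1-\pi)/\pi\) and its variance is \(\kappa(1-\pi)/\pi^2\) (the count-of-failures form; other parameterizations lead to different formulas), the equations can be passed to a root-finder. The function below, its starting values and its name are illustrative assumptions, not the textbook's solution.

```python
import numpy as np
from scipy.optimize import fsolve

def negbin_moment_estimates(data):
    """Method of moments estimates of (kappa, pi), assuming
    mean = kappa*(1 - pi)/pi and variance = kappa*(1 - pi)/pi**2."""
    x_bar = np.mean(data)
    s2 = np.var(data, ddof=1)

    def moment_equations(params):
        kappa, pi = params
        return [kappa * (1 - pi) / pi - x_bar,
                kappa * (1 - pi) / pi**2 - s2]

    # Crude starting values; under this parameterization the equations only
    # have a sensible solution when s2 > x_bar.
    kappa_hat, pi_hat = fsolve(moment_equations, [1.0, 0.5])
    return kappa_hat, pi_hat
```

Under this parameterization the two equations can also be combined algebraically (dividing one by the other isolates \(\pi\)), which is the route taken in the full version's solution.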
Three or more parameters
Unfortunately, the method of moments does not extend easily to models with three or more parameters; a third equation based on higher moments (such as the skewness) would be needed, and these equations are usually much harder to work with.