We now give two applications of the delta method.
Question: Estimator of a geometric distribution's parameter, π
If \(X \sim \GeomDistn(\pi)\), with probability function
\[ p(x) \;\;=\;\; \pi (1-\pi)^{x-1} \quad \quad \text{for } x = 1, 2, \dots \]
then, for a random sample \(X_1, X_2, \dots, X_n\) from this distribution, the method of moments estimator of \(\pi\) and the maximum likelihood estimator are both the reciprocal of the sample mean,
\[ \hat{\pi} \;\;=\;\; \dfrac 1{\overline{X}} \]
Use the delta method to find the approximate mean and variance of this estimator.
(Solved in full version)
In this example, the delta method gives the same approximate standard error as would be found using the second derivative of the log-likelihood, but approximations from the two methods are not always equal.
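As a rough sketch of the delta-method calculation (writing \(\hat{\pi} = g(\overline{X})\) with \(g(t) = 1/t\), and using the geometric distribution's mean \(1/\pi\) and variance \((1-\pi)/\pi^2\), so that \(\operatorname{Var}(\overline{X}) = (1-\pi)/(n\pi^2)\)), the first-order approximations are
\[ E(\hat{\pi}) \;\;\approx\;\; g\!\left(\tfrac 1{\pi}\right) \;\;=\;\; \pi \qquad \text{and} \qquad \operatorname{Var}(\hat{\pi}) \;\;\approx\;\; g'\!\left(\tfrac 1{\pi}\right)^{\!2} \operatorname{Var}(\overline{X}) \;\;=\;\; \pi^4 \times \frac{1-\pi}{n\pi^2} \;\;=\;\; \frac{\pi^2(1-\pi)}{n} \]
giving an approximate standard error of \(\pi \sqrt{(1-\pi)/n}\), the same value as the information-based approximation referred to above.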
Odds
Uncertainty is often described by probability, but the chance of an event happening can alternatively be described by its odds.
Definition
The odds for an event are the ratio of the probability of the event happening to the probability of it not happening,
\[ \operatorname{odds}(E) \;\;=\;\; \frac{P(E)}{1 - P(E)} \]
Note that whereas probabilities must be between 0 and 1, the odds of an event can be greater than 1.
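As a simple numerical illustration, an event with probability 0.75 has odds
\[ \operatorname{odds}(E) \;\;=\;\; \frac{0.75}{1 - 0.75} \;\;=\;\; 3 \]
so it is three times as likely to happen as not to happen, whereas an event with probability 0.5 has odds of exactly 1.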
Question: Odds of success
In a series of \(n\) independent success/failure trials that each have odds \(\theta\) of success, \(x\) successes are observed. What is the maximum likelihood estimator of \(\theta\)? If \(n\) is large, what is its approximate standard error?
(Solved in full version)
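One possible route to a solution, sketched here rather than worked in full: if \(p\) denotes the probability of success on each trial, then \(\theta = p/(1-p)\) and \(p = \theta/(1+\theta)\), and the maximum likelihood estimator of \(p\) is \(x/n\). By the invariance property of maximum likelihood estimators and the delta method applied to \(g(p) = p/(1-p)\), with \(\operatorname{Var}(\hat{p}) = p(1-p)/n\),
\[ \hat{\theta} \;\;=\;\; \frac{x/n}{1 - x/n} \;\;=\;\; \frac{x}{n-x} \qquad \text{and} \qquad \operatorname{Var}(\hat{\theta}) \;\;\approx\;\; g'(p)^2 \times \frac{p(1-p)}{n} \;\;=\;\; \frac{p}{n(1-p)^3} \;\;=\;\; \frac{\theta(1+\theta)^2}{n} \]
so the approximate standard error for large \(n\) would be \((1+\theta)\sqrt{\theta/n}\).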