The maximum likelihood estimate of a parameter \(\theta\) is usually a value that satisfies the equation
\[ \ell'(\theta) \;\; = \;\; 0 \]
where \(\ell(\theta)\) is the log-likelihood function. Sometimes this equation cannot be solved algebraically, so an iterative numerical method is required to obtain the maximum likelihood estimate.
One way to solve an equation numerically is the Newton-Raphson algorithm. Consider an equation
\[ g(\theta) \;\; = \;\; 0 \]

Newton-Raphson algorithm
Starting at an initial guess of the solution, \(\theta_0\), successive values are computed as
\[ \theta_{i+1} \;\; = \;\; \theta_i - \frac {g(\theta_i)} { g'(\theta_i)} \qquad \text{for } i = 0, 1, \dots \]
This iteration is the Newton-Raphson algorithm. If it converges, it converges to a solution of the equation \(g(\theta) = 0\).
This is justified by a first-order Taylor series expansion of \(g(\theta)\) around the current iterate \(\theta_i\): approximating \(g(\theta) \approx g(\theta_i) + g'(\theta_i)(\theta - \theta_i)\) and setting the approximation to zero gives the update above.
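The iteration is short enough to sketch directly. Below is a minimal Python sketch, assuming the derivative \(g'\) is supplied as a function; the name newton_raphson and the test equation \(\theta^3 - 2\theta - 5 = 0\) are illustrative choices, not from the text.

```python
def newton_raphson(g, g_prime, theta0, tol=1e-10, max_iter=100):
    """Iterate theta <- theta - g(theta)/g'(theta) until the step is tiny."""
    theta = theta0
    for _ in range(max_iter):
        step = g(theta) / g_prime(theta)
        theta -= step
        if abs(step) < tol:  # successive values agree: treat as converged
            return theta
    raise RuntimeError("Newton-Raphson did not converge from this theta0")

# Illustrative use: solve theta^3 - 2*theta - 5 = 0
root = newton_raphson(lambda t: t**3 - 2*t - 5,
                      lambda t: 3*t**2 - 2,
                      theta0=2.0)
print(root)  # roughly 2.0945515
```

Stopping when the step \(|\theta_{i+1} - \theta_i|\) is small, rather than when \(g(\theta_i)\) is small, is one common convention; for well-behaved problems either criterion works.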
Applying the algorithm to maximum likelihood
To apply it to maximum likelihood, we use the function \(g(\theta) = \ell'(\theta)\). The Newton-Raphson algorithm can therefore be expressed as
\[ \theta_{i+1} \;\; = \;\; \theta_i - \frac {\ell'(\theta_i)} { \ell''(\theta_i)} \]
This usually converges to the maximum likelihood estimate, provided the initial guess, \(\theta_0\), is not too far from the correct value. The algorithm may need to be run from several different starting values until one is found for which it converges.
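As a worked illustration of this form of the update, consider estimating the location parameter of a Cauchy distribution, a standard case where \(\ell'(\theta) = 0\) has no algebraic solution. The helper name cauchy_mle and the data below are hypothetical; the derivative formulas in the comments follow directly from the Cauchy log-likelihood.

```python
import numpy as np

def cauchy_mle(x, theta0, tol=1e-10, max_iter=100):
    """Newton-Raphson for the location parameter of a Cauchy sample.

    For data x_1, ..., x_n the derivatives of the log-likelihood are
      l'(theta)  = sum 2(x_i - theta) / (1 + (x_i - theta)^2)
      l''(theta) = sum 2((x_i - theta)^2 - 1) / (1 + (x_i - theta)^2)^2
    """
    theta = theta0
    for _ in range(max_iter):
        u = x - theta
        score = np.sum(2 * u / (1 + u**2))                  # l'(theta)
        curvature = np.sum(2 * (u**2 - 1) / (1 + u**2)**2)  # l''(theta)
        step = score / curvature
        theta -= step
        if abs(step) < tol:
            return theta
    raise RuntimeError("did not converge; try another starting value")

x = np.array([0.2, -1.4, 3.1, 0.7, 0.9])        # illustrative data
theta_hat = cauchy_mle(x, theta0=np.median(x))  # sample median as theta_0
```

Starting from the sample median is a common heuristic here, since the Cauchy log-likelihood can have several stationary points and a poor choice of \(\theta_0\) may send the iteration to a local maximum or cause it to diverge.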