A good estimator should have a small bias and a small standard error. These two criteria can be combined into a single value called the estimator's mean squared error.

Definition

The mean squared error of an estimator \(\hat{\theta}\) of a parameter \(\theta\) is

\[ \MSE(\hat{\theta}) = E\left[ (\hat{\theta} - \theta)^2 \right] \]
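For example, suppose \(\hat{\theta} = \bar{X}\) is the sample mean of \(n\) i.i.d. observations with mean \(\theta\) and variance \(\sigma^2\) (an illustrative setting, not one fixed by this section). Since \(E[\bar{X}] = \theta\), the mean squared error is just the variance of the sample mean:

\[ \MSE(\bar{X}) = E\left[ (\bar{X} - \theta)^2 \right] = \Var(\bar{X}) = \frac{\sigma^2}{n} \]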

Its relationship to our earlier definitions of bias and standard error is given by the following result.

Mean squared error

The mean squared error of an estimator \(\hat{\theta}\) of \(\theta\) is

\[ \MSE(\hat{\theta}) = \Var(\hat{\theta}) + \Bias(\hat{\theta})^2 \]

(Proved in full version)
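In outline, the identity follows by adding and subtracting \(E[\hat{\theta}]\) inside the square (a sketch; the complete argument is in the full version):

\[ \MSE(\hat{\theta}) = E\left[ \left( (\hat{\theta} - E[\hat{\theta}]) + (E[\hat{\theta}] - \theta) \right)^2 \right] = \Var(\hat{\theta}) + 2\,(E[\hat{\theta}] - \theta)\, E\left[ \hat{\theta} - E[\hat{\theta}] \right] + \Bias(\hat{\theta})^2 \]

The cross term vanishes because \(E\left[ \hat{\theta} - E[\hat{\theta}] \right] = 0\), leaving \(\Var(\hat{\theta}) + \Bias(\hat{\theta})^2\).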