# Minimum Variance Estimators

We usually focus our attention on unbiased estimators: those that, on average, produce the correct result.

The collection of unbiased estimators is

$W_{u}=\left\{ w:\,Bias_{\theta}\left(w\right)=0,\,Var_{\theta}\left(w\right)<\infty,\,\forall\theta\in\Theta\right\}$.

So, if $\widehat{\theta}\in W_{u}$, then $MSE_{\theta}\left(\widehat{\theta}\right)=Var_{\theta}\left(\widehat{\theta}\right).$
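This identity follows from the decomposition $MSE_{\theta}=Var_{\theta}+Bias_{\theta}^{2}$: when the bias is zero, only the variance term remains. A small simulation can make this concrete. The sketch below (an illustrative choice of a normal sample with true mean $\theta=2$, using the sample mean as the unbiased estimator) checks numerically that the empirical MSE and the empirical variance nearly coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0          # true parameter (illustrative choice)
n, reps = 50, 20_000  # sample size and number of simulated samples

# The sample mean is an unbiased estimator of a normal mean.
samples = rng.normal(loc=theta, scale=1.0, size=(reps, n))
estimates = samples.mean(axis=1)

bias = estimates.mean() - theta
variance = estimates.var()
mse = ((estimates - theta) ** 2).mean()

# MSE = Var + Bias^2, so for an (approximately) unbiased estimator
# the empirical MSE and empirical variance differ only by bias^2.
print(f"bias ≈ {bias:.4f}, var ≈ {variance:.4f}, mse ≈ {mse:.4f}")
```

The gap between `mse` and `variance` equals `bias ** 2` exactly in this decomposition, and the bias itself is near zero because the estimator is unbiased.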

We can now define a type of minimum variance estimator:

An estimator $\widehat{\theta}\in W_{u}$ of $\theta$ is a uniform minimum-variance unbiased (UMVU) estimator of $\theta$ if it is efficient relative to $W_{u}$.

The minimum-variance unbiased part of UMVU should be clear: among the unbiased estimators, $\widehat{\theta}$ is “MVU” if it achieves the lowest variance. The “uniform” part means that $\widehat{\theta}$ attains this minimum variance for every value $\theta$ may take: it is MVU if $\theta=4$, if $\theta=-3$, and so on.
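A standard concrete example: for a normal sample with known variance, the sample mean is the UMVU estimator of the mean, while the sample median is also unbiased (by symmetry) but has larger variance, asymptotically $\pi/2$ times as large. The simulation below (a sketch with illustrative values $\theta=0$, $\sigma=1$) compares the two.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.0           # true mean (illustrative choice)
n, reps = 51, 20_000  # odd n so the median is a single order statistic

samples = rng.normal(loc=theta, scale=1.0, size=(reps, n))
mean_est = samples.mean(axis=1)      # the UMVU estimator of a normal mean
median_est = np.median(samples, axis=1)  # unbiased by symmetry, but not MVU

# Both estimators center on theta, yet the mean has smaller variance;
# asymptotically Var(median)/Var(mean) ≈ pi/2.
print(f"var(mean)   ≈ {mean_est.var():.4f}")
print(f"var(median) ≈ {median_est.var():.4f}")
```

The simulated variances should come out near $1/n \approx 0.0196$ for the mean and near $\pi/(2n) \approx 0.031$ for the median, illustrating that unbiasedness alone does not pin down an estimator: we still prefer the one with minimum variance.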

It is often possible to identify UMVU estimators. The key tool for doing so is the Rao-Blackwell theorem. Before stating it, we need to introduce an additional concept.