Lecture 9. C) Minimum Variance Estimators

From Significant Statistics


We usually focus our attention on unbiased estimators: those that, on average, produce the correct result.

The collection of unbiased estimators is

[math]W_{u}=\left\{ w:\,Bias_{\theta}\left(w\right)=0,\,Var_{\theta}\left(w\right)\lt \infty,\,\forall\theta\in\Theta\right\}[/math].

So, if [math]\widehat{\theta}\in W_{u}[/math], then [math]MSE_{\theta}\left(\widehat{\theta}\right)=Var_{\theta}\left(\widehat{\theta}\right).[/math]
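The identity above can be checked numerically. The following sketch (a hypothetical simulation, using the sample mean of a Normal([math]\theta[/math], 1) sample as the unbiased estimator) shows that the empirical bias is near zero and that the empirical MSE decomposes exactly as [math]Var + Bias^2[/math], so MSE and variance coincide up to the (tiny) squared bias:

```python
# Simulation sketch: for an unbiased estimator, MSE = Var.
# Estimator: sample mean of n draws from Normal(theta, 1); values of
# theta, n, and reps are arbitrary choices for illustration.
import random

random.seed(0)
theta, n, reps = 2.0, 20, 20000

estimates = []
for _ in range(reps):
    sample = [random.gauss(theta, 1.0) for _ in range(n)]
    estimates.append(sum(sample) / n)  # sample mean, unbiased for theta

mean_est = sum(estimates) / reps
bias = mean_est - theta                                      # ~ 0
var = sum((e - mean_est) ** 2 for e in estimates) / reps     # empirical Var
mse = sum((e - theta) ** 2 for e in estimates) / reps        # empirical MSE

# MSE = Var + Bias^2 holds exactly for these empirical quantities,
# so with bias ~ 0 we get MSE ~ Var.
print(bias, var, mse)
```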

We can now define a type of minimum variance estimator:

An estimator [math]\widehat{\theta}\in W_{u}[/math] of [math]\theta[/math] is a uniform minimum-variance unbiased (UMVU) estimator of [math]\theta[/math] if it is efficient relative to [math]W_{u}[/math].

The minimum-variance unbiased part of UMVU should be clear: among the unbiased estimators, [math]\widehat{\theta}[/math] is “MVU” if it achieves the lowest variance. The “uniform” part means that [math]\widehat{\theta}[/math] attains this minimum variance for every value that [math]\theta[/math] may take. It is MVU if [math]\theta=4[/math], if [math]\theta=-3[/math], and so on.
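A small simulation can illustrate this comparison (a hypothetical sketch, not part of the lecture). For a Normal([math]\theta[/math], 1) sample, both the sample mean and the sample median are unbiased for [math]\theta[/math] by symmetry, but the sample mean has the smaller variance, and this ordering holds at every value of [math]\theta[/math] we try, which is what the “uniform” qualifier demands:

```python
# Sketch: two unbiased estimators of theta in a Normal(theta, 1) model.
# The sample mean beats the sample median in variance at every theta
# tested, illustrating the "uniform" part of UMVU. The sample sizes and
# theta values below are arbitrary illustration choices.
import random

random.seed(1)
n, reps = 25, 20000  # n odd, so the median is the middle order statistic

def sample_mean(xs):
    return sum(xs) / len(xs)

def sample_median(xs):
    return sorted(xs)[len(xs) // 2]  # middle value (n is odd)

def empirical_var(estimator, theta):
    ests = []
    for _ in range(reps):
        sample = [random.gauss(theta, 1.0) for _ in range(n)]
        ests.append(estimator(sample))
    m = sum(ests) / reps
    return sum((e - m) ** 2 for e in ests) / reps

# "Uniform": the variance ordering holds at each theta we check.
for theta in (-3.0, 0.0, 4.0):
    assert empirical_var(sample_mean, theta) < empirical_var(sample_median, theta)
```

(The sample mean is in fact the UMVU estimator of the mean in this normal model, though establishing that requires the tools introduced below.)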

It is often possible to identify UMVU estimators. The main tool for doing so is the Rao-Blackwell theorem. Before stating it, we need to introduce an additional concept.