# Lecture 8. A) Point Estimation

# Point Estimation

Let [math]X_{1},\ldots,X_{n}[/math] be a random sample from a distribution with cdf [math]F\left(\cdot\mid\theta\right)[/math], where [math]\theta\in\Theta[/math] is unknown.

A **point estimator** is **any function** [math]\omega\left(X_{1},\ldots,X_{n}\right)[/math] of the sample.

Notice that a point estimator is a statistic, and is therefore itself a random variable: different random samples yield different realized values of the estimator.

We refer to the realized value of an estimator (i.e., the value of the statistic evaluated at the realized values of a random sample) as an **estimate**.
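The distinction between an estimator (a random variable) and an estimate (its realized value) can be illustrated with a quick simulation. This is a minimal sketch, not from the lecture: it assumes, purely for illustration, a Normal population with mean [math]\theta = 2[/math] and the sample mean as the estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup for illustration: the sample mean as a point estimator
# of theta, the mean of a Normal(theta = 2, sd = 1) population.
theta = 2.0
n = 50

# Draw two different random samples of size n from the same population.
sample1 = rng.normal(theta, 1.0, size=n)
sample2 = rng.normal(theta, 1.0, size=n)

# The estimator is the same function (the sample mean), but the two
# realized samples produce two different estimates of the same theta.
estimate1 = sample1.mean()
estimate2 = sample2.mean()

print(estimate1, estimate2)
```

The estimator here is the mapping [math]\omega\left(X_{1},\ldots,X_{n}\right)=\bar{X}[/math]; `estimate1` and `estimate2` are two estimates, one per realized sample.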

Clearly, a good estimator should be close to [math]\theta[/math] in some probabilistic sense. Note also that an estimator may not depend on the true value of [math]\theta[/math] itself, since that value is unknown.

We consider two methods of point estimation: the method of moments and maximum likelihood.
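As a small preview, here is a sketch of the method-of-moments idea in a case where it is easy to compute. The exponential model and the sample size below are assumptions chosen for illustration, not part of the lecture: for an Exponential distribution with rate [math]\lambda[/math], we have [math]E\left[X\right]=1/\lambda[/math], so equating the population mean to the sample mean and solving gives the estimator [math]\hat{\lambda}=1/\bar{X}[/math].

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example: an Exponential(rate = lambda) sample with a known
# true rate, so we can see how close the estimate lands.
true_rate = 3.0
x = rng.exponential(scale=1.0 / true_rate, size=10_000)

# Method of moments: set E[X] = 1 / lambda equal to the sample mean
# and solve for lambda.
rate_mom = 1.0 / x.mean()

print(rate_mom)  # should be close to true_rate = 3.0 for large n
```

For this particular model the maximum likelihood estimator happens to coincide with [math]1/\bar{X}[/math]; in general the two methods can give different estimators, as the following sections discuss.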