Lecture 12. E) Example: LRT

From Significant Statistics

Example: LRT, Normal

Suppose [math]X_{i}\overset{iid}{\sim}N\left(\mu,1\right)[/math], and the test problem is

[math]H_{0}:\mu=0\,vs.\,H_{1}:\mu\neq0[/math]

The LRT can be written as

[math]\begin{aligned} & 2\left[\max_{\theta}\,\left\{ l\left(\left.\theta\right|X_{1},\ldots,X_{n}\right)\right\} -l\left(\left.\theta_{0}\right|X_{1},\ldots,X_{n}\right)\right]\\ = & 2\left(\max_{\theta}\,\left\{ \sum_{i=1}^{n}\log\left[\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{\left(x_{i}-\theta\right)^{2}}{2}\right)\right]\right\} -\sum_{i=1}^{n}\log\left[\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{\left(x_{i}-\theta_{0}\right)^{2}}{2}\right)\right]\right)\\ = & 2\left(\max_{\theta}\,\left\{ \sum_{i=1}^{n}-\frac{\left(x_{i}-\theta\right)^{2}}{2}\right\} +\sum_{i=1}^{n}\frac{\left(x_{i}-\theta_{0}\right)^{2}}{2}\right)\end{aligned}[/math].

To continue, first notice that

[math]\begin{aligned} & \frac{\partial}{\partial\theta}\sum_{i=1}^{n}-\frac{\left(x_{i}-\theta\right)^{2}}{2}=0\\ \Leftrightarrow & \sum_{i=1}^{n}\left(x_{i}-\theta\right)=0\\ \Leftrightarrow & \sum_{i=1}^{n}x_{i}=n\theta\\ \Rightarrow & \widehat{\theta}_{ML}=\frac{1}{n}\sum_{i=1}^{n}x_{i}=\overline{x}\end{aligned}[/math]

so that

[math]\begin{aligned} & 2\left(\max_{\theta}\,\left\{ \sum_{i=1}^{n}-\frac{\left(x_{i}-\theta\right)^{2}}{2}\right\} +\sum_{i=1}^{n}\frac{\left(x_{i}-\theta_{0}\right)^{2}}{2}\right)\\ = & 2\sum_{i=1}^{n}\left(-\frac{\left(x_{i}-\overline{x}\right)^{2}}{2}+\frac{\left(x_{i}-\theta_{0}\right)^{2}}{2}\right)\\ = & \sum_{i=1}^{n}\left(-x_{i}^{2}+2x_{i}\overline{x}-\overline{x}^{2}+x_{i}^{2}-2x_{i}\theta_{0}+\theta_{0}^{2}\right)\\ = & n\left(\theta_{0}^{2}-2\overline{x}\theta_{0}+\overline{x}^{2}\right)+\sum_{i=1}^{n}\left(2x_{i}\overline{x}-2\overline{x}^{2}\right)\\ = & n\left(\theta_{0}^{2}-2\overline{x}\theta_{0}+\overline{x}^{2}\right)+2n\overline{x}^{2}-2n\overline{x}^{2}\\ = & n\left(\overline{x}-\theta_{0}\right)^{2}\end{aligned}[/math]

So, the LRT becomes “reject [math]H_{0}[/math] iff [math]n\left(\overline{x}-\theta_{0}\right)^{2}\gt c[/math]” (here [math]\theta_{0}=0[/math]).
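As a numerical sanity check, we can verify the algebra above by computing the LRT statistic directly from the log-likelihood and comparing it with the closed form [math]n\left(\overline{x}-\theta_{0}\right)^{2}[/math]. This is a minimal sketch; the helper name `loglik` and the simulated data are our own choices, not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta0 = 50, 0.0
x = rng.normal(loc=0.5, scale=1.0, size=n)  # simulated data with true mu = 0.5

def loglik(theta, x):
    # Log-likelihood of N(theta, 1), including the constant term
    return np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (x - theta) ** 2)

xbar = x.mean()  # the MLE of theta, as derived above
lrt = 2 * (loglik(xbar, x) - loglik(theta0, x))
closed_form = n * (xbar - theta0) ** 2

print(lrt, closed_form)  # the two agree up to floating-point error
```

Any dataset and any value of [math]\theta_{0}[/math] gives the same agreement, since the identity is algebraic rather than distributional.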

One can verify that the LM and Wald approaches yield the same test (we will show this later).

[math]\chi^{2}[/math] Distribution

Since we know that [math]\overline{X}[/math] follows a normal distribution, and we also know that a sum of squares of independent standard normal random variables follows a chi-square distribution, let us derive the distribution of our test statistic.

First, notice that under [math]H_{0}[/math], [math]\overline{X}\sim N\left(\theta_{0},\frac{1}{n}\right)[/math] such that [math]\overline{X}-\theta_{0}\sim N\left(0,\frac{1}{n}\right)[/math] and [math]\sqrt{n}\left(\overline{X}-\theta_{0}\right)\sim N\left(0,1\right)[/math].

Define [math]Z=\sqrt{n}\left(\overline{X}-\theta_{0}\right)[/math], and notice that [math]Z^{2}[/math] is our original test statistic, such that [math]n\left(\overline{x}-\theta_{0}\right)^{2}\sim\chi_{\left(1\right)}^{2}[/math] where the [math]\left(1\right)[/math] subscript is the number of squared normals summed in our statistic.
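We can check this distributional claim by simulation: draw many samples under [math]H_{0}[/math], compute [math]n\left(\overline{X}-\theta_{0}\right)^{2}[/math] each time, and compare an empirical quantile with the known [math]\chi_{\left(1\right)}^{2}[/math] quantile (the 95th percentile is about 3.84). A rough sketch, with sample size and replication count chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta0, reps = 30, 0.0, 100_000

# Draw `reps` samples of size n under H0 and compute the test statistic each time
xbar = rng.normal(loc=theta0, scale=1.0, size=(reps, n)).mean(axis=1)
stat = n * (xbar - theta0) ** 2

# The empirical 95th percentile should sit near the chi-square(1) quantile 3.84
print(np.quantile(stat, 0.95))
```

This also illustrates how a critical value [math]c[/math] could be obtained by simulation when no closed-form distribution is available.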

It is often challenging to calculate the distribution of test statistics. In these cases, computers come in handy (we will cover this later).

For completeness, the density of the chi-square distribution with [math]k[/math] degrees of freedom is given by

[math]f_{Z^{2}}\left(\left.x\right|k\right)=\frac{1}{2^{\frac{k}{2}}\Gamma\left(\frac{k}{2}\right)}x^{\frac{k}{2}-1}\exp\left(-\frac{x}{2}\right)1\left(x\gt 0\right)[/math].
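A quick way to verify this formula for [math]k=1[/math] is a change of variables: if [math]Z\sim N\left(0,1\right)[/math], the density of [math]Z^{2}[/math] is [math]\exp\left(-x/2\right)/\sqrt{2\pi x}[/math], which the chi-square density above must reproduce. A small check (the helper `chi2_pdf` is our own hand-coded version of the formula):

```python
import numpy as np
from math import gamma, pi

def chi2_pdf(x, k):
    # Chi-square density with k degrees of freedom, valid for x > 0
    return x ** (k / 2 - 1) * np.exp(-x / 2) / (2 ** (k / 2) * gamma(k / 2))

# For k = 1 this must match the density of Z^2, Z ~ N(0,1), obtained by a
# change of variables: f(x) = exp(-x/2) / sqrt(2*pi*x)
xs = np.linspace(0.1, 10, 100)
by_transform = np.exp(-xs / 2) / np.sqrt(2 * pi * xs)
print(np.max(np.abs(chi2_pdf(xs, 1) - by_transform)))  # ≈ 0
```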