Lecture 13. B) Example: Normal
Suppose [math]X_{i}\overset{iid}{\sim}N\left(\mu,\sigma^{2}\right)[/math], where [math]\sigma^{2}[/math] is known. Let us derive the UMP level [math]\alpha[/math] test of
[math]H_{0}:\mu=\mu_{0}[/math] vs. [math]H_{1}:\mu=\mu_{1}\gt \mu_{0}[/math], considering one value of [math]\mu_{1}[/math] at a time.
The Neyman-Pearson lemma suggests a test of the form [math]\frac{f\left(\left.x\right|\mu_{1}\right)}{f\left(\left.x\right|\mu_{0}\right)}\gt k[/math]. With some algebra, the left-hand side simplifies to:
[math]\begin{aligned} \frac{\left(2\pi\sigma^{2}\right)^{-n/2}\exp\left(-\sum_{i=1}^{n}\frac{\left(x_{i}-\mu_{1}\right)^{2}}{2\sigma^{2}}\right)}{\left(2\pi\sigma^{2}\right)^{-n/2}\exp\left(-\sum_{i=1}^{n}\frac{\left(x_{i}-\mu_{0}\right)^{2}}{2\sigma^{2}}\right)} & =\exp\left(\sum_{i=1}^{n}\frac{\left(x_{i}-\mu_{0}\right)^{2}}{2\sigma^{2}}-\frac{\left(x_{i}-\mu_{1}\right)^{2}}{2\sigma^{2}}\right)\\ & =\exp\left(\frac{1}{2\sigma^{2}}\sum_{i=1}^{n}\left(x_{i}^{2}-2\mu_{0}x_{i}+\mu_{0}^{2}-\left(x_{i}^{2}-2\mu_{1}x_{i}+\mu_{1}^{2}\right)\right)\right)\\ & =\exp\left(\frac{1}{2\sigma^{2}}\sum_{i=1}^{n}\left(2x_{i}\left(\mu_{1}-\mu_{0}\right)+\mu_{0}^{2}-\mu_{1}^{2}\right)\right)\\ & =\exp\left(\frac{1}{2\sigma^{2}}\left(2n\overline{x}\left(\mu_{1}-\mu_{0}\right)+n\left(\mu_{0}-\mu_{1}\right)\left(\mu_{0}+\mu_{1}\right)\right)\right)\\ & =\exp\left(\frac{n\left(\mu_{1}-\mu_{0}\right)}{2\sigma^{2}}\left(2\overline{x}-\left(\mu_{0}+\mu_{1}\right)\right)\right)\end{aligned}[/math]
Going back to the original condition, taking logs, and solving for [math]\overline{x}[/math] yields “reject iff” (note that [math]\mu_{1}\gt \mu_{0}[/math], so dividing by [math]\frac{n\left(\mu_{1}-\mu_{0}\right)}{2\sigma^{2}}\gt 0[/math] does not flip the inequality):
[math]\begin{aligned} & \exp\left(\frac{n\left(\mu_{1}-\mu_{0}\right)}{2\sigma^{2}}\left(2\overline{x}-\left(\mu_{0}+\mu_{1}\right)\right)\right)\gt k\\ \Leftrightarrow & \frac{n\left(\mu_{1}-\mu_{0}\right)}{2\sigma^{2}}\left(2\overline{x}-\left(\mu_{0}+\mu_{1}\right)\right)\gt \log\left(k\right)\\ \Leftrightarrow & \overline{x}\gt \frac{1}{2}\left(\mu_{0}+\mu_{1}+\frac{2\sigma^{2}\log\left(k\right)}{n\left(\mu_{1}-\mu_{0}\right)}\right)\end{aligned}[/math]
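This simplification can be checked numerically. The sketch below is only an illustration (assuming Python with NumPy and SciPy; the values of [math]\mu_{0}[/math], [math]\mu_{1}[/math], [math]\sigma^{2}[/math] and [math]n[/math] are arbitrary choices): it compares the likelihood ratio computed directly from the joint normal densities with the simplified expression derived above.

```python
# Numerical check: direct likelihood ratio vs. the simplified expression.
# Parameter values are illustrative, not part of the derivation.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu0, mu1, sigma2, n = 0.0, 0.1, 1.0, 30
x = rng.normal(mu0, np.sqrt(sigma2), size=n)

# Likelihood ratio from the joint densities f(x|mu1) / f(x|mu0)
lr_direct = np.prod(norm.pdf(x, mu1, np.sqrt(sigma2))) / \
            np.prod(norm.pdf(x, mu0, np.sqrt(sigma2)))

# Simplified form: exp(n(mu1-mu0)/(2 sigma^2) * (2*xbar - (mu0+mu1)))
xbar = x.mean()
lr_simplified = np.exp(n * (mu1 - mu0) / (2 * sigma2) * (2 * xbar - (mu0 + mu1)))

print(lr_direct, lr_simplified)  # agree up to floating-point error
```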
At first glance, our test appears to depend on [math]\mu_{1}[/math], and so it does not seem to be UMP. However, we will now show that different values of [math]\mu_{1}[/math] lead to exactly the same test.
For example, let [math]\alpha=0.05[/math], [math]n=30[/math], [math]\sigma^{2}=1[/math], [math]\mu_{0}=0[/math] and [math]\mu_{1}=0.1[/math]. In this case, the probability of a type 1 error can be written as:
[math]\begin{aligned} Pr\left(\overline{X}\gt \underset{=k^{'}}{\underbrace{\frac{1}{2}\left(0+0.1+\frac{2\log\left(k\right)}{3}\right)}}\right) & =0.05\end{aligned}[/math]
Using the fact that, under the null, [math]\overline{X}\sim N\left(\mu_{0},\frac{\sigma^{2}}{n}\right)[/math], solving yields [math]k\approx2.12[/math], which corresponds to [math]k^{'}\approx0.3[/math]. So, we will reject the null hypothesis if [math]\overline{x}\gt 0.3[/math]. This produces a level [math]\alpha[/math] test.
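As a minimal sketch of this calculation (assuming Python with SciPy; the variable names are my own), the critical value [math]k^{'}[/math] and the implied Neyman-Pearson constant [math]k[/math] can be computed as follows:

```python
# Critical value k' and Neyman-Pearson constant k for mu1 = 0.1.
from math import sqrt, exp
from scipy.stats import norm

alpha, n, sigma2, mu0, mu1 = 0.05, 30, 1.0, 0.0, 0.1

# Under H0, Xbar ~ N(mu0, sigma2/n), so k' solves Pr(Xbar > k') = alpha.
k_prime = mu0 + norm.ppf(1 - alpha) * sqrt(sigma2 / n)
print(k_prime)  # ~0.30

# Back out k from k' = (mu0 + mu1 + 2*sigma2*log(k)/(n*(mu1-mu0))) / 2.
log_k = (2 * k_prime - mu0 - mu1) * n * (mu1 - mu0) / (2 * sigma2)
print(exp(log_k))  # ~2.12
```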
Now, suppose that [math]\mu_{1}=0.2[/math].
In this case, the probability of a type 1 error can be written as
[math]Pr\left(\overline{X}\gt \frac{1}{2}\left(0.2+\frac{2\log\left(k\right)}{6}\right)\right)=0.05[/math]
where [math]k[/math] now approximately equals [math]3.33[/math], while [math]k^{'}=0.3[/math] as before.
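To underline the point, the following sketch (same illustrative Python setup as above) loops over several values of [math]\mu_{1}[/math]: the threshold [math]k^{'}[/math] never moves, only [math]k[/math] does.

```python
# k' depends only on the null distribution of Xbar; only the Neyman-Pearson
# constant k changes with the alternative mu1 (illustrative values below).
from math import sqrt, exp
from scipy.stats import norm

alpha, n, sigma2, mu0 = 0.05, 30, 1.0, 0.0
k_prime = mu0 + norm.ppf(1 - alpha) * sqrt(sigma2 / n)  # ~0.30 for every mu1

for mu1 in (0.1, 0.2, 0.5, 1.0):
    log_k = (2 * k_prime - mu0 - mu1) * n * (mu1 - mu0) / (2 * sigma2)
    print(mu1, k_prime, exp(log_k))
```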
Our test turns out to be the same regardless of the value of [math]\mu_{1}[/math], i.e., the test does not depend on the particular alternative. The intuition is that the critical value is set so that the test rejects the null, when [math]\mu=\mu_{0}[/math], with probability [math]\alpha[/math], and this requirement involves only the null hypothesis, not the alternative. We can therefore write our rejection decision as [math]\overline{x}\gt k^{'},[/math] where [math]k^{'}[/math] is whatever constant delivers the intended probability of a type 1 error.
Because the same rejection region is most powerful against every [math]\mu_{1}\gt \mu_{0}[/math], the Neyman-Pearson lemma implies that the inequality above produces the (essentially unique) UMP level [math]\alpha[/math] test, with a probability of a type 1 error equal to:
[math]\begin{aligned} & P_{\mu_{0}}\left(\overline{X}\gt k^{'}\right)=\alpha\\ \Leftrightarrow & 1-\Phi\left(\frac{\sqrt{n}\left(k^{'}-\mu_{0}\right)}{\sigma}\right)=\alpha.\end{aligned}[/math]
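Solving this last equation for [math]k^{'}[/math] gives the familiar one-sided z-test threshold (with [math]z_{1-\alpha}=\Phi^{-1}\left(1-\alpha\right)[/math]), consistent with the value [math]k^{'}\approx0.3[/math] obtained numerically above:
[math]k^{'}=\mu_{0}+z_{1-\alpha}\frac{\sigma}{\sqrt{n}},\qquad\text{so we reject iff}\quad\overline{x}\gt \mu_{0}+z_{1-\alpha}\frac{\sigma}{\sqrt{n}}.[/math]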