Lecture 15. E) Multiple Parameters

Example: Multiple Parameters

Let [math]X_{i}\overset{iid}{\sim}N\left(\mu,\sigma^{2}\right)[/math] where [math]\mu[/math] and [math]\sigma^{2}[/math] are unknown.

Suppose we have [math]n=100[/math] observations such that [math]\overline{x}=1[/math] and [math]\frac{\Sigma x_{i}^{2}}{n}=6[/math], and we face the testing problem [math]H_{0}:\,\sigma^{2}=4[/math] vs. [math]H_{1}:\,\sigma^{2}\neq4[/math] at the 0.1 level.

Log-Likelihood

[math]l\left(\theta\right)=\sum_{i=1}^{n}\left[-\frac{1}{2}\log\left(\sigma^{2}\right)-\frac{x_{i}^{2}-2\mu x_{i}+\mu^{2}}{2\sigma^{2}}\right],[/math]

where the additive constant [math]-\frac{n}{2}\log\left(2\pi\right)[/math] is dropped, since it affects neither the first-order conditions nor the likelihood-ratio statistic.

First-Order Conditions:

[math]\begin{aligned} \frac{\partial l\left(\theta\right)}{\partial\mu}= & \sum_{i=1}^{n}\frac{x_{i}-\mu}{\sigma^{2}}\\ \frac{\partial l\left(\theta\right)}{\partial\sigma^{2}}= & \sum_{i=1}^{n}-\frac{1}{2\sigma^{2}}+\frac{x_{i}^{2}-2\mu x_{i}+\mu^{2}}{2\sigma^{4}}\end{aligned}[/math]
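
As a quick symbolic check of these derivatives, here is a minimal sympy sketch (the symbol names are ours, and the additive constant is again dropped):

<syntaxhighlight lang="python">
import sympy as sp

x, mu = sp.symbols('x mu')
s2 = sp.symbols('sigma2', positive=True)

# Per-observation log-likelihood (additive constant dropped).
l1 = -sp.Rational(1, 2) * sp.log(s2) - (x**2 - 2*mu*x + mu**2) / (2 * s2)

# d l1 / d mu        equals (x - mu)/sigma^2
print(sp.simplify(sp.diff(l1, mu)))
# d l1 / d sigma^2   equals -1/(2 sigma^2) + (x - mu)^2/(2 sigma^4)
print(sp.simplify(sp.diff(l1, s2)))
</syntaxhighlight>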

Solving the first-order conditions, we get

[math]\begin{aligned} \widehat{\mu}_{ML}= & \overline{x}=1.\\ \widehat{\sigma^{2}}_{ML}= & \frac{\sum x_{i}^{2}}{n}-\left(\frac{\sum x_{i}}{n}\right)^{2}=5.\end{aligned}[/math]
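
A minimal Python sketch reproduces these estimates from the two sample moments (variable names are ours; n = 100 as in the example):

<syntaxhighlight lang="python">
n = 100          # sample size used throughout the example
xbar = 1.0       # sample mean
m2 = 6.0         # (1/n) * sum of x_i^2

mu_hat = xbar               # MLE of mu
sigma2_hat = m2 - xbar**2   # MLE of sigma^2: second moment minus squared mean
print(mu_hat, sigma2_hat)   # 1.0 5.0
</syntaxhighlight>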

Information Matrix

The information matrix for a single observation equals

[math]I_{1}=-E\left[\begin{array}{cc} \frac{\partial^{2}}{\partial\mu^{2}}l_{1}\left(\theta\right) & \frac{\partial^{2}}{\partial\mu\partial\sigma^{2}}l_{1}\left(\theta\right)\\ \frac{\partial^{2}}{\partial\sigma^{2}\partial\mu}l_{1}\left(\theta\right) & \frac{\partial^{2}}{\partial\left(\sigma^{2}\right)^{2}}l_{1}\left(\theta\right) \end{array}\right]=-E\left[\begin{array}{cc} -\frac{1}{\sigma^{2}} & -\frac{x_{i}-\mu}{\sigma^{4}}\\ -\frac{x_{i}-\mu}{\sigma^{4}} & \frac{1}{2\sigma^{4}}-\frac{\left(x_{i}-\mu\right)^{2}}{\sigma^{6}} \end{array}\right].[/math]

Taking expectations element by element (using [math]E\left[x_{i}-\mu\right]=0[/math] and [math]E\left[\left(x_{i}-\mu\right)^{2}\right]=\sigma^{2}[/math]) yields

[math]I_{1}=\left[\begin{array}{cc} \frac{1}{\sigma^{2}} & 0\\ 0 & \frac{1}{2\sigma^{4}} \end{array}\right][/math]

We now evaluate the information matrix at the null value of [math]\sigma^{2}[/math] as well as at the maximum likelihood estimates:

  • [math]I_{1}\left(\widehat{\mu}_{ML},\sigma_{0}^{2}\right)=I_{1}\left(1,4\right)=\left[\begin{array}{cc} \frac{1}{4} & 0\\ 0 & \frac{1}{32} \end{array}\right][/math]
  • [math]I_{1}\left(\widehat{\mu}_{ML},\widehat{\sigma^{2}}_{ML}\right)=I_{1}\left(1,5\right)=\left[\begin{array}{cc} \frac{1}{5} & 0\\ 0 & \frac{1}{50} \end{array}\right][/math]
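
A short numerical sketch (the helper name fisher_info_one_obs is ours) evaluates [math]I_{1}[/math] at both parameter values:

<syntaxhighlight lang="python">
import numpy as np

def fisher_info_one_obs(sigma2):
    """Per-observation information matrix I_1 for (mu, sigma^2) in the normal model."""
    return np.array([[1.0 / sigma2, 0.0],
                     [0.0, 1.0 / (2.0 * sigma2**2)]])

print(fisher_info_one_obs(4.0))  # [[0.25, 0], [0, 0.03125]] = [[1/4, 0], [0, 1/32]]
print(fisher_info_one_obs(5.0))  # [[0.2,  0], [0, 0.02   ]] = [[1/5, 0], [0, 1/50]]
</syntaxhighlight>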

Confidence Interval

Note that, asymptotically, [math]Var\left(\begin{array}{c} \sqrt{n}\left(\widehat{\mu}-\mu\right)\\ \sqrt{n}\left(\widehat{\sigma^{2}}-\sigma^{2}\right) \end{array}\right)=I_{1}^{-1}=\left[\begin{array}{cc} \sigma^{2} & 0\\ 0 & 2\sigma^{4} \end{array}\right].[/math]

(The operation [math]I^{-1}[/math] is the matrix inverse, not the element-wise inverse of the entries of [math]I[/math]; the two coincide here only because [math]I_{1}[/math] is diagonal.)

Hence, [math]\widehat{Var}\left(\sqrt{n}\left(\widehat{\sigma^{2}}_{ML}-\sigma^{2}\right)\right)=\left.2\sigma^{4}\right|_{\sigma^{2}=5}=50[/math], and the resulting 95% confidence interval (using the normal critical value 1.96) is

[math]CI:\,\left(\widehat{\sigma^{2}}_{ML}-1.96\sqrt{\frac{50}{n}},\,\widehat{\sigma^{2}}_{ML}+1.96\sqrt{\frac{50}{n}}\right)=\left(3.61,\,6.39\right).[/math]
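
The interval can be checked numerically with a few lines of Python (a sketch, using n = 100 and the estimated asymptotic variance 50 from above):

<syntaxhighlight lang="python">
import numpy as np

n = 100                         # sample size
sigma2_hat = 5.0                # ML estimate of sigma^2
avar_hat = 2.0 * sigma2_hat**2  # estimated asymptotic variance of sqrt(n)(sigma2_hat - sigma^2) = 50

half_width = 1.96 * np.sqrt(avar_hat / n)   # 1.96 = normal critical value for a 95% interval
print(sigma2_hat - half_width, sigma2_hat + half_width)   # approx. 3.61 6.39
</syntaxhighlight>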

Wald Test

For the Wald test,

[math]T_{W}=\frac{\left(\widehat{\sigma^{2}}_{ML}-\sigma_{0}^{2}\right)^{2}}{\frac{1}{n}I_{22}^{-1}\left(\widehat{\sigma^{2}}_{ML}\right)}=\frac{1^{2}}{\left(\frac{50}{100}\right)}=2.[/math]

The null hypothesis is not rejected, since [math]T_{W}=2[/math] falls below the 0.1-level critical value of the [math]\chi_{1}^{2}[/math] distribution, 2.71.
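
A sketch of the same calculation, using scipy only to look up the [math]\chi_{1}^{2}[/math] critical value:

<syntaxhighlight lang="python">
from scipy.stats import chi2

n = 100                         # sample size
sigma2_hat, sigma2_0 = 5.0, 4.0
avar_hat = 2.0 * sigma2_hat**2  # (2,2) element of I_1^{-1} evaluated at the MLE = 50

T_W = (sigma2_hat - sigma2_0)**2 / (avar_hat / n)
crit = chi2.ppf(0.90, df=1)     # 0.1-level critical value of chi^2(1), about 2.71
print(T_W, crit, T_W > crit)    # 2.0  2.705...  False -> do not reject H0
</syntaxhighlight>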

LM Test

For the LM test, we evaluate the score at the restricted estimate [math]\theta_{0}=\left(\widehat{\mu}_{ML},\sigma_{0}^{2}\right)=\left(1,4\right)[/math]:

[math]\begin{aligned} T_{LM} & =\frac{1}{n}\left[l^{'}\left(\theta_{0}\right)\right]^{'}I_{1}^{-1}\left(\theta_{0}\right)\left[l^{'}\left(\theta_{0}\right)\right]\\ & =\frac{1}{n}\left[\begin{array}{c} \sum_{i=1}^{n}\frac{x_{i}-\mu_{0}}{\sigma_{0}^{2}}\\ \sum_{i=1}^{n}-\frac{1}{2\sigma_{0}^{2}}+\frac{\left(x_{i}-\mu_{0}\right)^{2}}{2\sigma_{0}^{4}} \end{array}\right]^{'}\left[\begin{array}{cc} \sigma_{0}^{2} & 0\\ 0 & 2\sigma_{0}^{4} \end{array}\right]\left[\begin{array}{c} \sum_{i=1}^{n}\frac{x_{i}-\mu_{0}}{\sigma_{0}^{2}}\\ \sum_{i=1}^{n}-\frac{1}{2\sigma_{0}^{2}}+\frac{\left(x_{i}-\mu_{0}\right)^{2}}{2\sigma_{0}^{4}} \end{array}\right]\\ & =\frac{1}{100}\left[\begin{array}{c} 0\\ \frac{100}{32} \end{array}\right]^{'}\left[\begin{array}{cc} 4 & 0\\ 0 & 32 \end{array}\right]\left[\begin{array}{c} 0\\ \frac{100}{32} \end{array}\right]=\frac{1}{100}\cdot\left(\frac{100}{32}\right)^{2}\cdot32=3.125,\end{aligned}[/math]

so we reject [math]H_{0}[/math], since 3.125 exceeds the 0.1-level critical value 2.71.
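
The LM statistic can be reproduced from the sample moments alone; the sketch below writes the score at [math]\theta_{0}=\left(1,4\right)[/math] in terms of [math]\overline{x}[/math] and [math]\Sigma x_{i}^{2}/n[/math] (variable names are ours):

<syntaxhighlight lang="python">
import numpy as np

n, xbar, m2 = 100, 1.0, 6.0      # sample size and sample moments from the example
mu0, sigma2_0 = xbar, 4.0        # restricted estimate: mu = xbar, sigma^2 fixed at 4 under H0

# Score vector at the restricted estimate, written in terms of the sample moments.
score_mu = n * (xbar - mu0) / sigma2_0                                                   # = 0
score_s2 = -n / (2 * sigma2_0) + n * (m2 - 2 * mu0 * xbar + mu0**2) / (2 * sigma2_0**2)  # = n/32
score = np.array([score_mu, score_s2])

I1_inv = np.array([[sigma2_0, 0.0],
                   [0.0, 2.0 * sigma2_0**2]])    # I_1^{-1} evaluated at theta_0

T_LM = score @ I1_inv @ score / n
print(T_LM)                                      # 3.125 > 2.71 -> reject H0
</syntaxhighlight>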

LR Test

For the LR test,

[math]\begin{aligned} l\left(\mu=1,\sigma^{2}=4\right) & =-131.81\\ l\left(\mu=1,\widehat{\sigma^{2}}=5\right) & =-130.47\end{aligned}[/math]

so that

[math]T_{LR}=2\left(l\left(\widehat{\theta}_{ML}\right)-l\left(\theta_{0}\right)\right)=2\left(-130.47-\left(-131.81\right)\right)=2.69,[/math]

and we do not reject the null hypothesis, since 2.69 falls just below the critical value 2.71.
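
A short sketch reproduces the two log-likelihood values and the LR statistic from the sample moments (constant term dropped, as above):

<syntaxhighlight lang="python">
import numpy as np

n, xbar, m2 = 100, 1.0, 6.0    # sample size and sample moments from the example

def loglik(mu, sigma2):
    """Log-likelihood (additive constant dropped), written via the sample moments."""
    return n * (-0.5 * np.log(sigma2) - (m2 - 2 * mu * xbar + mu**2) / (2 * sigma2))

l_restricted = loglik(1.0, 4.0)     # about -131.81
l_unrestricted = loglik(1.0, 5.0)   # about -130.47
T_LR = 2 * (l_unrestricted - l_restricted)
print(T_LR)                         # about 2.69 < 2.71 -> do not reject H0
</syntaxhighlight>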

Clearly, in this example, [math]n[/math] is not large enough for the three asymptotically equivalent tests to agree: the Wald and LR tests do not reject [math]H_{0}[/math], while the LM test does.