Problem 5


Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a \(N\left(\mu_{0}, \sigma^{2}=\theta\right)\) distribution, where \(0<\theta<\infty\) and \(\mu_{0}\) is known. Show that the likelihood ratio test of \(H_{0}: \theta=\theta_{0}\) versus \(H_{1}: \theta \neq \theta_{0}\) can be based upon the statistic \(W=\sum_{i=1}^{n}\left(X_{i}-\mu_{0}\right)^{2} / \theta_{0}\). Determine the null distribution of \(W\) and give, explicitly, the rejection rule for a level \(\alpha\) test.

Short Answer

Expert verified
The likelihood ratio test can be based on \(W=\sum_{i=1}^{n}(X_i-\mu_0)^2/\theta_0\), whose null distribution is \(\chi^2_n\) (chi-square with \(n\) degrees of freedom). The level \(\alpha\) test rejects \(H_0\) if \(W \leq \chi^2_{\alpha/2}(n)\) or \(W \geq \chi^2_{1-\alpha/2}(n)\), where \(\chi^2_{p}(n)\) denotes the \(p\)th quantile of the \(\chi^2_n\) distribution.

Step by step solution

01

Calculate the Maximum Likelihood Estimates

Begin with the normal distribution \(N\left(\mu_{0}, \sigma^2=\theta\right)\) and write down the likelihood function \(L(\theta;\mathbf{x})=\left(\frac{1}{\sqrt{2\pi\theta}}\right)^n \exp\left(-\frac{\sum_{i=1}^{n}\left(X_{i}-\mu_{0}\right)^{2}}{2\theta}\right)\). Under \(H_{0}\) the parameter is completely specified, so the restricted likelihood is maximized at \(\theta=\theta_0\) itself. Over the full parameter space \(0<\theta<\infty\), maximizing the log-likelihood gives the maximum likelihood estimator \(\hat{\theta}=\frac{\sum_{i=1}^{n}\left(X_{i}-\mu_{0}\right)^{2}}{n}\).
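To see where \(\hat{\theta}\) comes from, set the derivative of the log-likelihood to zero (a routine calculus step, spelled out here for completeness):

```latex
\ell(\theta) = \ln L(\theta;\mathbf{x})
 = -\frac{n}{2}\ln(2\pi\theta) - \frac{\sum_{i=1}^{n}(X_i-\mu_0)^2}{2\theta},
\qquad
\ell'(\theta) = -\frac{n}{2\theta} + \frac{\sum_{i=1}^{n}(X_i-\mu_0)^2}{2\theta^2} = 0
\;\Longrightarrow\;
\hat{\theta} = \frac{1}{n}\sum_{i=1}^{n}(X_i-\mu_0)^2 .
```

Since \(\ell'(\theta)>0\) for \(\theta<\hat{\theta}\) and \(\ell'(\theta)<0\) for \(\theta>\hat{\theta}\), this critical point is indeed the maximum.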
02

Define the Likelihood Ratio Test and \(\lambda (\mathbf{x})\)

The likelihood ratio test statistic is given by \(\lambda(\mathbf{x})=\frac{\sup_{\theta= \theta_0}L(\theta;\mathbf{x})}{\sup_{\theta}L(\theta;\mathbf{x})} = \frac{L(\theta_0;\mathbf{x})}{L(\hat{\theta};\mathbf{x})}\). Substituting the likelihood from Step 1 and using \(\sum_{i=1}^{n}(X_i-\mu_0)^2 = n\hat{\theta}\) gives \(\lambda(\mathbf{x})= \left(\frac{\hat{\theta}}{\theta_0}\right)^{n/2} \exp\left( \frac{n}{2}\left(1-\frac{\hat{\theta}}{\theta_0}\right) \right)\).
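Substituting \(\hat{\theta}=\frac{1}{n}\sum_{i=1}^{n}(X_i-\mu_0)^2\) shows that \(\lambda\) depends on the data only through \(W=\sum_{i=1}^{n}(X_i-\mu_0)^2/\theta_0\), since \(\hat{\theta}/\theta_0 = W/n\):

```latex
\lambda(\mathbf{x}) = \left(\frac{W}{n}\right)^{n/2} \exp\!\left(\frac{n-W}{2}\right)
 =: g(W),
\qquad
\frac{d}{dw}\ln g(w) = \frac{n}{2w} - \frac{1}{2} = \frac{n-w}{2w}.
```

Hence \(g\) increases on \((0,n)\) and decreases on \((n,\infty)\), with its maximum at \(w=n\). The small-\(\lambda\) rejection region \(\{\lambda(\mathbf{x}) \leq c\}\) is therefore of the form \(\{W \leq c_1\} \cup \{W \geq c_2\}\), so the test can be based on \(W\).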
03

Find the Null Distribution and the Rejection Rule

Under \(H_{0}\), the standardized variables \((X_i-\mu_0)/\sqrt{\theta_0}\) are iid \(N(0,1)\), so \(W=\sum_{i=1}^{n}\left(X_{i}-\mu_{0}\right)^{2} / \theta_{0}\) has an exact \(\chi^{2}_{n}\) distribution. Because \(\lambda(\mathbf{x})\) is small precisely when \(W\) is far from \(n\), rejecting for small \(\lambda\) is equivalent to rejecting when \(W \leq c_1\) or \(W \geq c_2\). Choosing the customary equal-tailed cutoffs gives the explicit level \(\alpha\) rule: reject \(H_{0}\) if \(W \leq \chi^{2}_{\alpha/2}(n)\) or \(W \geq \chi^{2}_{1-\alpha/2}(n)\), where \(\chi^{2}_{p}(n)\) is the \(p\)th quantile of the \(\chi^{2}_{n}\) distribution.
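As a numerical illustration, the equal-tailed rule can be checked with `scipy.stats.chi2`. This is a minimal sketch: the sample here is simulated under \(H_0\), and the values of \(\mu_0\), \(\theta_0\), \(n\), and \(\alpha\) are made up for the example.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)

mu0 = 5.0      # known mean
theta0 = 4.0   # hypothesized variance under H0
n = 10
alpha = 0.05

# Simulated sample; in practice these would be the observed data.
x = rng.normal(loc=mu0, scale=np.sqrt(theta0), size=n)

# Test statistic: W = sum (x_i - mu0)^2 / theta0, exactly chi^2(n) under H0.
W = np.sum((x - mu0) ** 2) / theta0

# Equal-tailed critical values of the chi^2(n) distribution.
lo = chi2.ppf(alpha / 2, df=n)
hi = chi2.ppf(1 - alpha / 2, df=n)

reject = (W <= lo) or (W >= hi)
print(f"W = {W:.3f}; reject H0 outside [{lo:.3f}, {hi:.3f}]: {reject}")
```

Note that the interval \([ \text{lo}, \text{hi} ]\) always contains \(n\), the mean of the \(\chi^2_n\) distribution, which matches the fact that \(\lambda\) attains its maximum at \(W=n\).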

