Problem 16


Let \(\left(X_{1}, Y_{1}\right),\left(X_{2}, Y_{2}\right), \ldots,\left(X_{n}, Y_{n}\right)\) be a random sample from a bivariate normal distribution with \(\mu_{1}, \mu_{2}, \sigma_{1}^{2}=\sigma_{2}^{2}=\sigma^{2}, \rho=\frac{1}{2}\), where \(\mu_{1}, \mu_{2}\), and \(\sigma^{2}>0\) are unknown real numbers. Find the likelihood ratio \(\Lambda\) for testing \(H_{0}: \mu_{1}=\mu_{2}=0, \sigma^{2}\) unknown against all alternatives. The likelihood ratio \(\Lambda\) is a function of what statistic that has a well-known distribution?

Short Answer

Expert verified
The likelihood ratio \(\Lambda\) is a monotone function of \(F=\dfrac{n\left(\bar{X}^{2}-\bar{X}\bar{Y}+\bar{Y}^{2}\right)/2}{\sum_{i=1}^{n}\left[(X_{i}-\bar{X})^{2}-(X_{i}-\bar{X})(Y_{i}-\bar{Y})+(Y_{i}-\bar{Y})^{2}\right]/(2n-2)}\), which under \(H_{0}\) has an \(F\)-distribution with \(2\) and \(2n-2\) degrees of freedom.

Step by step solution

01

Establish the joint PDF

First, we write down the joint probability density function (PDF) of the bivariate normal distribution for a single pair \((X_{i}, Y_{i})\): \(f(X_{i},Y_{i};\theta)=\frac{1}{2\pi\sigma^{2}\sqrt{1-\rho^{2}}}\exp\left\{-\frac{(X_{i}-\mu_{1})^{2}-2\rho(X_{i}-\mu_{1})(Y_{i}-\mu_{2})+(Y_{i}-\mu_{2})^{2}}{2\sigma^{2}(1-\rho^{2})}\right\}\), where \(\rho\) is the correlation coefficient, \(\sigma^{2}\) the common variance, and \(\mu_{1}, \mu_{2}\) the means of \(X_{i}\) and \(Y_{i}\) respectively.
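Since \(\rho=\tfrac{1}{2}\) is known, \(1-\rho^{2}=\tfrac{3}{4}\), and substituting this into the density above gives the simplified form used throughout the problem:

```latex
f(X_{i},Y_{i};\theta)
  = \frac{1}{\sqrt{3}\,\pi\sigma^{2}}
    \exp\!\left\{-\frac{2}{3\sigma^{2}}
      \left[(X_{i}-\mu_{1})^{2}-(X_{i}-\mu_{1})(Y_{i}-\mu_{2})+(Y_{i}-\mu_{2})^{2}\right]\right\}
```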
02

Calculate joint PDF of the sample

Next, since the \(n\) pairs form a random sample and are therefore independent, the joint PDF of the whole sample (the likelihood) is the product \(L(\theta)=\prod_{i=1}^{n}f(X_{i},Y_{i};\theta)\).
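With \(\rho=\tfrac{1}{2}\) substituted, this product takes the explicit form:

```latex
L(\theta)=\left(\sqrt{3}\,\pi\sigma^{2}\right)^{-n}
  \exp\!\left\{-\frac{2}{3\sigma^{2}}\sum_{i=1}^{n}
    \left[(X_{i}-\mu_{1})^{2}-(X_{i}-\mu_{1})(Y_{i}-\mu_{2})+(Y_{i}-\mu_{2})^{2}\right]\right\}
```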
03

Establish the hypotheses

Then we state the hypotheses in terms of parameter spaces: under the null hypothesis \(H_{0}\) we have \(\mu_{1}=\mu_{2}=0\) with \(\sigma^{2}>0\) unknown, while under the alternative \(\mu_{1}\) and \(\mu_{2}\) are unrestricted.
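In symbols, the restricted and full parameter spaces are:

```latex
\omega=\left\{(\mu_{1},\mu_{2},\sigma^{2}) : \mu_{1}=\mu_{2}=0,\ \sigma^{2}>0\right\},
\qquad
\Omega=\left\{(\mu_{1},\mu_{2},\sigma^{2}) : \mu_{1},\mu_{2}\in\mathbb{R},\ \sigma^{2}>0\right\}
```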
04

Calculate the likelihood ratio

Next, we form the likelihood ratio: the maximum of the likelihood over the restricted space \(\omega\) (the null hypothesis) divided by its maximum over the full space \(\Omega\), that is, \(\Lambda=\frac{\max_{\theta\in\omega}L(\theta)}{\max_{\theta\in\Omega}L(\theta)}\).
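As a sanity check, \(\Lambda\) can also be computed numerically. The sketch below is not part of the textbook solution; it assumes the closed-form MLEs for this model: in both parameter spaces \(\hat{\sigma}^{2}\) is proportional to the minimized quadratic form \(Q\), and the maximized likelihood is proportional to \((\hat{\sigma}^{2})^{-n}\), so \(\Lambda=(Q_{\Omega}/Q_{\omega})^{n}\). The function names are illustrative.

```python
import numpy as np

def q(dx, dy):
    # Quadratic form for rho = 1/2: sum of u^2 - u*v + v^2 over the sample.
    return np.sum(dx**2 - dx * dy + dy**2)

def likelihood_ratio(x, y):
    """Lambda = (Q_Omega / Q_omega)^n.

    Uses that the maximized likelihood is proportional to
    (sigma_hat^2)^(-n) and sigma_hat^2 is proportional to the
    minimized quadratic form Q in each parameter space.
    """
    n = len(x)
    q_full = q(x - x.mean(), y - y.mean())  # means free: mu1_hat = xbar, mu2_hat = ybar
    q_null = q(x, y)                        # means fixed at 0 under H0
    return (q_full / q_null) ** n

rng = np.random.default_rng(0)
x = rng.normal(size=10)
y = rng.normal(size=10)
lam = likelihood_ratio(x, y)
print(lam)  # always in (0, 1], since Q_Omega <= Q_omega
```

Because the quadratic form \(q(u,v)=u^{2}-uv+v^{2}\) is positive definite, minimizing over the means can only shrink it, so the ratio always lies in \((0,1]\), with small values evidence against \(H_{0}\).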
05

Find the statistic

Finally, maximizing \(L(\theta)\) over each space shows that \(\Lambda\) is a monotone function of \(F=\dfrac{n\left(\bar{X}^{2}-\bar{X}\bar{Y}+\bar{Y}^{2}\right)/2}{\sum_{i=1}^{n}\left[(X_{i}-\bar{X})^{2}-(X_{i}-\bar{X})(Y_{i}-\bar{Y})+(Y_{i}-\bar{Y})^{2}\right]/(2n-2)}\). Under \(H_{0}\), the numerator and denominator quadratic forms are independent and, after division by \(\sigma^{2}(1-\rho^{2})\), have chi-square distributions with \(2\) and \(2n-2\) degrees of freedom respectively, so \(F\) has an \(F\)-distribution with \(2\) and \(2n-2\) degrees of freedom.
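The key step behind this is the usual decomposition of the quadratic form \(q(u,v)=u^{2}-uv+v^{2}\) about the sample means (the cross terms vanish), which yields \(\Lambda\) explicitly:

```latex
\sum_{i=1}^{n} q(X_{i},Y_{i})
  = \sum_{i=1}^{n} q\!\left(X_{i}-\bar{X},\,Y_{i}-\bar{Y}\right) + n\,q\!\left(\bar{X},\bar{Y}\right),
\qquad
\Lambda=\left(\frac{Q_{\Omega}}{Q_{\omega}}\right)^{n}
       =\left(1+\frac{F}{n-1}\right)^{-n}
```

Here \(Q_{\Omega}\) and \(Q_{\omega}\) are the quadratic forms minimized over \(\Omega\) and \(\omega\), so \(\Lambda\) is a strictly decreasing function of \(F\): rejecting for small \(\Lambda\) is equivalent to rejecting for large \(F\).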

