Problem 8


Let \(X\) be \(N(0, \theta), 0<\theta<\infty\). (a) Find the Fisher information \(I(\theta)\). (b) If \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from this distribution, show that the mle of \(\theta\) is an efficient estimator of \(\theta\). (c) What is the asymptotic distribution of \(\sqrt{n}(\widehat{\theta}-\theta) ?\)

Short Answer

The Fisher information is \(I(\theta)=1 /\left(2 \theta^{2}\right)\), obtained as the negative expected value of the second derivative of the log of the \(N(0, \theta)\) density. The MLE is \(\widehat{\theta}=\frac{1}{n} \sum_{i=1}^{n} X_{i}^{2}\); its variance, \(2 \theta^{2} / n\), equals the Cramér-Rao lower bound \(1 /[n I(\theta)]\), so the MLE is an efficient estimator of \(\theta\). Finally, \(\sqrt{n}(\widehat{\theta}-\theta)\) is asymptotically Normal with mean \(0\) and variance \(1 / I(\theta)=2 \theta^{2}\).

Step by step solution

01

Find the Fisher Information \(I(\theta)\)

Fisher information \(I(\theta)\) is obtained from the density of the \(N(0, \theta)\) distribution, where \(\theta\) denotes the variance: $$f(x ; \theta)=\frac{1}{\sqrt{2 \pi \theta}} e^{-x^{2} /(2 \theta)}.$$ Taking the natural logarithm, differentiating twice with respect to \(\theta\), and taking the negative of the expected value gives the Fisher information.
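To make the step concrete, here is one way to carry out the computation (a sketch for the \(N(0, \theta)\) density, with \(\theta\) playing the role of the variance):

$$\log f(x ; \theta)=-\frac{1}{2} \log (2 \pi \theta)-\frac{x^{2}}{2 \theta}, \qquad \frac{\partial \log f}{\partial \theta}=-\frac{1}{2 \theta}+\frac{x^{2}}{2 \theta^{2}}, \qquad \frac{\partial^{2} \log f}{\partial \theta^{2}}=\frac{1}{2 \theta^{2}}-\frac{x^{2}}{\theta^{3}}.$$

Since \(E\left(X^{2}\right)=\theta\), the Fisher information is

$$I(\theta)=-E\left[\frac{\partial^{2} \log f(X ; \theta)}{\partial \theta^{2}}\right]=-\frac{1}{2 \theta^{2}}+\frac{\theta}{\theta^{3}}=\frac{1}{2 \theta^{2}}.$$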
02

Show that the MLE is an Efficient Estimator of \( \theta \)

To show that the MLE is an efficient estimator, we start by finding the MLE of \( \theta \). The MLE is obtained by setting the derivative of the log-likelihood equal to zero and solving for \( \theta \). Once we have the MLE, we compute its variance and compare it with the Cramér-Rao lower bound (CRLB), \(1 /[n I(\theta)]\). If the variance of the MLE equals the CRLB, the MLE is an efficient estimator of \( \theta \).
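Carried out for this model, the computation looks as follows (a sketch based on the density and Fisher information above). The log-likelihood of the sample is $$\ell(\theta)=-\frac{n}{2} \log (2 \pi \theta)-\frac{1}{2 \theta} \sum_{i=1}^{n} X_{i}^{2},$$ and setting \(\ell^{\prime}(\theta)=-\frac{n}{2 \theta}+\frac{1}{2 \theta^{2}} \sum_{i=1}^{n} X_{i}^{2}=0\) gives the MLE $$\widehat{\theta}=\frac{1}{n} \sum_{i=1}^{n} X_{i}^{2}.$$ Because \(X_{i}^{2} / \theta \sim \chi^{2}(1)\), we have \(E(\widehat{\theta})=\theta\) and \(\operatorname{Var}(\widehat{\theta})=\operatorname{Var}\left(X_{1}^{2}\right) / n=2 \theta^{2} / n\), which equals the CRLB \(1 /[n I(\theta)]=2 \theta^{2} / n\). Hence \(\widehat{\theta}\) is unbiased and attains the bound, so it is efficient.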
03

Find the Asymptotic Distribution of the Estimator

Under the usual regularity conditions, the maximum likelihood estimator is asymptotically Normal with mean equal to the parameter being estimated and variance equal to the reciprocal of the Fisher information divided by the sample size \(n\). Equivalently, \(\sqrt{n}(\widehat{\theta}-\theta)\) converges in distribution to \(N(0,1 / I(\theta))\). With \(I(\theta)=1 /\left(2 \theta^{2}\right)\) from Step 1, the asymptotic distribution is \(N\left(0,2 \theta^{2}\right)\).
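Because \(\widehat{\theta}\) is itself a sample mean, the limiting distribution can also be checked directly with the Central Limit Theorem (a sketch using the moments computed above): since \(\widehat{\theta}=\frac{1}{n} \sum_{i=1}^{n} X_{i}^{2}\) with \(E\left(X_{i}^{2}\right)=\theta\) and \(\operatorname{Var}\left(X_{i}^{2}\right)=2 \theta^{2}\), the CLT gives $$\sqrt{n}(\widehat{\theta}-\theta)=\sqrt{n}\left(\frac{1}{n} \sum_{i=1}^{n} X_{i}^{2}-\theta\right) \stackrel{D}{\longrightarrow} N\left(0,2 \theta^{2}\right),$$ which agrees with the general result \(N(0,1 / I(\theta))\) for the MLE.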


Most popular questions from this chapter

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a Poisson distribution with mean \(\theta>0\). Test \(H_{0}: \theta=2\) against \(H_{1}: \theta \neq 2\) using (a) \(-2 \log \Lambda\); (b) a Wald-type statistic; (c) Rao's score statistic.

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a distribution with pdf \(f(x ; \theta)=\theta \exp \left\{-|x|^{\theta}\right\} /[2 \Gamma(1 / \theta)]\), \(-\infty<x<\infty\), \(\theta>0\). Suppose \(\Omega=\{\theta: \theta=1,2\}\). Consider the hypotheses \(H_{0}: \theta=2\) (a normal distribution) versus \(H_{1}: \theta=1\) (a double exponential distribution). Show that the likelihood ratio test can be based on the statistic \(W=\sum_{i=1}^{n}\left(X_{i}^{2}-\left|X_{i}\right|\right)\).

Given \(f(x ; \theta)=1 / \theta\), \(0<x<\theta\), \(\theta>0\), formally compute the reciprocal of $$n E\left\{\left[\frac{\partial \log f(X ; \theta)}{\partial \theta}\right]^{2}\right\}$$ Compare this with the variance of \((n+1) Y_{n} / n\), where \(Y_{n}\) is the largest observation of a random sample of size \(n\) from this distribution. Comment.

Let \(X_{1}, X_{2}, \ldots, X_{n}\) and \(Y_{1}, Y_{2}, \ldots, Y_{m}\) be independent random samples from the two normal distributions \(N\left(0, \theta_{1}\right)\) and \(N\left(0, \theta_{2}\right)\). (a) Find the likelihood ratio \(\Lambda\) for testing the composite hypothesis \(H_{0}: \theta_{1}=\theta_{2}\) against the composite alternative \(H_{1}: \theta_{1} \neq \theta_{2}\). (b) This \(\Lambda\) is a function of what \(F\)-statistic that would actually be used in this test?

Let \(\left(X_{1}, Y_{1}\right),\left(X_{2}, Y_{2}\right), \ldots,\left(X_{n}, Y_{n}\right)\) be a random sample from a bivariate normal distribution with \(\mu_{1}, \mu_{2}, \sigma_{1}^{2}=\sigma_{2}^{2}=\sigma^{2}, \rho=\frac{1}{2}\), where \(\mu_{1}, \mu_{2}\), and \(\sigma^{2}>0\) are unknown real numbers. Find the likelihood ratio \(\Lambda\) for testing \(H_{0}: \mu_{1}=\mu_{2}=0, \sigma^{2}\) unknown against all alternatives. The likelihood ratio \(\Lambda\) is a function of what statistic that has a well-known distribution?
