Problem 10


Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a \(N(0, \theta)\) distribution. We want to estimate the standard deviation \(\sqrt{\theta}\). Find the constant \(c\) so that \(Y=\) \(c \sum_{i=1}^{n}\left|X_{i}\right|\) is an unbiased estimator of \(\sqrt{\theta}\) and determine its efficiency.

Short Answer

The constant \(c=\frac{1}{n}\sqrt{\frac{\pi}{2}}\) makes \(Y=c\sum_{i=1}^{n}|X_{i}|\) an unbiased estimator of \(\sqrt{\theta}\). Its variance is \(\frac{\theta(\pi-2)}{2n}\), while the Cramér–Rao lower bound for unbiased estimators of \(\sqrt{\theta}\) is \(\frac{\theta}{2n}\), so the efficiency of \(Y\) is \(\frac{1}{\pi-2}\approx 0.876\).

Step by step solution

01

Definition of Unbiased Estimator

An estimator is termed unbiased if its expected value equals the true value of the parameter it estimates. To find the constant \(c\), compute the expected value of the estimator \(Y = c\sum_{i=1}^{n} |X_{i}|\) and set it equal to the target parameter, \(\sqrt{\theta}\).
02

Derive the formula for the estimator

By linearity of expectation, \(E(Y)=c\sum_{i=1}^{n}E(|X_{i}|)=c\,n\,E(|X_{1}|)\). Since \(X_{1}\sim N(0,\theta)\), the absolute value \(|X_{1}|\) follows a half-normal distribution with mean \(E(|X_{1}|)=\sqrt{2\theta/\pi}\). Hence \(E(Y)=c\,n\sqrt{2\theta/\pi}\). To make the estimator unbiased, set \(E(Y)=\sqrt{\theta}\), i.e., \(c\,n\sqrt{2/\pi}=1\), which gives the constant \(c=\frac{1}{n}\sqrt{\frac{\pi}{2}}\).
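The unbiasedness claim can be checked numerically. The following sketch (not part of the textbook solution; \(\theta=4\), \(n=20\), and the replication count are assumed illustration values) simulates many samples and verifies that the average of \(Y\) with \(c=\frac{1}{n}\sqrt{\pi/2}\) is close to \(\sqrt{\theta}\):

```python
import numpy as np

# Monte Carlo check of unbiasedness: with c = sqrt(pi/2)/n,
# the average of Y over many replications should approach sqrt(theta).
rng = np.random.default_rng(0)
theta = 4.0          # variance of the N(0, theta) population (assumed value)
n = 20               # sample size (assumed value)
reps = 200_000       # number of simulated samples (assumed value)

c = np.sqrt(np.pi / 2) / n
samples = rng.normal(0.0, np.sqrt(theta), size=(reps, n))
Y = c * np.abs(samples).sum(axis=1)    # one estimate per simulated sample

print(np.sqrt(theta))  # true standard deviation: 2.0
print(Y.mean())        # Monte Carlo average of Y, close to 2.0
```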
03

Calculate the efficiency

Efficiency measures how precise an unbiased estimator is, and is computed as the ratio of the Cramér–Rao lower bound to the estimator's variance. For one observation from \(N(0,\theta)\), the Fisher information about \(\theta\) is \(I(\theta)=\frac{1}{2\theta^{2}}\); reparametrizing to \(\sigma=\sqrt{\theta}\) gives \(I(\sigma)=I(\theta)\left(\frac{d\theta}{d\sigma}\right)^{2}=\frac{1}{2\sigma^{4}}(2\sigma)^{2}=\frac{2}{\sigma^{2}}\), so the lower bound for unbiased estimators of \(\sqrt{\theta}\) is \(\frac{1}{nI(\sigma)}=\frac{\theta}{2n}\). Next, \(\operatorname{Var}(|X_{1}|)=E(X_{1}^{2})-\left(E|X_{1}|\right)^{2}=\theta-\frac{2\theta}{\pi}=\frac{\theta(\pi-2)}{\pi}\), so \(\operatorname{Var}(Y)=c^{2}n\operatorname{Var}(|X_{1}|)=\frac{\pi}{2n^{2}}\cdot n\cdot\frac{\theta(\pi-2)}{\pi}=\frac{\theta(\pi-2)}{2n}\). The efficiency is therefore \(\frac{\theta/(2n)}{\theta(\pi-2)/(2n)}=\frac{1}{\pi-2}\approx 0.876\).
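The variance and efficiency calculation can likewise be verified by simulation. This sketch (same assumed illustration values \(\theta=4\), \(n=20\) as above) compares the simulated variance of \(Y\) with the bound \(\theta/(2n)\); the ratio should approach \(1/(\pi-2)\):

```python
import numpy as np

# Monte Carlo check of efficiency: the ratio of the Cramér–Rao lower
# bound theta/(2n) to the simulated variance of Y should be ~ 1/(pi - 2).
rng = np.random.default_rng(1)
theta, n, reps = 4.0, 20, 200_000   # assumed illustration values

c = np.sqrt(np.pi / 2) / n
samples = rng.normal(0.0, np.sqrt(theta), size=(reps, n))
Y = c * np.abs(samples).sum(axis=1)

crlb = theta / (2 * n)     # lower bound for unbiased estimators of sqrt(theta)
var_y = Y.var()            # simulated variance of the estimator

print(crlb / var_y)        # simulated efficiency
print(1 / (np.pi - 2))     # theoretical efficiency, about 0.876
```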
04

Conclusion

The constant \(c=\frac{1}{n}\sqrt{\frac{\pi}{2}}\) makes \(Y=c\sum_{i=1}^{n} |X_{i}|\) an unbiased estimator of \(\sqrt{\theta}\). Measured against the Cramér–Rao lower bound \(\frac{\theta}{2n}\), its efficiency is \(\frac{1}{\pi-2}\approx 87.6\%\), so \(Y\) is a reasonably, though not fully, efficient estimator of the standard deviation.


