Problem 2


Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a normal distribution with mean zero and variance \(\theta, 0<\theta<\infty\). Show that \(\sum_{1}^{n} X_{i}^{2} / n\) is an unbiased estimator of \(\theta\) and has variance \(2 \theta^{2} / n\).

Short Answer

The quantity \(\sum_{1}^{n} X_{i}^{2} / n\) is an unbiased estimator of \(\theta\) with a variance of \(2 \theta^{2} / n\).

Step by step solution

Step 1: Establish the expected value

First, find the expected value of the estimator to show that it is indeed unbiased. Since each \(X_i \sim N(0, \theta)\),\[E[X_{i}^{2}] = \text{Var}(X_{i}) + [E(X_{i})]^{2} = \theta + 0 = \theta.\]By linearity of expectation and since the \(X_i\) are identically distributed,\[E \left[\frac{1}{n}\sum_{i=1}^{n} X_{i}^{2}\right] = \frac{1}{n}\sum_{i=1}^{n} E[X_{i}^{2}] = \frac{1}{n} \cdot n \cdot \theta = \theta.\]Since the expected value of the estimator equals the parameter being estimated, we conclude that \(\frac{1}{n}\sum_{i=1}^{n} X_{i}^{2}\) is an unbiased estimator of \(\theta\).
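A quick Monte Carlo sketch can illustrate the unbiasedness numerically. The values \(\theta = 4\) and \(n = 50\) below are hypothetical choices for illustration, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 4.0      # true variance (hypothetical value for illustration)
n = 50           # sample size (hypothetical)
reps = 200_000   # Monte Carlo replications

# Each row is one sample X_1, ..., X_n drawn from N(0, theta)
samples = rng.normal(0.0, np.sqrt(theta), size=(reps, n))

# Compute the estimator (1/n) * sum X_i^2 for every replication
estimates = (samples**2).mean(axis=1)

# The average of the estimates should be close to theta
print(estimates.mean())
```

Averaging the estimator over many independent samples approximates its expectation, which should agree with \(\theta\) up to Monte Carlo error.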
Step 2: Calculate the variance

Now, calculate the variance of the estimator to show that it is \(2 \theta^{2} / n\). Since \(X_i / \sqrt{\theta} \sim N(0,1)\), the ratio \(X_i^2 / \theta\) follows a chi-square distribution with 1 degree of freedom, which has mean 1 and variance 2. Therefore \(E[X_i^2] = \theta\) and \(\text{Var}[X_i^2] = 2\theta^2\). Because the \(X_i\) are independent,\[\text{Var} \left[\frac{1}{n}\sum_{i=1}^{n} X_{i}^{2}\right] = \frac{1}{n^2} \sum_{i=1}^{n} \text{Var}[X_{i}^{2}] = \frac{1}{n^2} \cdot n \cdot 2\theta^2 = \frac{2\theta^{2}}{n}.\]Thus the variance of the estimator \(\frac{1}{n}\sum_{i=1}^{n} X_{i}^{2}\) is indeed \(2 \theta^{2} / n\).
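The variance formula can be checked the same way. Again, \(\theta = 4\) and \(n = 50\) are hypothetical values chosen for illustration; the empirical variance of the estimator across replications should approach \(2\theta^2/n = 0.64\):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 4.0      # true variance (hypothetical value for illustration)
n = 50           # sample size (hypothetical)
reps = 200_000   # Monte Carlo replications

# Draw reps independent samples of size n from N(0, theta)
samples = rng.normal(0.0, np.sqrt(theta), size=(reps, n))

# The estimator (1/n) * sum X_i^2 for every replication
estimates = (samples**2).mean(axis=1)

# Empirical variance of the estimator vs. the theoretical 2*theta^2/n
print(estimates.var())
print(2 * theta**2 / n)
```

The two printed numbers should agree up to Monte Carlo error, consistent with the derivation above.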


Most popular questions from this chapter

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from \(N\left(\theta_{1}, \theta_{2}\right) .\) (a) If the constant \(b\) is defined by the equation \(P(X \leq b)=0.90\), find the mle and the MVUE of \(b\). (b) If \(c\) is a given constant, find the mle and the MVUE of \(P(X \leq c)\).

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a distribution with pmf \(p(x ; \theta)=\theta^{x}(1-\theta), x=0,1,2, \ldots\), zero elsewhere, where \(0 \leq \theta \leq 1\). (a) Find the mle, \(\hat{\theta}\), of \(\theta\). (b) Show that \(\sum_{1}^{n} X_{i}\) is a complete sufficient statistic for \(\theta\). (c) Determine the MVUE of \(\theta\).

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample with the common pdf \(f(x)=\) \(\theta^{-1} e^{-x / \theta}\), for \(x>0\), zero elsewhere; that is, \(f(x)\) is a \(\Gamma(1, \theta)\) pdf. (a) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\theta\). (b) Determine the MVUE of \(\theta\). (c) Determine the mle of \(\theta\). (d) Often, though, this pdf is written as \(f(x)=\tau e^{-\tau x}\), for \(x>0\), zero elsewhere. Thus \(\tau=1 / \theta\). Use Theorem \(6.1 .2\) to determine the mle of \(\tau\). (e) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\tau\). Show that \((n-1) /(n \bar{X})\) is the MVUE of \(\tau=1 / \theta\). Hence, as usual, the reciprocal of the mle of \(\theta\) is the mle of \(1 / \theta\), but, in this situation, the reciprocal of the MVUE of \(\theta\) is not the MVUE of \(1 / \theta\). (f) Compute the variances of each of the unbiased estimators in parts (b) and (e).

Prove that the sum of the observations of a random sample of size \(n\) from a Poisson distribution having parameter \(\theta, 0<\theta<\infty\), is a sufficient statistic for \(\theta\).

Let \(Y_{1}\) and \(Y_{2}\) be two independent unbiased estimators of \(\theta\). Assume that the variance of \(Y_{1}\) is twice the variance of \(Y_{2}\). Find the constants \(k_{1}\) and \(k_{2}\) so that \(k_{1} Y_{1}+k_{2} Y_{2}\) is an unbiased estimator with the smallest possible variance for such a linear combination.
