Problem 6


Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from the uniform distribution with pdf \(f\left(x ; \theta_{1}, \theta_{2}\right)=1 /\left(2 \theta_{2}\right)\), \(\theta_{1}-\theta_{2}<x<\theta_{1}+\theta_{2}\), where \(\theta_{2}>0\), and the pdf is equal to zero elsewhere. (a) Show that \(Y_{1}=\min \left(X_{i}\right)\) and \(Y_{n}=\max \left(X_{i}\right)\), the joint sufficient statistics for \(\theta_{1}\) and \(\theta_{2}\), are complete. (b) Find the MVUEs of \(\theta_{1}\) and \(\theta_{2}\).

Short Answer

The joint sufficient statistics \(Y_1 = \min(X_i)\) and \(Y_n = \max(X_i)\) are complete. The MVUE of \(\theta_1\) is \((Y_1+Y_n)/2\), and the MVUE of \(\theta_2\) is \(\frac{n+1}{n-1} \cdot \frac{Y_n-Y_1}{2}\).

Step by step solution

01

Prove completeness of joint sufficient statistics

It is given that \(Y_1 = \min(X_i)\) and \(Y_n = \max(X_i)\) are joint sufficient statistics for \(\theta_1\) and \(\theta_2\). To show that these statistics are complete, we must prove that if a function \(g(Y_1, Y_n)\) of the statistics has expected value zero for all values of \(\theta_1\) and \(\theta_2\), then \(g(Y_1, Y_n)\) equals zero with probability 1. The argument is sketched below.
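A sketch of that argument, using the standard joint pdf of the minimum and maximum of a uniform sample:

\[
f_{Y_1, Y_n}(y_1, y_n) = \frac{n(n-1)\left(y_n - y_1\right)^{n-2}}{\left(2 \theta_2\right)^{n}}, \qquad \theta_1 - \theta_2 < y_1 < y_n < \theta_1 + \theta_2.
\]

If \(E[g(Y_1, Y_n)] = 0\) for all \((\theta_1, \theta_2)\), then

\[
\int_{\theta_1-\theta_2}^{\theta_1+\theta_2} \int_{y_1}^{\theta_1+\theta_2} g(y_1, y_n)\left(y_n - y_1\right)^{n-2} \, dy_n \, dy_1 = 0
\]

for every choice of endpoints \(a = \theta_1 - \theta_2 < b = \theta_1 + \theta_2\). Since \((a, b)\) ranges over all ordered pairs, differentiating with respect to \(a\) and \(b\) forces \(g(y_1, y_n) = 0\) almost everywhere, which is exactly the completeness condition.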
02

Determine the pdfs of \(Y_1\) and \(Y_n\)

Since \(Y_1 = \min(X_i)\) and \(Y_n = \max(X_i)\), their pdfs follow from the standard theory of order statistics; we need them to find the MVUEs of \(\theta_1\) and \(\theta_2\). For \(\theta_1 - \theta_2 < y < \theta_1 + \theta_2\),

\[
f_{Y_1}(y) = \frac{n\left(\theta_1 + \theta_2 - y\right)^{n-1}}{\left(2 \theta_2\right)^{n}} \quad \text{and} \quad f_{Y_n}(y) = \frac{n\left(y - \theta_1 + \theta_2\right)^{n-1}}{\left(2 \theta_2\right)^{n}},
\]

and both pdfs are zero elsewhere.
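For later use, integrating against these pdfs (a routine computation, included here as a worked step rather than quoted from the original solution) gives

\[
E(Y_1) = \theta_1 - \theta_2 + \frac{2 \theta_2}{n+1}, \qquad E(Y_n) = \theta_1 + \theta_2 - \frac{2 \theta_2}{n+1},
\]

so that \(E(Y_1 + Y_n) = 2 \theta_1\) and \(E(Y_n - Y_1) = 2 \theta_2 \, \frac{n-1}{n+1}\).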
03

Find MVUEs of \(\theta_1\) and \(\theta_2\)

Because \((Y_1, Y_n)\) are complete joint sufficient statistics, the Lehmann–Scheffé theorem says that any unbiased estimator that is a function of them is the MVUE. From the expectations computed above, \(E\left[(Y_1 + Y_n)/2\right] = \theta_1\), so \((Y_1 + Y_n)/2\) is the MVUE of \(\theta_1\). Similarly, \(E(Y_n - Y_1) = 2 \theta_2 (n-1)/(n+1)\), so \(\frac{n+1}{n-1} \cdot \frac{Y_n - Y_1}{2}\) is unbiased for \(\theta_2\) and is therefore its MVUE.
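As a quick numerical sanity check (not part of the textbook solution), the following minimal sketch simulates both estimators with NumPy; the values of theta1, theta2, and n are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

theta1, theta2, n, reps = 3.0, 2.0, 10, 200_000

# Draw `reps` samples of size n from Uniform(theta1 - theta2, theta1 + theta2).
x = rng.uniform(theta1 - theta2, theta1 + theta2, size=(reps, n))
y1 = x.min(axis=1)  # Y_1 = min(X_i)
yn = x.max(axis=1)  # Y_n = max(X_i)

# Candidate MVUEs derived in this step.
est_theta1 = (y1 + yn) / 2
est_theta2 = (n + 1) * (yn - y1) / (2 * (n - 1))

# The sample means should be close to theta1 = 3.0 and theta2 = 2.0.
print(est_theta1.mean(), est_theta2.mean())
```

The printed averages should fall close to \(\theta_1 = 3.0\) and \(\theta_2 = 2.0\), consistent with unbiasedness of both estimators.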


Most popular questions from this chapter

Let \(X\) have the pdf \(f_{X}(x ; \theta)=1 /(2 \theta)\), for \(-\theta<x<\theta\), zero elsewhere, where \(\theta>0\). (a) Is the statistic \(Y=|X|\) a sufficient statistic for \(\theta\)? Why? (b) Let \(f_{Y}(y ; \theta)\) be the pdf of \(Y\). Is the family \(\left\{f_{Y}(y ; \theta): \theta>0\right\}\) complete? Why?

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a Poisson distribution with parameter \(\theta>0\). (a) Find the MVUE of \(P(X \leq 1)=(1+\theta) e^{-\theta}\). Hint: Let \(u\left(x_{1}\right)=1\), \(x_{1} \leq 1\), zero elsewhere, and find \(E\left[u\left(X_{1}\right) \mid Y=y\right]\), where \(Y=\sum_{1}^{n} X_{i}\). (b) Express the MVUE as a function of the mle. (c) Determine the asymptotic distribution of the mle.

What is the sufficient statistic for \(\theta\) if the sample arises from a beta distribution in which \(\alpha=\beta=\theta>0 ?\)

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with parameter \(\theta>0\). From the Remark of this section, we know that \(E\left[(-1)^{X_{1}}\right]=e^{-2 \theta}\) (a) Show that \(E\left[(-1)^{X_{1}} \mid Y_{1}=y_{1}\right]=(1-2 / n)^{y_{1}}\), where \(Y_{1}=X_{1}+X_{2}+\cdots+X_{n}\). Hint: First show that the conditional pdf of \(X_{1}, X_{2}, \ldots, X_{n-1}\), given \(Y_{1}=y_{1}\), is multinomial, and hence that of \(X_{1}\) given \(Y_{1}=y_{1}\) is \(b\left(y_{1}, 1 / n\right)\). (b) Show that the mle of \(e^{-2 \theta}\) is \(e^{-2 \bar{X}}\). (c) Since \(y_{1}=n \bar{x}\), show that \((1-2 / n)^{y_{1}}\) is approximately equal to \(e^{-2 \bar{x}}\) when \(n\) is large.

Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with parameter \(\theta, 0<\theta<\infty .\) Let \(Y=\sum_{1}^{n} X_{i}\) and let \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). If we restrict our considerations to decision functions of the form \(\delta(y)=b+y / n\), where \(b\) does not depend on \(y\), show that \(R(\theta, \delta)=b^{2}+\theta / n .\) What decision function of this form yields a uniformly smaller risk than every other decision function of this form? With this solution, say \(\delta\), and \(0<\theta<\infty\), determine \(\max _{\theta} R(\theta, \delta)\) if it exists.
