Chapter 7: Problem 3
Let \(Y_{1}\)
Suppose \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from a distribution with pdf \(f(x ; \theta)=(1 / 2) \theta^{3} x^{2} e^{-\theta x}, 0<x<\infty\), zero elsewhere.
If \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from a distribution that has a pdf which is a regular case of the exponential class, show that the pdf of \(Y_{1}=\sum_{1}^{n} K\left(X_{i}\right)\) is of the form \(f_{Y_{1}}\left(y_{1} ; \theta\right)=R\left(y_{1}\right) \exp \left[p(\theta) y_{1}+n q(\theta)\right]\). Hint: Let \(Y_{2}=X_{2}, \ldots, Y_{n}=X_{n}\) be \(n-1\) auxiliary random variables. Find the joint pdf of \(Y_{1}, Y_{2}, \ldots, Y_{n}\) and then the marginal pdf of \(Y_{1}\).
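The claimed marginal form can be sanity-checked numerically on a concrete regular exponential-class case. The sketch below (an illustration of mine, not part of the exercise) uses the exponential pdf \(f(x;\theta)=\theta e^{-\theta x}\), for which \(K(x)=x\) and \(Y_{1}=\sum_{1}^{n} X_{i}\) has a gamma distribution, so the claimed form holds with \(R(y_{1})=y_{1}^{n-1}/(n-1)!\), \(p(\theta)=-\theta\), and \(q(\theta)=\log\theta\). The parameter values, seed, and bin grid are arbitrary choices of this sketch.

```python
import math
import numpy as np

# Illustrative check: for f(x; theta) = theta * exp(-theta * x), K(x) = x,
# the pdf of Y1 = sum(X_i) should equal R(y1) * exp(p(theta)*y1 + n*q(theta))
# with R(y1) = y1**(n-1)/(n-1)!, p(theta) = -theta, q(theta) = log(theta).
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000
y1 = rng.exponential(1 / theta, size=(reps, n)).sum(axis=1)

def claimed_pdf(y):
    R = y ** (n - 1) / math.factorial(n - 1)          # R(y1)
    return R * np.exp(-theta * y + n * math.log(theta))  # exp[p*y1 + n*q]

# Compare a density histogram of simulated Y1 against the claimed closed form.
edges = np.linspace(0.0, 8.0, 41)
hist, _ = np.histogram(y1, bins=edges, density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
max_err = np.abs(hist - claimed_pdf(mids)).max()
```

With these simulation sizes the histogram and the closed form agree to within a few hundredths everywhere on the grid.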
Let \(Y\) denote the median and let \(\bar{X}\) denote the mean of a random sample of size \(n=2 k+1\) from a distribution that is \(N\left(\mu, \sigma^{2}\right)\). Compute \(E(Y \mid \bar{X}=\bar{x})\). Hint: See Exercise 7.5.4.
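A Monte Carlo sketch (my own illustration, with \(\mu\), \(\sigma\), \(n\), and the bin grid chosen arbitrarily) can check the symmetry-based answer \(E(Y \mid \bar{X}=\bar{x})=\bar{x}\): averaging the sample median inside narrow bins of \(\bar{x}\) should reproduce the bin centers.

```python
import numpy as np

# Illustrative check with mu=1, sigma=2, n=7 (i.e. k=3): by the symmetry of a
# normal sample about X-bar, the conditional mean of the median Y given
# X-bar = x-bar should be x-bar itself.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 1.0, 2.0, 7, 400_000
x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
med = np.median(x, axis=1)

# Average the median inside narrow bins of x-bar, compare to the bin center.
lo, hi = mu - 1.0, mu + 1.0
bins = np.linspace(lo, hi, 21)
mask = (xbar > lo) & (xbar < hi)
idx = np.digitize(xbar[mask], bins) - 1
centers = 0.5 * (bins[:-1] + bins[1:])
cond_mean = np.array([med[mask][idx == j].mean() for j in range(20)])
max_dev = np.abs(cond_mean - centers).max()
```

With 400,000 replications the binned conditional means track the bin centers to within a few hundredths, consistent with \(E(Y \mid \bar{X}=\bar{x})=\bar{x}\).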
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a distribution that is \(N(\mu, \theta), 0<\theta<\infty\), where \(\mu\) is unknown. Let \(Y=\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} / n\) and let \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). If we consider decision functions of the form \(\delta(y)=b y\), where \(b\) does not depend upon \(y\), show that \(R(\theta, \delta)=\left(\theta^{2} / n^{2}\right)\left[\left(n^{2}-1\right) b^{2}-2 n(n-1) b+n^{2}\right]\). Show that \(b=n /(n+1)\) yields a minimum risk decision function of this form. Note that \(n Y /(n+1)\) is not an unbiased estimator of \(\theta\). With \(\delta(y)=n y /(n+1)\) and \(0<\theta<\infty\), determine \(\max _{\theta} R(\theta, \delta)\) if it exists.
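The algebra behind the displayed risk can be verified symbolically. The sketch below (my own check, not the book's derivation) uses the fact that \(W=nY/\theta\) has a \(\chi^{2}(n-1)\) distribution, so \(E[W]=n-1\) and \(E[W^{2}]=(n-1)(n+1)\), from which the first two moments of \(Y=(\theta/n)W\) follow:

```python
import sympy as sp

theta, b, n = sp.symbols('theta b n', positive=True)

# W = n*Y/theta ~ chi-square(n-1): E[W] = n-1, E[W^2] = (n-1)(n+1),
# hence the first two moments of Y = (theta/n) * W.
EY = theta * (n - 1) / n
EY2 = theta**2 * (n - 1) * (n + 1) / n**2

# Risk of delta(y) = b*y under squared-error loss: E[(theta - b*Y)^2].
risk = theta**2 - 2*b*theta*EY + b**2*EY2

# The exercise's collected form of the same risk.
claimed = (theta**2 / n**2) * ((n**2 - 1)*b**2 - 2*n*(n - 1)*b + n**2)
residual = sp.simplify(risk - claimed)        # expected to vanish

# Minimize the quadratic in b; the stated minimizer is b = n/(n + 1).
b_star = sp.solve(sp.diff(risk, b), b)[0]

# Risk at the minimizer: it is proportional to theta**2, hence unbounded
# as theta grows over 0 < theta < infinity.
risk_at_min = sp.simplify(risk.subs(b, b_star))
```

The residual simplifies to zero, confirming the displayed risk, and the minimizer agrees with \(b=n/(n+1)\); the minimized risk \(2\theta^{2}/(n+1)\) is increasing in \(\theta\), which bears on the final question about \(\max_{\theta} R(\theta, \delta)\).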