Chapter 7: Problem 3
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample of size \(n\) from a distribution with pdf \(f(x ; \theta)=\theta x^{\theta-1}\), \(0<x<1\), zero elsewhere, where \(0<\theta<\infty\).
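Although the rest of the statement is cut off, the pdf alone identifies the natural sufficient statistic. Writing the likelihood (assuming the support \(0<x<1\) and \(\theta>0\) as above) in exponential-family form gives

```latex
L(\theta) \;=\; \prod_{i=1}^{n} \theta x_i^{\theta-1}
          \;=\; \theta^{n}\exp\!\Bigl[(\theta-1)\sum_{i=1}^{n}\ln x_i\Bigr],
\qquad 0 < x_i < 1 .
```

By the exponential-family criterion, \(\sum_{1}^{n}\ln X_i\) — equivalently the geometric mean \((X_1 X_2 \cdots X_n)^{1/n}\) — is a complete sufficient statistic for \(\theta\). Setting \(\partial \ln L / \partial \theta = n/\theta + \sum_{1}^{n}\ln x_i = 0\) then gives the mle \(\hat{\theta} = -n / \sum_{1}^{n}\ln X_i\), a function of this statistic.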
Let \(\left(X_{1}, Y_{1}\right),\left(X_{2}, Y_{2}\right), \ldots,\left(X_{n}, Y_{n}\right)\) denote a random sample of size \(n\) from a bivariate normal distribution with means \(\mu_{1}\) and \(\mu_{2}\), positive variances \(\sigma_{1}^{2}\) and \(\sigma_{2}^{2}\), and correlation coefficient \(\rho\). Show that \(\sum_{1}^{n} X_{i}, \sum_{1}^{n} Y_{i}, \sum_{1}^{n} X_{i}^{2}, \sum_{1}^{n} Y_{i}^{2}\), and \(\sum_{1}^{n} X_{i} Y_{i}\) are joint complete sufficient statistics for the five parameters. Are \(\bar{X}=\sum_{1}^{n} X_{i} / n\), \(\bar{Y}=\sum_{1}^{n} Y_{i} / n\), \(S_{1}^{2}=\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} /(n-1)\), \(S_{2}^{2}=\sum_{1}^{n}\left(Y_{i}-\bar{Y}\right)^{2} /(n-1)\), and \(\sum_{1}^{n}\left(X_{i}-\bar{X}\right)\left(Y_{i}-\bar{Y}\right) /\left[(n-1) S_{1} S_{2}\right]\) also joint complete sufficient statistics for these parameters?
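For the first part, one route is to expand the bivariate normal exponent, which is a quadratic in \(x\) and \(y\); collecting terms over the sample, the joint pdf takes the exponential-family form

```latex
\exp\Bigl[\, p_1 \sum_{1}^{n} x_i + p_2 \sum_{1}^{n} y_i
        + p_3 \sum_{1}^{n} x_i^{2} + p_4 \sum_{1}^{n} y_i^{2}
        + p_5 \sum_{1}^{n} x_i y_i
        + n\, q\!\left(\mu_1,\mu_2,\sigma_1^2,\sigma_2^2,\rho\right) \Bigr],
```

where each \(p_j\) is a function of the five parameters. This is a regular exponential family of rank five, so the five sums are joint complete sufficient statistics. For the second question, note that the set \(\bigl(\bar{X}, \bar{Y}, S_1^2, S_2^2, R\bigr)\), with \(R\) the sample correlation coefficient, is a one-to-one function of the five sums, and completeness and sufficiency are preserved under one-to-one transformations.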
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a distribution that is \(N(\mu, \theta)\), \(0<\theta<\infty\), where \(\mu\) is unknown. Let \(Y=\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} / n=V\) and let \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). If we consider decision functions of the form \(\delta(y)=b y\), where \(b\) does not depend upon \(y\), show that \(R(\theta, \delta)=\left(\theta^{2} / n^{2}\right)\left[\left(n^{2}-1\right) b^{2}-2 n(n-1) b+n^{2}\right]\). Show that \(b=n /(n+1)\) yields the minimum-risk decision function of this form. Note that \(n Y /(n+1)\) is not an unbiased estimator of \(\theta\). With \(\delta(y)=n y /(n+1)\) and \(0<\theta<\infty\), determine \(\max _{\theta} R(\theta, \delta)\) if it exists.
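The stated risk formula can be checked by simulation. A minimal sketch, where the values of \(n\), \(\theta\), \(\mu\), and the replicate count are arbitrary illustrative choices (not from the exercise):

```python
import numpy as np

# Illustrative parameters -- chosen for this sketch, not given in the exercise.
n, theta, mu = 5, 2.0, 1.0
b = n / (n + 1)          # the coefficient claimed to minimize the risk
reps = 200_000

rng = np.random.default_rng(0)
# X_i ~ N(mu, theta), where theta is the variance.
x = rng.normal(mu, np.sqrt(theta), size=(reps, n))
# Y = sum (X_i - Xbar)^2 / n, one value per replicate.
y = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / n

# Simulated risk E[theta - b*Y]^2 versus the closed form from the exercise.
sim_risk = np.mean((theta - b * y) ** 2)
formula = (theta**2 / n**2) * ((n**2 - 1) * b**2 - 2 * n * (n - 1) * b + n**2)
print(sim_risk, formula)
```

With these values the closed form equals \(4/3\), and the simulated risk agrees within Monte Carlo error. Note also that since \(R(\theta,\delta)\) is proportional to \(\theta^{2}\), the risk is unbounded over \(0<\theta<\infty\), so \(\max_\theta R(\theta,\delta)\) does not exist.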
In the preceding exercise, given that \(E(Y)=E[K(X)]=\theta\), prove that \(Y\) is \(N(\theta, 1)\). Hint: Consider \(M^{\prime}(0)=\theta\) and solve the resulting differential equation.
Let a random sample of size \(n\) be taken from a distribution that has the pdf \(f(x ; \theta)=(1 / \theta) \exp (-x / \theta) I_{(0, \infty)}(x)\). Find the mle and the MVUE of \(P(X \leq 2)\).
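For the mle part, a short sketch: the quantity to estimate is an explicit function of \(\theta\), the mle of \(\theta\) for this pdf is \(\bar{X}\), and mle invariance does the rest:

```latex
P(X \le 2) \;=\; \int_{0}^{2} \frac{1}{\theta} e^{-x/\theta}\,dx
           \;=\; 1 - e^{-2/\theta},
\qquad \hat{\theta}_{\text{mle}} = \bar{X}
\;\Longrightarrow\;
\widehat{P(X \le 2)} = 1 - e^{-2/\bar{X}} .
```

For the MVUE, start from the unbiased estimator \(I(X_1 \le 2)\) and condition on the complete sufficient statistic \(Y=\sum_{1}^{n} X_i\) (Rao-Blackwell and Lehmann-Scheffé); since \(X_1 / Y\) has a \(\mathrm{Beta}(1, n-1)\) distribution, this conditioning yields \(1-(1-2/y)^{n-1}\) for \(y>2\), and \(1\) otherwise.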