Chapter 7: Problem 7
Let \(X\) have the pdf \(f_{X}(x ; \theta)=1 /(2 \theta)\), for \(-\theta<x<\theta\), zero elsewhere, where \(\theta>0\).
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample with the common pdf \(f(x)=\theta^{-1} e^{-x / \theta}\), for \(x>0\), zero elsewhere; that is, \(f(x)\) is a \(\Gamma(1, \theta)\) pdf. (a) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\theta\). (b) Determine the MVUE of \(\theta\). (c) Determine the mle of \(\theta\). (d) Often, though, this pdf is written as \(f(x)=\tau e^{-\tau x}\), for \(x>0\), zero elsewhere. Thus \(\tau=1 / \theta\). Use Theorem 6.1.2 to determine the mle of \(\tau\). (e) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\tau\). Show that \((n-1) /(n \bar{X})\) is the MVUE of \(\tau=1 / \theta\). Hence, as usual, the reciprocal of the mle of \(\theta\) is the mle of \(1 / \theta\); but, in this situation, the reciprocal of the MVUE of \(\theta\) is not the MVUE of \(1 / \theta\). (f) Compute the variances of each of the unbiased estimators in Parts (b) and (e).
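The claims in parts (b) and (e) can be sanity-checked numerically. Below is a minimal Monte Carlo sketch (the values \(\theta = 2\), \(n = 5\), and the replication count are arbitrary choices, not part of the exercise): \(\bar{X}\) should land near \(\theta\), \((n-1)/(n\bar{X})\) near \(\tau = 1/\theta\), while \(1/\bar{X}\) overshoots \(\tau\), illustrating that the reciprocal of the MVUE of \(\theta\) is biased for \(\tau\).

```python
import random
import statistics

random.seed(12345)

def simulate(theta=2.0, n=5, reps=200_000):
    """Monte Carlo check of the estimators in parts (b) and (e).

    Each replicate draws X_1, ..., X_n ~ Gamma(1, theta), i.e. an
    exponential with mean theta, and records three estimators:
      xbar              -- the MVUE of theta,
      (n-1)/(n*xbar)    -- the MVUE of tau = 1/theta,
      1/xbar            -- the reciprocal of the MVUE of theta.
    """
    est_theta, est_tau, recip = [], [], []
    for _ in range(reps):
        xbar = statistics.fmean(random.expovariate(1 / theta) for _ in range(n))
        est_theta.append(xbar)
        est_tau.append((n - 1) / (n * xbar))
        recip.append(1 / xbar)
    return (statistics.fmean(est_theta),
            statistics.fmean(est_tau),
            statistics.fmean(recip))

m_theta, m_tau, m_recip = simulate()
print(m_theta)  # close to theta = 2
print(m_tau)    # close to tau = 1/2
print(m_recip)  # noticeably above 1/2: 1/xbar is biased upward for tau
```

In fact \(E[1/\bar{X}] = n/((n-1)\theta)\) here, so with \(n=5\), \(\theta=2\) the third mean sits near \(5/8\) rather than \(1/2\).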
If \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from a distribution that has a pdf which is a regular case of the exponential class, show that the pdf of \(Y_{1}=\sum_{1}^{n} K\left(X_{i}\right)\) is of the form \(f_{Y_{1}}\left(y_{1} ; \theta\right)=R\left(y_{1}\right) \exp \left[p(\theta) y_{1}+n q(\theta)\right]\). Hint: Let \(Y_{2}=X_{2}, \ldots, Y_{n}=X_{n}\) be \(n-1\) auxiliary random variables. Find the joint pdf of \(Y_{1}, Y_{2}, \ldots, Y_{n}\) and then the marginal pdf of \(Y_{1}\).
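For a concrete instance of the claimed form (an illustration, not the proof the exercise asks for), take the \(\Gamma(1, \theta)\) pdf of the previous exercise: there \(K(x)=x\), \(p(\theta)=-1/\theta\), \(q(\theta)=-\log \theta\), and \(Y_{1}=\sum_{1}^{n} X_{i}\) has a \(\Gamma(n, \theta)\) distribution, whose pdf factors as \(R(y_{1}) \exp [p(\theta) y_{1}+n q(\theta)]\) with \(R(y_{1})=y_{1}^{n-1} / \Gamma(n)\). A quick pointwise check that the two expressions agree:

```python
import math

def gamma_pdf(y, n, theta):
    """Gamma(n, theta) density: y^(n-1) e^(-y/theta) / (Gamma(n) theta^n)."""
    return y ** (n - 1) * math.exp(-y / theta) / (math.gamma(n) * theta ** n)

def exp_class_form(y, n, theta):
    """Same density written as R(y) * exp(p(theta)*y + n*q(theta))."""
    R = y ** (n - 1) / math.gamma(n)   # R(y1) = y1^(n-1) / Gamma(n)
    p = -1.0 / theta                   # p(theta) = -1/theta
    q = -math.log(theta)               # q(theta) = -log(theta)
    return R * math.exp(p * y + n * q)

# The two expressions coincide up to floating-point rounding.
for y in (0.5, 1.0, 3.0):
    print(abs(gamma_pdf(y, 4, 2.0) - exp_class_form(y, 4, 2.0)) < 1e-12)
```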
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with parameter \(\theta, 0<\theta<\infty .\) Let \(Y=\sum_{1}^{n} X_{i}\) and let \(\mathcal{L}[\theta, \delta(y)]=[\theta-\delta(y)]^{2}\). If we restrict our considerations to decision functions of the form \(\delta(y)=b+y / n\), where \(b\) does not depend on \(y\), show that \(R(\theta, \delta)=b^{2}+\theta / n .\) What decision function of this form yields a uniformly smaller risk than every other decision function of this form? With this solution, say \(\delta\), and \(0<\theta<\infty\), determine \(\max _{\theta} R(\theta, \delta)\) if it exists.
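The identity \(R(\theta, \delta)=b^{2}+\theta / n\) is the bias–variance split: \(E[b+Y / n]=b+\theta\) gives squared bias \(b^{2}\), and \(\operatorname{Var}(Y / n)=n \theta / n^{2}=\theta / n\). The sketch below compares empirical risks with the formula (the values \(\theta=1.5\), \(n=10\) are arbitrary, and the Poisson sampler is a simple stdlib implementation, not part of the exercise):

```python
import math
import random
import statistics

random.seed(7)

def poisson(lam):
    """Poisson sampler via Knuth's product method (fine for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def risk(theta, n, b, reps=100_000):
    """Monte Carlo estimate of R(theta, delta) = E[(theta - (b + Y/n))^2],
    where Y = sum of n iid Poisson(theta) variables ~ Poisson(n*theta)."""
    return statistics.fmean(
        (theta - (b + poisson(n * theta) / n)) ** 2 for _ in range(reps)
    )

theta, n = 1.5, 10
for b in (0.0, 0.2, 0.5):
    # Empirical risk vs. the formula b^2 + theta/n; b = 0 is uniformly best.
    print(b, risk(theta, n, b), b ** 2 + theta / n)
```

Since \(R(\theta, \delta)=\theta / n\) at \(b=0\) grows without bound as \(\theta \rightarrow \infty\), the maximum over \(0<\theta<\infty\) does not exist, matching the last part of the exercise.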
Let \(X\) and \(Y\) be random variables such that \(E\left(X^{k}\right)\) and \(E\left(Y^{k}\right) \neq 0\) exist for \(k=1,2,3, \ldots\). If the ratio \(X / Y\) and its denominator \(Y\) are independent, prove that \(E\left[(X / Y)^{k}\right]=E\left(X^{k}\right) / E\left(Y^{k}\right)\), \(k=1,2,3, \ldots\). Hint: Write \(E\left(X^{k}\right)=E\left[Y^{k}(X / Y)^{k}\right]\).
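A quick way to see the hypothesis in action: build \(X=Z Y\) from independent \(Z\) and \(Y\), so that \(X / Y=Z\) is independent of \(Y\) by construction. The sketch below (with arbitrary uniform choices for \(Z\) and \(Y\)) compares the two sides of the identity by simulation:

```python
import random
import statistics

random.seed(42)

# Z ~ U(1, 2) and Y ~ U(1, 3) independent; X = Z*Y, so X/Y = Z.
reps = 200_000
zs = [random.uniform(1.0, 2.0) for _ in range(reps)]
ys = [random.uniform(1.0, 3.0) for _ in range(reps)]
xs = [z * y for z, y in zip(zs, ys)]

for k in (1, 2, 3):
    lhs = statistics.fmean(z ** k for z in zs)   # E[(X/Y)^k] = E[Z^k]
    rhs = (statistics.fmean(x ** k for x in xs)
           / statistics.fmean(y ** k for y in ys))  # E[X^k] / E[Y^k]
    # The two columns agree up to Monte Carlo error.
    print(k, lhs, rhs)
```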
Let \(X_{1}, \ldots, X_{n}\) be a random sample from a distribution of the continuous type with cdf \(F(x)\). Let \(\theta=P\left(X_{1} \leq a\right)=F(a)\), where \(a\) is known. Show that the proportion \(n^{-1} \#\left\{X_{i} \leq a\right\}\) is the MVUE of \(\theta\).
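Unbiasedness of the proportion is immediate, since each indicator \(I\left(X_{i} \leq a\right)\) has mean \(F(a)\); a simulation sketch (assuming, arbitrarily, a standard exponential sample with \(a=1\), so \(\theta=1-e^{-1}\)) confirms it:

```python
import math
import random
import statistics

random.seed(2024)

def proportion_at_most(sample, a):
    """The estimator n^{-1} #{X_i <= a}: the sample proportion at or below a."""
    return sum(x <= a for x in sample) / len(sample)

a, n, reps = 1.0, 8, 200_000
theta = 1 - math.exp(-a)  # F(a) for the Exp(1) cdf F(x) = 1 - e^{-x}
mean_hat = statistics.fmean(
    proportion_at_most([random.expovariate(1.0) for _ in range(n)], a)
    for _ in range(reps)
)
print(mean_hat)  # close to theta = 1 - e^{-1} ~ 0.632
```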