Chapter 7: Problem 11
Show that \(Y=|X|\) is a complete sufficient statistic for \(\theta>0\), where \(X\) has the pdf \(f_{X}(x ; \theta)=1 /(2 \theta)\), for \(-\theta<x<\theta\), zero elsewhere.
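A sketch of the standard argument, assuming the uniform support \(-\theta<x<\theta\): for \(0<y<\theta\),
$$
P(Y \leq y)=P(-y \leq X \leq y)=\frac{y}{\theta}, \quad \text { so } \quad f_{Y}(y ; \theta)=\frac{1}{\theta}, \quad 0<y<\theta .
$$
Sufficiency follows from the factorization \(f_{X}(x ; \theta)=\frac{1}{\theta} I(|x|<\theta) \cdot \frac{1}{2}=g(|x| ; \theta) h(x)\). For completeness, if \(E_{\theta}[u(Y)]=\frac{1}{\theta} \int_{0}^{\theta} u(y)\, d y=0\) for all \(\theta>0\), then \(\int_{0}^{\theta} u(y)\, d y=0\) for every \(\theta\); differentiating with respect to \(\theta\) gives \(u(\theta)=0\) almost everywhere.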
Prove that the sum of the observations of a random sample of size \(n\) from a Poisson distribution having parameter \(\theta, 0<\theta<\infty\), is a sufficient statistic for \(\theta\).
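One way to see this is via the Neyman factorization theorem: writing \(y=\sum_{1}^{n} x_{i}\), the joint pmf factors as
$$
\prod_{i=1}^{n} \frac{\theta^{x_{i}} e^{-\theta}}{x_{i} !}=\theta^{y} e^{-n \theta} \cdot \frac{1}{\prod_{i=1}^{n} x_{i} !}=g(y ; \theta) h\left(x_{1}, \ldots, x_{n}\right),
$$
so \(\sum_{1}^{n} X_{i}\) is sufficient for \(\theta\).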
In a personal communication, LeRoy Folks noted that the inverse Gaussian pdf
$$
f\left(x ; \theta_{1}, \theta_{2}\right)=\left(\frac{\theta_{2}}{2 \pi x^{3}}\right)^{1 / 2} \exp \left[\frac{-\theta_{2}\left(x-\theta_{1}\right)^{2}}{2 \theta_{1}^{2} x}\right], \quad 0<x<\infty
$$
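A manipulation that is often useful with this density: expanding the exponent shows that the family is a two-parameter regular exponential family,
$$
\frac{-\theta_{2}\left(x-\theta_{1}\right)^{2}}{2 \theta_{1}^{2} x}=-\frac{\theta_{2}}{2 \theta_{1}^{2}} x-\frac{\theta_{2}}{2} \cdot \frac{1}{x}+\frac{\theta_{2}}{\theta_{1}},
$$
so for a random sample \(X_{1}, \ldots, X_{n}\) the pair \(\left(\sum_{1}^{n} X_{i}, \sum_{1}^{n} 1 / X_{i}\right)\) is a joint complete sufficient statistic for \(\left(\theta_{1}, \theta_{2}\right)\).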
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be iid with the distribution \(N\left(\theta, \sigma^{2}\right),-\infty<\theta<\infty\). Prove that a necessary and sufficient condition that the statistics \(Z=\sum_{1}^{n} a_{i} X_{i}\) and \(Y=\sum_{1}^{n} X_{i}\), a complete sufficient statistic for \(\theta\), are independent is that \(\sum_{1}^{n} a_{i}=0\).
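A sketch of the key computation: since \((Z, Y)\) is bivariate normal, independence is equivalent to zero covariance, and
$$
\operatorname{Cov}(Z, Y)=\operatorname{Cov}\left(\sum_{i=1}^{n} a_{i} X_{i}, \sum_{j=1}^{n} X_{j}\right)=\sigma^{2} \sum_{i=1}^{n} a_{i},
$$
which vanishes if and only if \(\sum_{1}^{n} a_{i}=0\).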
Let \(Y_{1}\) and \(Y_{2}\) be two independent unbiased estimators of \(\theta\). Assume that the variance of \(Y_{1}\) is twice the variance of \(Y_{2}\). Find the constants \(k_{1}\) and \(k_{2}\) so that \(k_{1} Y_{1}+k_{2} Y_{2}\) is an unbiased estimator with the smallest possible variance for such a linear combination.
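A sketch of the computation: unbiasedness forces \(k_{1}+k_{2}=1\), and writing \(\operatorname{Var}\left(Y_{2}\right)=\sigma^{2}\), \(\operatorname{Var}\left(Y_{1}\right)=2 \sigma^{2}\), independence gives
$$
\operatorname{Var}\left(k_{1} Y_{1}+k_{2} Y_{2}\right)=2 k_{1}^{2} \sigma^{2}+\left(1-k_{1}\right)^{2} \sigma^{2} .
$$
Setting the derivative \(4 k_{1} \sigma^{2}-2\left(1-k_{1}\right) \sigma^{2}\) equal to zero gives \(k_{1}=1 / 3\) and \(k_{2}=2 / 3\).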