Chapter 7: Problem 2
Prove that the sum of the observations of a random sample of size \(n\) from a Poisson distribution having parameter \(\theta, 0<\theta<\infty\), is a sufficient statistic for \(\theta\).
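A sketch of the standard argument, using the Neyman factorization theorem: for nonnegative integers \(x_{1}, \ldots, x_{n}\), the joint pmf of the sample factors as
$$
\prod_{i=1}^{n} \frac{\theta^{x_{i}} e^{-\theta}}{x_{i} !}
= \underbrace{\theta^{y_{1}} e^{-n \theta}}_{k_{1}\left(y_{1} ; \theta\right)} \cdot \underbrace{\frac{1}{\prod_{i=1}^{n} x_{i} !}}_{k_{2}\left(x_{1}, \ldots, x_{n}\right)},
\qquad y_{1}=\sum_{i=1}^{n} x_{i}.
$$
The first factor depends on the observations only through \(y_{1}\) and the second is free of \(\theta\), so \(Y_{1}=\sum_{i=1}^{n} X_{i}\) is a sufficient statistic for \(\theta\).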
Related exercises from this chapter:

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a distribution with pdf \(f(x ; \theta)=\theta^{2} x e^{-\theta x}\), \(0<x<\infty\), zero elsewhere, where \(0<\theta<\infty\).
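If, as in the main problem, the task is to show that the sum is sufficient (an assumption here; this pdf is of \(\Gamma(2, 1/\theta)\) type), the same factorization argument applies:
$$
\prod_{i=1}^{n} \theta^{2} x_{i} e^{-\theta x_{i}}
= \underbrace{\theta^{2 n} e^{-\theta y_{1}}}_{k_{1}\left(y_{1} ; \theta\right)} \cdot \underbrace{\prod_{i=1}^{n} x_{i}}_{k_{2}\left(x_{1}, \ldots, x_{n}\right)},
\qquad y_{1}=\sum_{i=1}^{n} x_{i},
$$
so \(\sum_{1}^{n} X_{i}\) is sufficient for \(\theta\).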
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a distribution that is \(N(\theta, 1)\), \(-\infty<\theta<\infty\). Find the MVUE of \(\theta^{2}\). Hint: First determine \(E\left(\bar{X}^{2}\right)\).
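Following the hint, a sketch: \(\bar{X} \sim N(\theta, 1/n)\), so
$$
E\left(\bar{X}^{2}\right)=\operatorname{Var}(\bar{X})+\left[E(\bar{X})\right]^{2}=\frac{1}{n}+\theta^{2},
$$
and hence \(E\left(\bar{X}^{2}-1/n\right)=\theta^{2}\). Since \(\bar{X}\) is a complete sufficient statistic for \(\theta\), the unbiased statistic \(\bar{X}^{2}-1/n\) is the MVUE of \(\theta^{2}\).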
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample with the common pdf \(f(x)=\theta^{-1} e^{-x / \theta}\), for \(x>0\), zero elsewhere; that is, \(f(x)\) is a \(\Gamma(1, \theta)\) pdf.
(a) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\theta\).
(b) Determine the MVUE of \(\theta\).
(c) Determine the mle of \(\theta\).
(d) Often, though, this pdf is written as \(f(x)=\tau e^{-\tau x}\), for \(x>0\), zero elsewhere, so that \(\tau=1 / \theta\). Use Theorem 6.1.2 to determine the mle of \(\tau\).
(e) Show that \(\bar{X}\) is a complete and sufficient statistic for \(\tau\), and that \((n-1) /(n \bar{X})\) is the MVUE of \(\tau=1 / \theta\) (a sketch of the key computation appears below). Hence, as usual, the reciprocal of the mle of \(\theta\) is the mle of \(1 / \theta\); in this situation, however, the reciprocal of the MVUE of \(\theta\) is not the MVUE of \(1 / \theta\).
(f) Compute the variances of each of the unbiased estimators in parts (b) and (e).
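A sketch of the key computation in part (e): \(Y=\sum_{1}^{n} X_{i}\) has a \(\Gamma(n, \theta)\) distribution, so
$$
E\left(\frac{1}{Y}\right)=\int_{0}^{\infty} \frac{1}{y} \cdot \frac{y^{n-1} e^{-y / \theta}}{\Gamma(n) \theta^{n}} \, d y=\frac{\Gamma(n-1) \theta^{n-1}}{\Gamma(n) \theta^{n}}=\frac{1}{(n-1) \theta},
$$
whence \(E[(n-1) /(n \bar{X})]=(n-1) E(1 / Y)=1 / \theta=\tau\). By contrast, the reciprocal of the MVUE \(\bar{X}\) of \(\theta\) satisfies \(E(1 / \bar{X})=n /[(n-1) \theta] \neq \tau\), which is the point of part (e).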
Let \(X_{1}, X_{2}, \ldots, X_{n}\) represent a random sample from the discrete distribution having the pmf
$$
f(x ; \theta)=
\begin{cases}
\theta^{x}(1-\theta)^{1-x}, & x=0,1; \ 0<\theta<1, \\
0, & \text{elsewhere.}
\end{cases}
$$
Show that \(Y_{1}=\sum_{1}^{n} X_{i}\) is a complete sufficient statistic for \(\theta\). Find the unique function of \(Y_{1}\) that is the MVUE of \(\theta\). Hint: Display \(E\left[u\left(Y_{1}\right)\right]=0\), show that the constant term \(u(0)\) is equal to zero, divide both members of the equation by \(\theta \neq 0\), and repeat the argument.
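A sketch of the final step: \(Y_{1}\) is \(b(n, \theta)\), so \(E\left(Y_{1} / n\right)=\theta\), and since \(Y_{1}\) is complete and sufficient, \(Y_{1} / n\) is the MVUE of \(\theta\). One way to finish the completeness argument: dividing
$$
E\left[u\left(Y_{1}\right)\right]=\sum_{y=0}^{n} u(y)\binom{n}{y} \theta^{y}(1-\theta)^{n-y}=0
$$
by \((1-\theta)^{n}\) gives a polynomial in \(\theta /(1-\theta)\) that vanishes for all \(0<\theta<1\), so every coefficient \(u(y)\binom{n}{y}\), and hence every \(u(y)\), must be zero.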
If \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from a distribution that has a pdf which is a regular case of the exponential class, show that the pdf of \(Y_{1}=\sum_{1}^{n} K\left(X_{i}\right)\) is of the form \(f_{Y_{1}}\left(y_{1} ; \theta\right)=R\left(y_{1}\right) \exp \left[p(\theta) y_{1}+n q(\theta)\right]\). Hint: Let \(Y_{2}=X_{2}, \ldots, Y_{n}=X_{n}\) be \(n-1\) auxiliary random variables. Find the joint pdf of \(Y_{1}, Y_{2}, \ldots, Y_{n}\) and then the marginal pdf of \(Y_{1}\).
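A sketch of the first step, writing the regular exponential-class pdf as \(f(x ; \theta)=\exp [p(\theta) K(x)+S(x)+q(\theta)]\): the joint pdf of the sample is
$$
\prod_{i=1}^{n} f\left(x_{i} ; \theta\right)=\exp \left[p(\theta) \sum_{1}^{n} K\left(x_{i}\right)+\sum_{1}^{n} S\left(x_{i}\right)+n q(\theta)\right],
$$
which already contains the factor \(\exp \left[p(\theta) y_{1}+n q(\theta)\right]\); after the change of variables in the hint, integrating out \(y_{2}, \ldots, y_{n}\) collects the remaining \(\theta\)-free factor (together with the Jacobian) into \(R\left(y_{1}\right)\).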