Chapter 7: Problem 9
Let \(X_{1}, \ldots, X_{n}\) be iid with pdf \(f(x ; \theta)=1 /(3 \theta)\), \(-\theta<x<2 \theta\), zero elsewhere, where \(\theta>0\).
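As a sketch of where this problem leads: every observation must lie in \((-\theta, 2 \theta)\), so with order statistics \(x_{(1)} \leq \cdots \leq x_{(n)}\) the likelihood is
\[
L(\theta)=(3 \theta)^{-n}, \qquad \theta>\max \left(-x_{(1)},\ x_{(n)} / 2\right),
\]
and zero otherwise. Since \((3 \theta)^{-n}\) is decreasing in \(\theta\), the likelihood is largest at the smallest admissible value, suggesting \(\hat{\theta}=\max \left(-X_{(1)},\ X_{(n)} / 2\right)\) as the mle.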
We consider a random sample \(X_{1}, X_{2}, \ldots, X_{n}\) from a distribution with pdf \(f(x ; \theta)=(1 / \theta) \exp (-x / \theta)\), \(0<x<\infty\), zero elsewhere, where \(\theta>0\).
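For this density the factorization theorem applies directly; as a sketch, the joint pdf of the sample factors through \(\sum_{1}^{n} x_{i}\):
\[
\prod_{i=1}^{n} \frac{1}{\theta} e^{-x_{i} / \theta}
=\underbrace{\theta^{-n} \exp \left(-\frac{1}{\theta} \sum_{i=1}^{n} x_{i}\right)}_{k_{1}\left(\sum x_{i} ; \theta\right)} \cdot \underbrace{1}_{k_{2}\left(x_{1}, \ldots, x_{n}\right)},
\qquad 0<x_{i}<\infty,
\]
so \(\sum_{1}^{n} X_{i}\) (equivalently \(\bar{X}\)) is a sufficient statistic for \(\theta\).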
Show that the sum of the observations of a random sample of size \(n\) from a gamma distribution that has pdf \(f(x ; \theta)=(1 / \theta) e^{-x / \theta}\), \(0<x<\infty\), zero elsewhere, is a sufficient statistic for \(\theta\).
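One standard route, sketched: by moment generating functions, \(Y=\sum_{1}^{n} X_{i}\) has \(M_{Y}(t)=(1-\theta t)^{-n}\), i.e., a gamma distribution with pdf \(f_{Y}(y)=y^{n-1} e^{-y / \theta} /\left[\Gamma(n) \theta^{n}\right]\). The ratio of the joint pdf of the sample to the pdf of \(Y\) evaluated at \(\sum x_{i}\),
\[
\frac{\prod_{i=1}^{n}(1 / \theta) e^{-x_{i} / \theta}}{f_{Y}\left(\sum x_{i}\right)}
=\frac{\theta^{-n} e^{-\sum x_{i} / \theta}\, \Gamma(n)\, \theta^{n}}{\left(\sum x_{i}\right)^{n-1} e^{-\sum x_{i} / \theta}}
=\frac{\Gamma(n)}{\left(\sum_{1}^{n} x_{i}\right)^{n-1}},
\]
does not depend on \(\theta\), which is the defining property of a sufficient statistic.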
Let \(\bar{X}\) denote the mean of the random sample \(X_{1}, X_{2}, \ldots, X_{n}\) from a gamma-type distribution with parameters \(\alpha>0\) and \(\beta=\theta>0\). Compute \(E\left[X_{1} \mid \bar{x}\right]\). Hint: Can you find directly a function \(\psi(\bar{X})\) of \(\bar{X}\) such that \(E[\psi(\bar{X})]=\theta\)? Is \(E\left(X_{1} \mid \bar{x}\right)=\psi(\bar{x})\)? Why?
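A sketch of the symmetry step behind the first question: since the \(X_{i}\) are iid, they are exchangeable given \(\bar{x}\), so the conditional expectations \(E\left[X_{i} \mid \bar{x}\right]\) are all equal and sum to \(n \bar{x}\):
\[
n E\left[X_{1} \mid \bar{x}\right]=\sum_{i=1}^{n} E\left[X_{i} \mid \bar{x}\right]=E\left[\sum_{i=1}^{n} X_{i} \,\Big|\, \bar{x}\right]=n \bar{x},
\quad \text { so } \quad E\left[X_{1} \mid \bar{x}\right]=\bar{x} .
\]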
Let \(X_{1}, X_{2}, \ldots, X_{n}, n>2\), be a random sample from the binomial distribution \(b(1, \theta)\). (a) Show that \(Y_{1}=X_{1}+X_{2}+\cdots+X_{n}\) is a complete sufficient statistic for \(\theta\). (b) Find the function \(\varphi\left(Y_{1}\right)\) which is the MVUE of \(\theta\). (c) Let \(Y_{2}=\left(X_{1}+X_{2}\right) / 2\) and compute \(E\left(Y_{2}\right)\). (d) Determine \(E\left(Y_{2} \mid Y_{1}=y_{1}\right)\).
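A sketch of how the computations in (b)–(d) come out, assuming the usual Lehmann–Scheffé route (an unbiased function of the complete sufficient statistic \(Y_{1} \sim b(n, \theta)\) is the MVUE):
\[
E\left[\frac{Y_{1}}{n}\right]=\frac{n \theta}{n}=\theta \ \Rightarrow\ \varphi\left(y_{1}\right)=\frac{y_{1}}{n}, \qquad
E\left(Y_{2}\right)=\frac{E X_{1}+E X_{2}}{2}=\theta, \qquad
E\left(Y_{2} \mid y_{1}\right)=\frac{E\left(X_{1} \mid y_{1}\right)+E\left(X_{2} \mid y_{1}\right)}{2}=\frac{y_{1}}{n},
\]
the last step using the same exchangeability argument as above, \(E\left(X_{i} \mid y_{1}\right)=y_{1} / n\).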
Let \(\left(X_{1}, Y_{1}\right),\left(X_{2}, Y_{2}\right), \ldots,\left(X_{n}, Y_{n}\right)\) denote a random sample of size \(n\) from a bivariate normal distribution with means \(\mu_{1}\) and \(\mu_{2}\), positive variances \(\sigma_{1}^{2}\) and \(\sigma_{2}^{2}\), and correlation coefficient \(\rho\). Show that \(\sum_{1}^{n} X_{i}, \sum_{1}^{n} Y_{i}, \sum_{1}^{n} X_{i}^{2}, \sum_{1}^{n} Y_{i}^{2}\), and \(\sum_{1}^{n} X_{i} Y_{i}\) are joint complete sufficient statistics for the five parameters. Are \(\bar{X}=\sum_{1}^{n} X_{i} / n\), \(\bar{Y}=\sum_{1}^{n} Y_{i} / n\), \(S_{1}^{2}=\sum_{1}^{n}\left(X_{i}-\bar{X}\right)^{2} /(n-1)\), \(S_{2}^{2}=\sum_{1}^{n}\left(Y_{i}-\bar{Y}\right)^{2} /(n-1)\), and \(\sum_{1}^{n}\left(X_{i}-\bar{X}\right)\left(Y_{i}-\bar{Y}\right) /\left[(n-1) S_{1} S_{2}\right]\) also joint complete sufficient statistics for these parameters?
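A sketch of the usual exponential-family argument (with \(\eta_{1}, \ldots, \eta_{5}\) and \(c(\boldsymbol{\eta})\) denoting functions of the five parameters): expanding the exponent of the bivariate normal pdf gives
\[
\ln f(x, y)=c(\boldsymbol{\eta})+\eta_{1} x+\eta_{2} y+\eta_{3} x^{2}+\eta_{4} y^{2}+\eta_{5} x y,
\]
so the joint pdf of the sample depends on the data only through the five sums, which are therefore joint complete sufficient statistics. For the second question, \(\left(\bar{X}, \bar{Y}, S_{1}^{2}, S_{2}^{2}, r\right)\) is a one-to-one function of those five sums, and a one-to-one function of joint complete sufficient statistics is itself joint complete sufficient.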