Chapter 7: Problem 7
Let \(X\) have the pdf \(f_{X}(x ; \theta)=1 /(2 \theta)\), for \(-\theta<x<\theta\), zero elsewhere, where \(\theta>0\).
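The density above is that of a uniform distribution on \((-\theta, \theta)\). As a quick numerical sanity check (an illustrative sketch, not part of the original exercise, with \(\theta = 2\) chosen arbitrarily), the following snippet verifies by midpoint-rule integration that the density integrates to 1 and that \(E(X)=0\) and \(\operatorname{Var}(X)=\theta^{2}/3\):

```python
# Sanity check for the pdf f(x; theta) = 1/(2*theta) on (-theta, theta).
# theta = 2.0 is an arbitrary illustrative choice.

theta = 2.0
n = 200_000                                  # midpoint-rule subintervals

dx = 2 * theta / n
xs = [-theta + (i + 0.5) * dx for i in range(n)]  # midpoint of each subinterval
pdf = 1 / (2 * theta)                        # constant density

total = sum(pdf * dx for _ in xs)            # integral of f over (-theta, theta)
mean = sum(x * pdf * dx for x in xs)         # E(X)
var = sum(x * x * pdf * dx for x in xs)      # E(X^2) = Var(X), since E(X) = 0

print(total, mean, var, theta**2 / 3)
```

The symmetry of the midpoints makes the computed mean vanish, and the second moment agrees with \(\theta^{2}/3\) up to the discretization error.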
Related exercises
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from the uniform distribution with pdf \(f\left(x ; \theta_{1}, \theta_{2}\right)=1 /\left(2 \theta_{2}\right)\), for \(\theta_{1}-\theta_{2}<x<\theta_{1}+\theta_{2}\), zero elsewhere.
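For this family the joint density factors as \((2\theta_{2})^{-n}\) times an indicator that the smallest and largest observations lie in \((\theta_{1}-\theta_{2},\, \theta_{1}+\theta_{2})\), so the extreme order statistics \(Y_{1}=\min X_{i}\) and \(Y_{n}=\max X_{i}\) are natural joint sufficient statistics; the corresponding estimates of center and half-width are the midrange \((Y_{1}+Y_{n})/2\) and half-range \((Y_{n}-Y_{1})/2\). A minimal sketch (the sample values are hypothetical, nominally drawn with \(\theta_{1}=5\), \(\theta_{2}=2\)):

```python
# The extreme order statistics summarize a sample from the uniform
# distribution on (theta1 - theta2, theta1 + theta2).

def uniform_center_halfwidth(sample):
    """Return (theta1_hat, theta2_hat) = (midrange, half-range) of the sample."""
    y1, yn = min(sample), max(sample)       # Y_1 and Y_n, the extreme order stats
    return (y1 + yn) / 2, (yn - y1) / 2

# Illustrative (hypothetical) data:
data = [4.1, 6.8, 3.3, 5.9, 6.5, 4.7]
print(uniform_center_halfwidth(data))       # midrange and half-range
```

For these data \(Y_{1}=3.3\) and \(Y_{n}=6.8\), giving center \(5.05\) and half-width \(1.75\).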
Let \(X\) and \(Y\) be random variables such that \(E\left(X^{k}\right)\) and \(E\left(Y^{k}\right) \neq 0\) exist for \(k=1,2,3, \ldots\) If the ratio \(X / Y\) and its denominator \(Y\) are independent, prove that \(E\left[(X / Y)^{k}\right]=E\left(X^{k}\right) / E\left(Y^{k}\right), k=1,2,3, \ldots\) Hint: \(\quad\) Write \(E\left(X^{k}\right)=E\left[Y^{k}(X / Y)^{k}\right]\).
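Following the hint, \(E\left(X^{k}\right)=E\left[Y^{k}(X / Y)^{k}\right]\), and independence of \(X/Y\) and \(Y\) factors this into \(E\left(Y^{k}\right) E\left[(X / Y)^{k}\right]\); dividing by \(E\left(Y^{k}\right) \neq 0\) gives the claim. The identity can be checked exactly on a small discrete construction (illustrative, not from the text) in which \(X = YT\) with \(T\) independent of \(Y\), so that \(X/Y = T\) by design:

```python
from itertools import product
from fractions import Fraction

# Illustrative check: Y and T = X/Y independent, each uniform on a 2-point set.
Y_vals = [1, 2]        # Y uniform on {1, 2}
T_vals = [1, 3]        # T = X/Y uniform on {1, 3}, independent of Y

for k in range(1, 5):
    # E(X^k) over the joint distribution, with X = Y*T and P(y, t) = 1/4
    e_xk = sum(Fraction(1, 4) * (y * t) ** k for y, t in product(Y_vals, T_vals))
    e_yk = sum(Fraction(1, 2) * y ** k for y in Y_vals)
    e_tk = sum(Fraction(1, 2) * t ** k for t in T_vals)
    assert e_xk == e_yk * e_tk   # E[(X/Y)^k] = E(X^k) / E(Y^k), exactly
print("identity verified for k = 1..4")
```

Exact rational arithmetic makes the factorization hold with equality rather than up to simulation noise.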
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from each of the following distributions involving the parameter \(\theta\). In each case find the mle of \(\theta\) and show that it is a sufficient statistic for \(\theta\) and hence a minimal sufficient statistic. (a) \(b(1, \theta)\), where \(0 \leq \theta \leq 1\). (b) Poisson with mean \(\theta>0\). (c) Gamma with \(\alpha=3\) and \(\beta=\theta>0\). (d) \(N(\theta, 1)\), where \(-\infty<\theta<\infty\). (e) \(N(0, \theta)\), where \(0<\theta<\infty\).
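In each part the mle is a simple function of the data (standard results, sketched here): (a) \(\hat{\theta}=\bar{x}\); (b) \(\hat{\theta}=\bar{x}\); (c) \(\hat{\theta}=\bar{x}/3\), since a Gamma\((3, \theta)\) mean is \(3\theta\); (d) \(\hat{\theta}=\bar{x}\); (e) \(\hat{\theta}=n^{-1}\sum x_{i}^{2}\), with \(\theta\) the variance. Each is a one-to-one function of the sufficient statistic obtained from the factorization theorem. A minimal sketch with hypothetical data:

```python
# MLEs for parts (a)-(e), each a function of the sample via its sufficient statistic.

def mle_bernoulli(xs):    # (a) b(1, theta): theta_hat = sample mean
    return sum(xs) / len(xs)

def mle_poisson(xs):      # (b) Poisson(theta): theta_hat = sample mean
    return sum(xs) / len(xs)

def mle_gamma3(xs):       # (c) Gamma(alpha=3, beta=theta): mean is 3*theta
    return sum(xs) / (3 * len(xs))

def mle_normal_mean(xs):  # (d) N(theta, 1): theta_hat = sample mean
    return sum(xs) / len(xs)

def mle_normal_var(xs):   # (e) N(0, theta), theta = variance: average of squares
    return sum(x * x for x in xs) / len(xs)

# Illustrative (hypothetical) data:
counts = [2, 0, 3, 1]
print(mle_poisson(counts))               # sample mean = 1.5
print(mle_normal_var([1.0, -1.0, 2.0]))  # (1 + 1 + 4)/3 = 2.0
```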
Consider the situation of the last exercise, but suppose we have the following two independent random samples: (1) \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample with the common pdf \(f_{X}(x)=\theta^{-1} e^{-x / \theta}\), for \(x>0\), zero elsewhere, and (2) \(Y_{1}, Y_{2}, \ldots, Y_{n}\) is a random sample with common pdf \(f_{Y}(y)=\theta e^{-\theta y}\), for \(y \geq 0\), zero elsewhere. The last exercise suggests that, for some constant \(c, Z=c \bar{X} / \bar{Y}\) might be an unbiased estimator of \(\theta^{2}\). Find this constant \(c\) and the variance of \(Z\). Hint: Show that \(\bar{X} /\left(\theta^{2} \bar{Y}\right)\) has an \(F\) -distribution.
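Working from the hint: \(2 \sum X_{i} / \theta\) and \(2 \theta \sum Y_{i}\) are each \(\chi^{2}(2 n)\), so \(\bar{X} /\left(\theta^{2} \bar{Y}\right) \sim F(2 n, 2 n)\). Its mean is \(2 n /(2 n-2)=n /(n-1)\), giving \(c=(n-1) / n\), and then \(\operatorname{Var}(Z)=c^{2} \theta^{4} \operatorname{Var}(F)=\theta^{4}(2 n-1) /[n(n-2)]\). The algebra can be cross-checked numerically with the standard \(F(d_1, d_2)\) moment formulas (a sketch; \(n=10\) is an arbitrary choice):

```python
# Cross-check c = (n-1)/n and Var(Z) using the F(d1, d2) moment formulas:
#   E[F]   = d2 / (d2 - 2)
#   Var[F] = 2*d2**2*(d1 + d2 - 2) / (d1*(d2 - 2)**2*(d2 - 4))

def f_mean(d1, d2):
    return d2 / (d2 - 2)

def f_var(d1, d2):
    return 2 * d2**2 * (d1 + d2 - 2) / (d1 * (d2 - 2) ** 2 * (d2 - 4))

n = 10
d1 = d2 = 2 * n                    # Xbar / (theta^2 * Ybar) ~ F(2n, 2n)
c = 1 / f_mean(d1, d2)             # unbiasedness requires c * E[F] = 1
var_z_over_theta4 = c**2 * f_var(d1, d2)

print(c, var_z_over_theta4)        # should match (n-1)/n and (2n-1)/(n(n-2))
```

For \(n=10\) this gives \(c = 9/10\) and \(\operatorname{Var}(Z)/\theta^{4} = 19/80\), matching the closed forms above.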