Chapter 7: Problem 1
Let \(Y_{1}
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a Poisson distribution with mean \(\theta\). Find the conditional expectation \(E\left(X_{1}+2 X_{2}+3 X_{3} \mid \sum_{1}^{n} X_{i}\right)\)
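A quick numerical check of the answer this exercise points toward: given \(\sum_1^n X_i = t\), a Poisson random sample has the multinomial distribution with \(t\) trials and equal cell probabilities \(1/n\), so \(E(X_i \mid \sum X_i = t) = t/n\) and \(E(X_1 + 2X_2 + 3X_3 \mid \sum X_i = t) = 6t/n\). The sketch below (the values \(n = 4\), \(t = 12\) are illustrative assumptions, not from the text) verifies this by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)

n, t = 4, 12          # illustrative sample size and observed total
trials = 200_000

# Given the total T = t, (X_1, ..., X_n) is Multinomial(t, (1/n, ..., 1/n))
# for a Poisson random sample, so E(X_i | T = t) = t / n for each i.
samples = rng.multinomial(t, [1.0 / n] * n, size=trials)

# Monte Carlo estimate of E(X_1 + 2 X_2 + 3 X_3 | T = t).
stat = samples[:, 0] + 2 * samples[:, 1] + 3 * samples[:, 2]
estimate = stat.mean()

theory = (1 + 2 + 3) * t / n   # = 6t/n = 18 for these values
print(estimate, theory)        # estimate should be close to 18.0
```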
Let \(X_{1}, \ldots, X_{n}\) be a random sample from a distribution of the continuous type with cdf \(F(x)\). Let \(\theta=P\left(X_{1} \leq a\right)=F(a)\), where \(a\) is known. Show that the proportion \(n^{-1}\#\left\{X_{i} \leq a\right\}\) is the MVUE of \(\theta\).
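One possible outline of the unbiasedness step (the completeness of the order statistics for continuous-type families is taken from the text's theory): writing the proportion as an average of indicators,
\[
T = \frac{1}{n}\#\left\{X_{i} \leq a\right\} = \frac{1}{n}\sum_{i=1}^{n} I\left(X_{i} \leq a\right),
\qquad
E(T) = \frac{1}{n}\sum_{i=1}^{n} P\left(X_{i} \leq a\right) = F(a) = \theta .
\]
Since \(T\) is symmetric in \(X_1, \ldots, X_n\), it is a function of the order statistics, which are complete and sufficient here, so the Lehmann–Scheffé theorem makes \(T\) the MVUE of \(\theta\).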
Let \(X_{1}, X_{2}, \ldots, X_{n}, n>2\), be a random sample from the binomial distribution \(b(1, \theta)\). (a) Show that \(Y_{1}=X_{1}+X_{2}+\cdots+X_{n}\) is a complete sufficient statistic for \(\theta\). (b) Find the function \(\varphi\left(Y_{1}\right)\) which is the MVUE of \(\theta\). (c) Let \(Y_{2}=\left(X_{1}+X_{2}\right) / 2\) and compute \(E\left(Y_{2}\right)\). (d) Determine \(E\left(Y_{2} \mid Y_{1}=y_{1}\right)\).
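Parts (b)–(d) can be checked exactly by enumerating all \(2^n\) Bernoulli outcomes: the expected results are \(\varphi(Y_1) = Y_1/n\), \(E(Y_2) = \theta\), and \(E(Y_2 \mid Y_1 = y_1) = y_1/n\). The sketch below uses illustrative assumed values \(n = 5\) and \(\theta = 3/10\), with exact rational arithmetic so the identities hold with no rounding error.

```python
from itertools import product
from fractions import Fraction

n = 5
theta = Fraction(3, 10)   # illustrative parameter value

# Enumerate all 2^n Bernoulli(theta) outcomes with their exact probabilities.
outcomes = list(product((0, 1), repeat=n))
prob = {x: theta ** sum(x) * (1 - theta) ** (n - sum(x)) for x in outcomes}

# (c) E(Y2) where Y2 = (X1 + X2) / 2: unbiased for theta.
E_Y2 = sum(Fraction(x[0] + x[1], 2) * prob[x] for x in outcomes)
assert E_Y2 == theta

# (d) E(Y2 | Y1 = y1) = y1 / n for every attainable y1.
for y1 in range(n + 1):
    p_y1 = sum(prob[x] for x in outcomes if sum(x) == y1)
    E_cond = sum(Fraction(x[0] + x[1], 2) * prob[x]
                 for x in outcomes if sum(x) == y1) / p_y1
    assert E_cond == Fraction(y1, n)

print("E(Y2) =", E_Y2, "and E(Y2 | Y1 = y1) = y1/n for all y1")
```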
Let \(X\) and \(Y\) be random variables such that \(E\left(X^{k}\right)\) and \(E\left(Y^{k}\right) \neq 0\) exist for \(k=1,2,3, \ldots\) If the ratio \(X / Y\) and its denominator \(Y\) are independent, prove that \(E\left[(X / Y)^{k}\right]=E\left(X^{k}\right) / E\left(Y^{k}\right), k=1,2,3, \ldots\) Hint: Write \(E\left(X^{k}\right)=E\left[Y^{k}(X / Y)^{k}\right]\).
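Following the hint, a one-line sketch: since \(X/Y\) and \(Y\) are independent,
\[
E\left(X^{k}\right)=E\left[Y^{k}\left(\frac{X}{Y}\right)^{k}\right]
=E\left(Y^{k}\right)E\left[\left(\frac{X}{Y}\right)^{k}\right],
\]
and dividing both sides by \(E\left(Y^{k}\right) \neq 0\) gives \(E\left[(X / Y)^{k}\right]=E\left(X^{k}\right) / E\left(Y^{k}\right)\). The middle equality is where the independence of \((X/Y)^k\) and \(Y^k\) is used.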
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a Poisson distribution with mean \(\theta>0\). (a) Statistician \(A\) observes the sample to be the values \(x_{1}, x_{2}, \ldots, x_{n}\) with sum \(y=\sum x_{i}\). Find the mle of \(\theta\). (b) Statistician \(B\) loses the sample values \(x_{1}, x_{2}, \ldots, x_{n}\) but remembers the sum \(y_{1}\) and the fact that the sample arose from a Poisson distribution. Thus
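For part (a), the mle is the sample mean \(\hat\theta = y/n\), since the Poisson log-likelihood (up to a constant in \(\theta\)) is \(\ell(\theta) = -n\theta + y\log\theta\), whose derivative vanishes at \(\theta = y/n\). A small numerical sanity check, using an assumed sample (not from the text):

```python
import math

# Illustrative sample values (an assumption, not from the text).
x = [2, 0, 3, 1, 4]
n, y = len(x), sum(x)

def loglik(theta):
    # Poisson log-likelihood up to the constant -sum(log(x_i!)).
    return -n * theta + y * math.log(theta)

mle = y / n   # closed-form mle from part (a)

# The closed form should beat every candidate on a fine grid over (0, 10].
grid = [0.01 * k for k in range(1, 1001)]
best = max(grid, key=loglik)
print(mle, best)   # the best grid point coincides with y/n = 2.0
```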