Chapter 7: Problem 5
Show that the first order statistic \(Y_{1}\) of a random sample of size \(n\) from the distribution having pdf \(f(x ; \theta)=e^{-(x-\theta)}\), \(\theta<x<\infty\), \(-\infty<\theta<\infty\), zero elsewhere, is a sufficient statistic for \(\theta\).
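The source statement is cut off; in the standard version of this exercise the task is to show, via the factorization theorem, that \(Y_{1}=\min \left(X_{1}, \ldots, X_{n}\right)\) is sufficient for \(\theta\). A sketch of the factorization:

```latex
% Joint pdf of the sample. The product of indicators confines every x_i to
% (theta, infty), which is equivalent to requiring y_1 = min(x_1,...,x_n) > theta.
\begin{aligned}
\prod_{i=1}^{n} f\left(x_{i} ; \theta\right)
  &= \exp\!\Big(-\sum_{i=1}^{n}\left(x_{i}-\theta\right)\Big)
     \prod_{i=1}^{n} I_{(\theta, \infty)}\left(x_{i}\right) \\
  &= \underbrace{e^{n \theta}\, I_{(\theta, \infty)}\left(y_{1}\right)}_{k_{1}\left(y_{1} ;\, \theta\right)}
     \;\underbrace{\exp\!\Big(-\sum_{i=1}^{n} x_{i}\Big)}_{k_{2}\left(x_{1}, \ldots, x_{n}\right)} .
\end{aligned}
```

Since the joint pdf factors into a function of \(y_{1}\) and \(\theta\) times a function free of \(\theta\), \(Y_{1}\) is sufficient for \(\theta\).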
The pdf depicted in Figure \(7.9.1\) is given by
$$f_{m_{2}}(x)=e^{x}\left(1+m_{2}^{-1} e^{x}\right)^{-\left(m_{2}+1\right)}, \quad-\infty<x<\infty,$$
where \(m_{2}>0\).
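The rest of this exercise is missing from the source. As a quick sanity check on the density above, the following sketch verifies numerically that it integrates to 1; the value \(m_{2}=2\) and the integration range are illustrative assumptions:

```python
import math

def log_f_pdf(x, m2):
    # f_{m2}(x) = e^x * (1 + e^x / m2)^(-(m2 + 1)), -inf < x < inf, with m2 > 0
    return math.exp(x) * (1 + math.exp(x) / m2) ** (-(m2 + 1))

# Left Riemann sum over (-30, 30); both tails beyond this range are negligible.
dx = 0.01
total = sum(log_f_pdf(-30 + i * dx, 2.0) * dx for i in range(6000))
# total should be close to 1
```

The substitution \(u=e^{x}/m_{2}\) shows the integral is exactly 1 for any \(m_{2}>0\), so the numeric total is only a consistency check.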
Let a random sample of size \(n\) be taken from a distribution that has the pdf \(f(x ; \theta)=(1 / \theta) \exp (-x / \theta) I_{(0, \infty)}(x) .\) Find the mle and the MVUE of \(P(X \leq 2)\).
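For the mle part: the mle of \(\theta\) is \(\bar{x}\), so by the invariance property the mle of \(P(X \leq 2)=1-e^{-2 / \theta}\) is \(1-e^{-2 / \bar{x}}\). A minimal simulation sketch (the true value \(\theta=3\) and the sample size are illustrative assumptions):

```python
import math
import random

random.seed(1)
theta = 3.0   # assumed true value, for illustration only
n = 500
# f(x; theta) = (1/theta) e^{-x/theta}: exponential with mean theta
xs = [random.expovariate(1 / theta) for _ in range(n)]
xbar = sum(xs) / n                   # mle of theta
mle = 1 - math.exp(-2 / xbar)        # mle of P(X <= 2), by invariance
true_p = 1 - math.exp(-2 / theta)
```

With a sample this large, `mle` should land near `true_p`; the MVUE requires Rao-Blackwellizing an unbiased estimator with the complete sufficient statistic \(\sum X_{i}\) and is not sketched here.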
Prove that the sum of the observations of a random sample of size \(n\) from a Poisson distribution having parameter \(\theta, 0<\theta<\infty\), is a sufficient statistic for \(\theta\).
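The claim follows directly from the factorization theorem: the joint pmf splits into a factor that depends on the data only through \(\sum x_{i}\) and a factor free of \(\theta\):

```latex
\prod_{i=1}^{n} \frac{\theta^{x_{i}} e^{-\theta}}{x_{i} !}
  = \underbrace{\theta^{\sum_{i=1}^{n} x_{i}}\, e^{-n \theta}}_{k_{1}\left(\sum x_{i} ;\, \theta\right)}
    \cdot \underbrace{\frac{1}{\prod_{i=1}^{n} x_{i} !}}_{k_{2}\left(x_{1}, \ldots, x_{n}\right)}
```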
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample with the common pdf \(f(x)=\theta^{-1} e^{-x / \theta}\), for \(x>0\), zero elsewhere; that is, \(f(x)\) is a \(\Gamma(1, \theta)\) pdf.
(a) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\theta\).
(b) Determine the MVUE of \(\theta\).
(c) Determine the mle of \(\theta\).
(d) Often, though, this pdf is written as \(f(x)=\tau e^{-\tau x}\), for \(x>0\), zero elsewhere; thus \(\tau=1 / \theta\). Use Theorem \(6.1.2\) to determine the mle of \(\tau\).
(e) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\tau\). Show that \((n-1) /(n \bar{X})\) is the MVUE of \(\tau=1 / \theta\). Hence, as usual, the reciprocal of the mle of \(\theta\) is the mle of \(1 / \theta\); but, in this situation, the reciprocal of the MVUE of \(\theta\) is not the MVUE of \(1 / \theta\).
(f) Compute the variances of each of the unbiased estimators in Parts (b) and (e).
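Part (e)'s claim — that \((n-1) /(n \bar{X})\) is unbiased for \(\tau\) while the naive reciprocal \(1 / \bar{X}\) is not — can be checked by simulation; the values \(n=10\), \(\tau=2\), and the replication count below are illustrative assumptions:

```python
import random

random.seed(0)
n, tau, reps = 10, 2.0, 20000
mvue_sum = naive_sum = 0.0
for _ in range(reps):
    xs = [random.expovariate(tau) for _ in range(n)]  # rate tau, mean theta = 1/tau
    xbar = sum(xs) / n
    mvue_sum += (n - 1) / (n * xbar)   # MVUE of tau: unbiased
    naive_sum += 1 / xbar              # biased: E[1/xbar] = n*tau/(n-1)
mvue_mean = mvue_sum / reps            # Monte Carlo average, approx tau
naive_mean = naive_sum / reps          # approx tau * n/(n-1), i.e. too large
```

The bias of \(1 / \bar{X}\) follows from \(\sum X_{i} \sim \Gamma(n, \tau)\), which gives \(E\left[1 / \sum X_{i}\right]=\tau /(n-1)\); multiplying by \(n-1\) removes it.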
In a personal communication, LeRoy Folks noted that the inverse Gaussian pdf
$$f\left(x ; \theta_{1}, \theta_{2}\right)=\left(\frac{\theta_{2}}{2 \pi x^{3}}\right)^{1 / 2} \exp \left[\frac{-\theta_{2}\left(x-\theta_{1}\right)^{2}}{2 \theta_{1}^{2} x}\right], \quad 0<x<\infty,$$
where \(\theta_{1}, \theta_{2}>0\).
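The rest of this exercise is missing from the source. As a sanity check on the density above, this sketch verifies numerically that it integrates to 1 over \((0, \infty)\); the parameter values \(\theta_{1}=1\), \(\theta_{2}=2\) and the integration range are illustrative assumptions:

```python
import math

def inv_gauss_pdf(x, t1, t2):
    # (t2 / (2 pi x^3))^{1/2} * exp(-t2 (x - t1)^2 / (2 t1^2 x)), for x > 0
    return math.sqrt(t2 / (2 * math.pi * x ** 3)) * math.exp(
        -t2 * (x - t1) ** 2 / (2 * t1 ** 2 * x))

# Left Riemann sum on (0, 40]; the density vanishes at both ends of this range.
dx = 0.001
total = sum(inv_gauss_pdf(i * dx, 1.0, 2.0) * dx for i in range(1, 40001))
# total should be close to 1
```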