Chapter 11: Problem 9
Let \(Y_{4}\) be the largest order statistic of a sample of size \(n=4\) from a
distribution with uniform pdf \(f(x ; \theta)=1 / \theta,\ 0<x<\theta\), zero elsewhere.
The following amounts are bets on horses \(A, B, C, D\), and \(E\) to win. $$ \begin{array}{cr} \text { Horse } & \text { Amount } \\ \hline A & \$ 600,000 \\ B & \$ 200,000 \\ C & \$ 100,000 \\ D & \$ 75,000 \\ E & \$ 25,000 \\ \hline \text { Total } & \$ 1,000,000 \end{array} $$ Suppose the track wants to take \(20 \%\) off the top, namely, \(\$ 200,000\). Determine the payoff for winning with a \(\$ 2\) bet on each of the five horses. (In this exercise, we do not concern ourselves with "place" and "show.") Hint: Figure out what would be a fair payoff so that the track does not take any money (that is, the track's take is zero), and then compute \(80 \%\) of those payoffs.
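Following the hint: with a zero take, a winning \$2 ticket on a horse that drew \(b\) dollars of bets would pay \(2 \times 1{,}000{,}000 / b\); the track's \(20\%\) take scales every payoff by \(0.80\). A short Python check of this arithmetic (the dictionary simply transcribes the table above):

```python
# Payoff on a $2 winning ticket: fair payoff is 2 * pool / (amount bet on
# that horse); the track's 20% take scales each payoff by 0.80.
pool = 1_000_000
take = 0.20
bets = {"A": 600_000, "B": 200_000, "C": 100_000, "D": 75_000, "E": 25_000}

payoffs = {h: round(2 * (1 - take) * pool / b, 2) for h, b in bets.items()}
print(payoffs)  # A: 2.67, B: 8.0, C: 16.0, D: 21.33, E: 64.0
```

Note that the favorite \(A\) pays barely more than the \$2 stake, while the long shot \(E\) pays \$64.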
Let \(X_{1}, X_{2}\) be a random sample from a Cauchy distribution with pdf
$$
f\left(x ; \theta_{1}, \theta_{2}\right)=\left(\frac{1}{\pi}\right)
\frac{\theta_{2}}{\theta_{2}^{2}+\left(x-\theta_{1}\right)^{2}},
\quad-\infty<x<\infty,
$$
where \(-\infty<\theta_{1}<\infty\) and \(\theta_{2}>0\).
Consider the hierarchical Bayes model $$ \begin{aligned} Y & \sim b(n, p), \quad 0<p<1 \\ p \mid \theta & \sim h(p \mid \theta)=\theta p^{\theta-1}, \quad \theta>0 \\ \theta & \sim \Gamma(1, a), \quad a>0 \text { is specified. } \end{aligned} $$ (a) Assuming squared-error loss, write the Bayes estimate of \(p\) as in expression (11.5.3). Integrate relative to \(\theta\) first. Show that both the numerator and denominator are expectations of a beta distribution with parameters \(y+1\) and \(n-y+1\). (b) Recall the discussion around expression (11.4.2). Write an explicit Monte Carlo algorithm to obtain the Bayes estimate in part (a).
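Part (b) asks for an explicit Monte Carlo algorithm. A minimal Python sketch, under two assumptions not fully visible in the fragment above: the conditional prior is \(h(p \mid \theta)=\theta p^{\theta-1}\) on \(0<p<1\), and \(\Gamma(1, a)\) uses the book's scale parameterization (mean \(a\), exponential rate \(1/a\)). Integrating \(\theta\) out as in part (a) then gives, with \(P \sim \operatorname{beta}(y+1, n-y+1)\), the ratio \(\delta(y)=E\left[(1/a-\ln P)^{-2}\right] / E\left[P^{-1}(1/a-\ln P)^{-2}\right]\), which the sample means below approximate:

```python
# Monte Carlo sketch for part (b). Assumptions (not stated verbatim in the
# fragment above): p | theta has pdf theta * p^(theta-1) on 0 < p < 1, and
# Gamma(1, a) is scale-parameterized, so theta has exponential rate 1/a.
# With P ~ beta(y+1, n-y+1), the estimate is
#   delta(y) = E[(1/a - ln P)^(-2)] / E[P^(-1) (1/a - ln P)^(-2)].
import math
import random

def bayes_estimate_mc(n, y, a, m=100_000, seed=0):
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(m):
        p = rng.betavariate(y + 1, n - y + 1)  # P_i ~ beta(y+1, n-y+1)
        w = (1.0 / a - math.log(p)) ** -2      # weight (1/a - ln P_i)^(-2)
        num += w       # averages to the numerator expectation
        den += w / p   # averages to the denominator expectation
    return num / den   # the 1/m factors cancel in the ratio

print(bayes_estimate_mc(n=20, y=12, a=2.0))
```

The values of `n`, `y`, `a`, the sample size `m`, and the seed are illustrative choices, not from the text. Because each denominator term is the matching numerator term divided by \(p<1\), the estimate always falls in \((0,1)\), as an estimate of \(p\) must.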
Suppose \(Y\) has a \(\Gamma(1,1)\) distribution while \(X\) given \(Y\) has the
conditional pdf
$$
f(x \mid y)=\left\{\begin{array}{ll}
e^{-(x-y)} & 0<y<x<\infty \\
0 & \text{elsewhere.}
\end{array}\right.
$$
Let \(X_{1}, X_{2}, \ldots, X_{n}\) denote a random sample from a Poisson distribution with mean \(\theta, 0<\theta<\infty\). Let \(Y=\sum_{1}^{n} X_{i} .\) Use the loss function \(\mathcal{L}[\theta, \delta(y)]=\) \([\theta-\delta(y)]^{2} .\) Let \(\theta\) be an observed value of the random variable \(\Theta .\) If \(\Theta\) has the prior \(\operatorname{pdf} h(\theta)=\theta^{\alpha-1} e^{-\theta / \beta} / \Gamma(\alpha) \beta^{\alpha}\), for \(0<\theta<\infty\), zero elsewhere, where \(\alpha>0, \beta>0\) are known numbers, find the Bayes solution \(\delta(y)\) for a point estimate for \(\theta\).
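A sketch of the standard conjugate argument for this exercise, using only the quantities defined above: since \(Y=\sum_{1}^{n} X_{i} \sim\) Poisson\((n\theta)\), the posterior satisfies
$$
h(\theta \mid y) \propto \theta^{y} e^{-n \theta} \cdot \theta^{\alpha-1} e^{-\theta / \beta}
= \theta^{y+\alpha-1} e^{-\theta(n+1 / \beta)},
$$
which is a \(\Gamma\bigl(y+\alpha,\ \beta /(n \beta+1)\bigr)\) density. Under squared-error loss the Bayes solution is the posterior mean,
$$
\delta(y)=\frac{y+\alpha}{n+1 / \beta}=\frac{(y+\alpha) \beta}{n \beta+1},
$$
a weighted compromise between the sample estimate \(y/n\) and the prior mean \(\alpha\beta\).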