
Suppose that the number of defects in a 1200-foot roll of magnetic recording tape has a Poisson distribution for which the value of the mean θ is unknown, and the prior distribution of θ is the gamma distribution with parameters \(\alpha = 3\) and \(\beta = 1\). When five rolls of this tape are selected at random and inspected, the numbers of defects found on the rolls are 2, 2, 6, 0, and 3. If the squared error loss function is used, what is the Bayes estimate of θ?

Short Answer

Expert verified

The Bayes estimate of \(\theta \) is the mean of the posterior distribution and is equal to \(\frac{8}{3}\).

Step by step solution

01

Given information

The number of defects in a 1200-foot roll of magnetic recording tape has a Poisson distribution for which the value of the mean \(\theta \) is unknown, and the prior distribution of \(\theta \) is the gamma distribution with parameters \(\alpha = 3\) and \(\beta = 1\).

02

Calculating the Bayes estimate of \(\theta \)

Under the squared error loss function, the Bayes estimate of \(\theta \) is the mean of the posterior distribution of \(\theta \). The task is therefore to find the posterior distribution of \(\theta \) given the observed defect counts and then compute its mean.
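A quick Monte Carlo sketch (not part of the original solution; it assumes `numpy` is available) illustrates why the posterior mean is the Bayes estimate under squared error loss: among a grid of candidate estimates, the one minimizing the average squared loss over posterior draws lands at the posterior mean. The posterior used here is the gamma(16, 6) distribution derived below; note that `numpy` parameterizes the gamma by scale, so the rate 6 becomes `scale=1/6`.

```python
import numpy as np

# Draw from the gamma(16, 6) posterior (numpy uses scale = 1/rate).
rng = np.random.default_rng(0)
theta = rng.gamma(shape=16, scale=1 / 6, size=200_000)

# Expected squared error loss for each candidate estimate a.
candidates = np.linspace(1.5, 4.0, 251)
losses = [np.mean((theta - a) ** 2) for a in candidates]

best = candidates[int(np.argmin(losses))]
print(best)  # close to 8/3 ≈ 2.667, the posterior mean
```

The grid and sample size are arbitrary choices for illustration; the minimizer of the empirical squared loss is always the sample mean of the draws, which converges to the posterior mean.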

Here the number of defects in a 1200-foot roll of magnetic recording tape has a Poisson distribution for which the value of the mean \(\theta \) is unknown, and the prior distribution of \(\theta \) is the gamma distribution with parameters \(\alpha = 3\) and \(\beta = 1\).

When five rolls of this tape are selected at random and inspected, the numbers of defects found on the rolls are 2, 2, 6, 0, and 3.

Recall the following theorem:

If \({X_1},...,{X_n}\) form a random sample from the Poisson distribution with mean \(\theta \), and the prior distribution of \(\theta \) is the gamma distribution with parameters \(\alpha \) and \(\beta \), then the posterior distribution of \(\theta \) given that \({X_i} = {x_i}\left( {i = 1,2,...,n} \right)\) is the gamma distribution with parameters \(\alpha + \sum\limits_{i = 1}^n {{x_i}} \) and \(\beta + n\).
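The conjugate update in this theorem can be sketched in a few lines. This is an illustrative helper (the function name is ours, not from the text), using the gamma parameterization with rate \(\beta \), so the posterior mean is \(\alpha /\beta \):

```python
def gamma_poisson_posterior(alpha, beta, data):
    """Posterior gamma parameters after observing Poisson counts.

    Prior is gamma(alpha, beta) with rate beta; each observation
    adds its count to alpha and adds 1 to beta.
    """
    return alpha + sum(data), beta + len(data)

# Observed defect counts from the five inspected rolls.
defects = [2, 2, 6, 0, 3]
post_alpha, post_beta = gamma_poisson_posterior(3, 1, defects)
print(post_alpha, post_beta)  # 16 6
```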

Therefore, by the above theorem, the posterior distribution of \(\theta \) is the gamma distribution with parameters

\(\begin{array}{c}\alpha + \sum\limits_{i = 1}^n {{x_i}} = 3 + \left( {2 + 2 + 6 + 0 + 3} \right)\\ = 3 + 13\\ = 16\end{array}\)

And

\(\begin{array}{c}\beta + n = 1 + 5\\ = 6\end{array}\)

Therefore, since the mean of the gamma distribution with parameters \(\alpha \) and \(\beta \) is \(\alpha /\beta \), the mean of the posterior distribution is computed as follows:

\(\begin{array}{c}E\left( {\theta |{X_i} = {x_i}\left( {i = 1,...,n} \right)} \right) = \frac{{16}}{6}\\ = \frac{8}{3}\end{array}\)
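As a final arithmetic check, the Bayes estimate can be computed as an exact fraction with Python's standard library:

```python
from fractions import Fraction

# Posterior is gamma(16, 6); the Bayes estimate under squared error
# loss is its mean, alpha / beta = 16 / 6.
estimate = Fraction(16, 6)
print(estimate)  # 8/3
```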


