Q3E

Question: Consider the conditions of Exercise 2 again. Suppose that the prior distribution of θ is as given in Exercise 2, and suppose again that 20 items are selected at random from the shipment.

a. For what number of defective items in the sample will the mean squared error of the Bayes estimate be a maximum?

b. For what number of defective items in the sample will the mean squared error of the Bayes estimate be a minimum?

Short Answer


a. The mean squared error is maximized when the number of defective items is 12 or 13 (as a function of z, the variance peaks at z = 12.5, and the two nearest integers give equal values).

b. The mean squared error is minimized when the number of defective items is 0.

Step by step solution

01

Given information

The prior distribution of θ is as given in Exercise 2, and 20 items are selected at random from the shipment.

02

(a) Calculating the number of defective items for which MSE is maximum

Consider an experiment with only two outcomes: Success and Failure.

Let x be the number of successes observed in n trials, and let \(\theta \) be the true proportion of successes.

Assume a uniform prior distribution on (0, 1) for \(\theta \), and assume that the likelihood of observing x successes in n trials, given \(\theta \), is binomial. The posterior distribution of \(\theta \) is then the beta distribution with parameters \(\alpha = x + 1\) and \(\beta = n - x + 1\). More generally, a beta prior with parameters \(\alpha \) and \(\beta \) yields a beta posterior with parameters \(\alpha + x\) and \(\beta + n - x\).

The posterior mean (which is the Bayes estimate of \(\theta \)) and the posterior variance are

\(\begin{array}{l}E\left( {\theta |x} \right) = \frac{\alpha }{{\alpha + \beta }}\\Var\left( {\theta |x} \right) = \frac{{\alpha \beta }}{{{{\left( {\alpha + \beta } \right)}^2}\left( {\alpha + \beta + 1} \right)}}\end{array}\)
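As a quick numerical sketch, these two formulas can be evaluated exactly with Python's fractions module (the function name `beta_summary` and the example values x = 3, n = 10 are illustrative, not taken from the exercise):

```python
from fractions import Fraction

def beta_summary(alpha, beta):
    """Mean and variance of a Beta(alpha, beta) distribution."""
    mean = Fraction(alpha, alpha + beta)
    var = Fraction(alpha * beta, (alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var

# Uniform(0, 1) prior with x = 3 successes in n = 10 trials
# gives a Beta(x + 1, n - x + 1) = Beta(4, 8) posterior.
mean, var = beta_summary(4, 8)
print(mean, var)  # 1/3 2/117
```

Using `Fraction` keeps the results exact, which makes it easy to compare posterior variances across different observed counts.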

In this exercise, the proportion \(\theta \) of defective items in a large shipment is unknown, and the prior distribution of \(\theta \) is the beta distribution with parameters \(\alpha = 5\) and \(\beta = 10\).

Now 20 items are selected at random from the shipment. Let ‘z’ denote the number of defective items in the sample. By the update rule above, the posterior distribution of \(\theta \) is beta with parameters

\(\alpha = 5 + z\) and

\(\begin{array}{c}\beta = 10 + \left( {20 - z} \right)\\ = 30 - z\end{array}\)

For the Bayes estimate of \(\theta \) (the posterior mean), the mean squared error given z equals the variance of the posterior distribution. That variance is computed as follows:

\(\begin{array}{c}Var\left( {\theta |x} \right) = \frac{{\alpha \beta }}{{{{\left( {\alpha + \beta } \right)}^2}\left( {\alpha + \beta + 1} \right)}}\\ = \frac{{\left( {5 + z} \right) \times \left( {30 - z} \right)}}{{{{\left( {\left( {5 + z} \right) + \left( {30 - z} \right)} \right)}^2} \times \left( {\left( {5 + z} \right) + \left( {30 - z} \right) + 1} \right)}}\\ = \frac{{\left( {5 + z} \right) \times \left( {30 - z} \right)}}{{{{\left( {35} \right)}^2} \times \left( {36} \right)}}\end{array}\)

Now, differentiate the variance with respect to ‘z’ and set the derivative equal to zero to find the value of ‘z’ that maximizes it. The differentiation is as follows:

\(\begin{array}{c}\frac{{dVar\left( {\theta |x} \right)}}{{dz}} = \frac{d}{{dz}}\frac{{\left( {5 + z} \right) \times \left( {30 - z} \right)}}{{{{\left( {35} \right)}^2} \times \left( {36} \right)}}\\ = \frac{1}{{{{\left( {35} \right)}^2} \times \left( {36} \right)}} \times \frac{{d\left( {150 + 25z - {z^2}} \right)}}{{dz}}\\ = \frac{{25 - 2z}}{{{{\left( {35} \right)}^2} \times 36}}\end{array}\)

Setting this derivative equal to zero:

\(\begin{array}{c}\frac{{dVar\left( {\theta |x} \right)}}{{dz}} = 0\\\frac{{25 - 2z}}{{{{\left( {35} \right)}^2} \times \left( {36} \right)}} = 0\\25 - 2z = 0\end{array}\)

\(z = 12.5\)

The continuous maximizer is z = 12.5, and the second derivative is negative, confirming a maximum. Since the number of defective items must be an integer, and the quadratic \(150 + 25z - {z^2}\) is symmetric about 12.5, the mean squared error is maximized at z = 12 or z = 13 (both give the same value).
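A brute-force check over the integer values z = 0, …, 20 confirms this. Only the numerator \((5 + z)(30 - z)\) matters, since the denominator \(35^2 \times 36\) does not depend on z (the helper name `numerator` is illustrative):

```python
def numerator(z):
    """Numerator (5 + z)(30 - z) of the posterior variance; the denominator
    35**2 * 36 is constant in z, so this determines where the MSE is largest."""
    return (5 + z) * (30 - z)

values = [numerator(z) for z in range(21)]
best = max(values)
maximizers = [z for z in range(21) if numerator(z) == best]
print(maximizers)  # [12, 13]
```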

03

(b) Calculating the number of defective items for which MSE is minimum

From the derivation above, the numerator \(150 + 25z - {z^2}\) is a quadratic in z whose coefficient of \({z^2}\) is negative, so the parabola opens downward and its minimum over \(0 \le z \le 20\) occurs at an endpoint. The numerator equals 150 at z = 0 and 250 at z = 20.

Therefore, the mean squared error is minimized at z = 0.
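The endpoint comparison can be verified the same way (a sketch reusing the same illustrative numerator of the posterior variance):

```python
def numerator(z):
    """Numerator (5 + z)(30 - z) of the posterior variance."""
    return (5 + z) * (30 - z)

z_min = min(range(21), key=numerator)
print(z_min, numerator(0), numerator(20))  # 0 150 250
```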

