Q15E

Question: Suppose that a random variable X can take only the five values \(x = 1,2,3,4,5\) with the following probabilities:

\(\begin{aligned}f\left( {1\left| \theta \right.} \right) &= {\theta ^3}, & f\left( {2\left| \theta \right.} \right) &= {\theta ^2}\left( {1 - \theta } \right),\\f\left( {3\left| \theta \right.} \right) &= 2\theta \left( {1 - \theta } \right), & f\left( {4\left| \theta \right.} \right) &= \theta {\left( {1 - \theta } \right)^2},\\f\left( {5\left| \theta \right.} \right) &= {\left( {1 - \theta } \right)^3}.\end{aligned}\)

Here, the value of the parameter θ is unknown (0 ≤ θ ≤ 1).

a. Verify that the sum of the five given probabilities is 1 for every value of θ.

b. Consider an estimator δc(X) that has the following form:

\(\begin{aligned}{\delta _c}\left( 1 \right) &= 1, & {\delta _c}\left( 2 \right) &= 2 - 2c, & {\delta _c}\left( 3 \right) &= c,\\{\delta _c}\left( 4 \right) &= 1 - 2c, & {\delta _c}\left( 5 \right) &= 0.\end{aligned}\)

Show that for each constant c, \({\delta _c}\left( X \right)\) is an unbiased estimator of \(\theta \).

c. Let \({\theta _0}\) be a number such that \(0 < {\theta _0} < 1\). Determine a constant \({c_0}\) such that when \(\theta = {\theta _0}\), the variance of \({\delta _{{c_0}}}\left( X \right)\) is smaller than the variance of \({\delta _c}\left( X \right)\) for every other value of c.

Short Answer


a. The sum of the five given probabilities is 1 for every value of \(\theta \).

b. For each constant c, \({\delta _c}\left( X \right)\) is an unbiased estimator of \(\theta \).

c. The value of c is \({c_0} = \frac{{1 + {\theta _0}}}{3}\).

Step by step solution

Step 1: Given information

A random variable X takes the five values \(x = 1,2,3,4,5\) with probabilities

\(\begin{aligned}f\left( {1\left| \theta \right.} \right) &= {\theta ^3}, & f\left( {2\left| \theta \right.} \right) &= {\theta ^2}\left( {1 - \theta } \right),\\f\left( {3\left| \theta \right.} \right) &= 2\theta \left( {1 - \theta } \right), & f\left( {4\left| \theta \right.} \right) &= \theta {\left( {1 - \theta } \right)^2},\\f\left( {5\left| \theta \right.} \right) &= {\left( {1 - \theta } \right)^3}.\end{aligned}\)

An estimator \({\delta _c}\left( X \right)\) has the form

\(\begin{aligned}{\delta _c}\left( 1 \right) &= 1, & {\delta _c}\left( 2 \right) &= 2 - 2c, & {\delta _c}\left( 3 \right) &= c,\\{\delta _c}\left( 4 \right) &= 1 - 2c, & {\delta _c}\left( 5 \right) &= 0.\end{aligned}\)

Step 2: (a) Checking that the probabilities sum to 1

\(\begin{aligned}{}f\left( {1\left| \theta \right.} \right) + f\left( {2\left| \theta \right.} \right) &= {\theta ^3} + {\theta ^2}\left( {1 - \theta } \right)\\ &= {\theta ^3} + {\theta ^2} - {\theta ^3}\\ &= {\theta ^2}\end{aligned}\)

\(\begin{aligned}{}f\left( {4\left| \theta \right.} \right) + f\left( {5\left| \theta \right.} \right) &= \theta {\left( {1 - \theta } \right)^2} + {\left( {1 - \theta } \right)^3}\\ &= {\left( {1 - \theta } \right)^2}\left( {\theta + 1 - \theta } \right)\\ &= {\left( {1 - \theta } \right)^2}\end{aligned}\)

\(f\left( {3\left| \theta \right.} \right) = 2\theta \left( {1 - \theta } \right)\)

The total of the five probabilities on the left sides of these equations equals the sum of the probabilities on the right sides, which is

\(\begin{aligned}{}{\theta ^2} + {\left( {1 - \theta } \right)^2} + 2\theta \left( {1 - \theta } \right) &= {\theta ^2} + 1 - 2\theta + {\theta ^2} + 2\theta - 2{\theta ^2}\\ &= 1\end{aligned}\)

Hence, the sum of the given probabilities is 1 for every value of \(\theta \).
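As a quick numerical sanity check (not part of the textbook solution), the identity can be verified in Python for a grid of \(\theta \) values; the helper name `probs` is illustrative:

```python
# Sanity check: the five probabilities f(x | theta) should sum to 1
# for every theta in [0, 1]. (Illustrative helper, not from the text.)

def probs(theta):
    """Return [f(1|theta), ..., f(5|theta)] for the given theta."""
    return [
        theta ** 3,
        theta ** 2 * (1 - theta),
        2 * theta * (1 - theta),
        theta * (1 - theta) ** 2,
        (1 - theta) ** 3,
    ]

for theta in [0.0, 0.2, 0.5, 0.8, 1.0]:
    total = sum(probs(theta))
    assert abs(total - 1.0) < 1e-12, (theta, total)
```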

Step 3: (b) Showing that the estimator is unbiased

\(\begin{aligned}{}{E_\theta }\left( {{\delta _c}\left( X \right)} \right) &= \sum\limits_{x = 1}^5 {{\delta _c}\left( x \right)f\left( {x\left| \theta \right.} \right)} \\ &= \left( {1 \times {\theta ^3}} \right) + \left( {\left( {2 - 2c} \right) \times {\theta ^2} \times \left( {1 - \theta } \right)} \right) + \left( {c \times 2\theta \left( {1 - \theta } \right)} \right) + \left( {\left( {1 - 2c} \right) \times \theta \times {{\left( {1 - \theta } \right)}^2}} \right) + 0\\ &= \theta \end{aligned}\)

The sum of the coefficients of\({\theta ^3}\)is 0, the sum of the coefficients of\({\theta ^2}\)is 0, the sum of coefficients of\(\theta \)is 1, and the constant term is 0.

Hence, \(E\left[ {{\delta _c}\left( X \right)} \right] = \theta \), so for each constant c, \({\delta _c}\left( X \right)\) is an unbiased estimator of \(\theta \).
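The unbiasedness claim can likewise be checked numerically for several values of \(\theta \) and c (a sketch; the helper names `probs`, `delta`, and `expectation` are my own):

```python
# Check E_theta[delta_c(X)] = theta for several theta and c values.
# (Illustrative sketch, not from the text.)

def probs(theta):
    """[f(1|theta), ..., f(5|theta)] from the problem statement."""
    return [
        theta ** 3,
        theta ** 2 * (1 - theta),
        2 * theta * (1 - theta),
        theta * (1 - theta) ** 2,
        (1 - theta) ** 3,
    ]

def delta(c):
    """[delta_c(1), ..., delta_c(5)] from the problem statement."""
    return [1.0, 2 - 2 * c, c, 1 - 2 * c, 0.0]

def expectation(c, theta):
    return sum(d * p for d, p in zip(delta(c), probs(theta)))

# The identity holds for every c, including "extreme" choices.
for theta in [0.1, 0.3, 0.5, 0.9]:
    for c in [-1.0, 0.0, 0.5, 2.0]:
        assert abs(expectation(c, theta) - theta) < 1e-12
```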

Step 4: (c) Finding the value of c

For every value of c,

\(\begin{aligned}Va{r_{{\theta _0}}}\left( {{\delta _c}} \right) &= {E_{{\theta _0}}}\left( {\delta _c^2} \right) - {\left( {{E_{{\theta _0}}}\left( {{\delta _c}} \right)} \right)^2}\\ &= {E_{{\theta _0}}}\left( {\delta _c^2} \right) - \theta _0^2\end{aligned}\)

Hence, the value of c for which \(Va{r_{{\theta _0}}}\left( {{\delta _c}} \right)\) is a minimum is also the value of c for which \({E_{{\theta _0}}}\left( {\delta _c^2} \right)\) is a minimum.

Now,

\(\begin{aligned}{E_{{\theta _0}}}\left( {\delta _c^2} \right) &= \left( {{1^2} \times \theta _0^3} \right) + {\left( {2 - 2c} \right)^2}\theta _0^2\left( {1 - {\theta _0}} \right) + {c^2} \cdot 2{\theta _0}\left( {1 - {\theta _0}} \right) + {\left( {1 - 2c} \right)^2}{\theta _0}{\left( {1 - {\theta _0}} \right)^2} + 0\\ &= 2{c^2}\left[ {2\theta _0^2\left( {1 - {\theta _0}} \right) + {\theta _0}\left( {1 - {\theta _0}} \right) + 2{\theta _0}{{\left( {1 - {\theta _0}} \right)}^2}} \right] - 4c\left[ {2\theta _0^2\left( {1 - {\theta _0}} \right) + {\theta _0}{{\left( {1 - {\theta _0}} \right)}^2}} \right] + \text{terms not involving } c\end{aligned}\)

After further simplification of the coefficients of \(c^2\) and \(c\), we obtain the relation

\({E_{{\theta _0}}}\left( {\delta _c^2} \right) = 6{\theta _0}\left( {1 - {\theta _0}} \right){c^2} - 4{\theta _0}\left( {1 - \theta _0^2} \right)c + \text{terms not involving } c\)

Differentiating with respect to c and setting the derivative equal to 0 gives \(12{\theta _0}\left( {1 - {\theta _0}} \right)c - 4{\theta _0}\left( {1 - \theta _0^2} \right) = 0\). Since \(1 - \theta _0^2 = \left( {1 - {\theta _0}} \right)\left( {1 + {\theta _0}} \right)\), the value of c for which \({E_{{\theta _0}}}\left( {\delta _c^2} \right)\) is a minimum is \(c = \frac{{1 + {\theta _0}}}{3}\).

Therefore, the value of c is \({c_0} = \frac{{1 + {\theta _0}}}{3}\).
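A numeric check that \(c_0 = \left( {1 + {\theta _0}} \right)/3\) minimizes the second moment (and hence the variance), writing \({E_{{\theta _0}}}\left( {\delta _c^2} \right)\) out term by term; the function name `second_moment` is my own:

```python
# Check that c0 = (1 + theta0)/3 minimizes E_{theta0}[delta_c(X)^2].
# (Illustrative sketch, not from the text.)

def second_moment(c, t):
    """E_t[delta_c(X)^2], summed over the five values of X."""
    return (
        1.0 * t ** 3
        + (2 - 2 * c) ** 2 * t ** 2 * (1 - t)
        + c ** 2 * 2 * t * (1 - t)
        + (1 - 2 * c) ** 2 * t * (1 - t) ** 2
    )

# The second moment is a quadratic in c with positive leading
# coefficient 6*t*(1-t), so c0 is a strict minimum for 0 < t < 1.
for t in [0.2, 0.5, 0.9]:
    c0 = (1 + t) / 3
    for c in [c0 - 0.5, c0 - 0.01, c0 + 0.01, c0 + 0.5]:
        assert second_moment(c0, t) < second_moment(c, t)
```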
