Q17SE Suppose that X1, …, Xn f... [FREE SOLUTION] | 91影视


Suppose that \(X_1, \ldots, X_n\) form a random sample from the Bernoulli distribution with unknown parameter \(p\). Show that the variance of every unbiased estimator of \((1-p)^2\) must be at least \(4p(1-p)^3/n\).

Short Answer

Expert verified

The variance of every unbiased estimator must be at least

\(\begin{aligned}\frac{4p{\left(1-p\right)}^3}{n}\end{aligned}\)

Step by step solution

01

Given the information

Suppose that \(X_1, \ldots, X_n\) form a random sample from the Bernoulli distribution with an unknown parameter \(p\).

02

Showing the variance

Here, the quantity to be estimated without bias is \((1-p)^2\).

So,

\(m(p) = (1-p)^2\)

Then,

\(m'(p) = -2(1-p)\)

\([m'(p)]^2 = (-2(1-p))^2\)

\(= 4(1-p)^2\)

So, the Fisher information for the Bernoulli distribution is

\(I(p) = \frac{1}{p(1-p)}\)
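For completeness, the Fisher information quoted above follows in a few lines from the Bernoulli score function; a short sketch:

```latex
\begin{aligned}
\log f(x \mid p) &= x \log p + (1-x)\log(1-p),\\
\frac{\partial}{\partial p}\log f(X \mid p) &= \frac{X}{p} - \frac{1-X}{1-p} = \frac{X-p}{p(1-p)},\\
I(p) &= \operatorname{Var}\!\left(\frac{X-p}{p(1-p)}\right)
      = \frac{\operatorname{Var}(X)}{p^2(1-p)^2}
      = \frac{p(1-p)}{p^2(1-p)^2} = \frac{1}{p(1-p)}.
\end{aligned}
```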

Therefore, if \(T\) is an unbiased estimator of \(m(p)\), the Cramér–Rao inequality gives

\(\operatorname{Var}(T) \ge \frac{[m'(p)]^2}{nI(p)}\)

So,

\(\operatorname{Var}(T) \ge \frac{4(1-p)^2}{n \cdot \frac{1}{p(1-p)}}\)

\(= \frac{4p(1-p)^3}{n}\)

That is,

\(\operatorname{Var}(T) \ge \frac{4p(1-p)^3}{n}\)

Hence, proved.
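The bound can also be checked numerically. The sketch below (not part of the textbook solution) uses one concrete unbiased estimator of \((1-p)^2\): with \(Y = n - \sum X_i\) counting the zeros, \(T = Y(Y-1)/(n(n-1))\) satisfies \(E[T] = (1-p)^2\) because \(Y \sim \text{Binomial}(n, 1-p)\) and \(E[Y(Y-1)] = n(n-1)(1-p)^2\). Its exact variance, computed from the binomial pmf, should sit just above the Cramér–Rao bound:

```python
from math import comb

def exact_moments(n: int, p: float):
    """Exact mean and variance of T = Y(Y-1)/(n(n-1)),
    where Y ~ Binomial(n, 1-p) counts the zeros in the sample."""
    q = 1.0 - p
    m1 = m2 = 0.0
    for y in range(n + 1):
        prob = comb(n, y) * q**y * p**(n - y)  # P(Y = y)
        t = y * (y - 1) / (n * (n - 1))        # unbiased estimator of (1-p)^2
        m1 += t * prob
        m2 += t * t * prob
    return m1, m2 - m1 * m1

n, p = 20, 0.3
mean, var = exact_moments(n, p)
bound = 4 * p * (1 - p)**3 / n

print(mean, (1 - p)**2)  # equal: T is unbiased for (1-p)^2
print(var, bound)        # var ≈ 0.0208 slightly exceeds bound ≈ 0.0206
```

The gap between the exact variance and the bound is small but strictly positive here, consistent with the fact that the Cramér–Rao bound for \((1-p)^2\) is not attained exactly.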


Most popular questions from this chapter

Find the mode of the \(\chi^2\) distribution with \(m\) degrees of freedom \((m = 1, 2, \ldots)\).

Continue the analysis in Example 8.6.2 on page 498. Compute an interval (a, b) such that the posterior probability is 0.9 that \(a < \mu < b\). Compare this interval with the 90% confidence interval from Example 8.5.4 on page 487.

Suppose that a random variable X can take only the five values \(x = 1, 2, 3, 4, 5\) with the following probabilities:

\(\begin{aligned}f\left( 1 \mid \theta \right) &= \theta^3, & f\left( 2 \mid \theta \right) &= \theta^2\left( 1 - \theta \right),\\ f\left( 3 \mid \theta \right) &= 2\theta\left( 1 - \theta \right), & f\left( 4 \mid \theta \right) &= \theta{\left( 1 - \theta \right)}^2,\\ f\left( 5 \mid \theta \right) &= {\left( 1 - \theta \right)}^3.\end{aligned}\)

Here, the value of the parameter θ is unknown (0 ≤ θ ≤ 1).

a. Verify that the sum of the five given probabilities is 1 for every value of θ.

b. Consider an estimator δc(X) that has the following form:

\(\begin{aligned}\delta_c\left( 1 \right) &= 1, & \delta_c\left( 2 \right) &= 2 - 2c, & \delta_c\left( 3 \right) &= c,\\ \delta_c\left( 4 \right) &= 1 - 2c, & \delta_c\left( 5 \right) &= 0.\end{aligned}\)

Show that for each constant \(c\), \(\delta_c\left( X \right)\) is an unbiased estimator of θ.

c. Let \(\theta_0\) be a number such that \(0 < \theta_0 < 1\). Determine a constant \(c_0\) such that when \(\theta = \theta_0\), the variance of \(\delta_{c_0}\left( X \right)\) is smaller than the variance of \(\delta_c\left( X \right)\) for every other value of c.

Suppose that \(X_1, X_2, \ldots, X_n\) form a random sample from the uniform distribution on the interval \(\left( 0, 1 \right)\), and let \(W\) denote the range of the sample, as defined in Example 3.9.7. Also, let \(g_n\left( x \right)\) denote the p.d.f. of the random variable \(2n\left( 1 - W \right)\), and let \(g\left( x \right)\) denote the p.d.f. of the \(\chi^2\) distribution with four degrees of freedom. Show that

\(\mathop{\lim}\limits_{n \to \infty} g_n\left( x \right) = g\left( x \right)\) for \(x > 0\).

Suppose that \(X_1, \ldots, X_n\) form a random sample from the uniform distribution on the interval (0, θ), where the value of the parameter θ is unknown, and let \(Y_n = \max\left( X_1, \ldots, X_n \right)\). Show that \(\left( \frac{n + 1}{n} \right) Y_n\) is an unbiased estimator of θ.
