
Question: Suppose that \({X_1}, \ldots ,{X_n}\) form a random sample from the normal distribution with unknown mean \(\mu \) and unknown variance \({\sigma ^2}\). Let \(\widehat {\sigma _0^2}\) and \(\widehat {\sigma _1^2}\) be the two estimators of \({\sigma ^2}\), which are defined as follows:

\(\widehat {\sigma _0^2} = \frac{1}{n}\sum\limits_{i = 1}^n {{{\left( {{X_i} - \overline X } \right)}^2}} \) and \(\widehat {\sigma _1^2} = \frac{1}{{n - 1}}\sum\limits_{i = 1}^n {{{\left( {{X_i} - \overline X } \right)}^2}} \)

Show that the M.S.E. of \(\widehat {\sigma _0^2}\) is smaller than the M.S.E. of \(\widehat {\sigma _1^2}\) for all possible values of \(\mu \) and \({\sigma ^2}\).

Short Answer


The M.S.E. of \(\widehat {\sigma _0^2}\) is \(\left( {2n - 1} \right){\sigma ^4}/{n^2}\) and the M.S.E. of \(\widehat {\sigma _1^2}\) is \(2{\sigma ^4}/\left( {n - 1} \right)\); the former is smaller for all possible values of \(\mu \) and \({\sigma ^2}\).

Step by step solution

01

Given information

It is given that \({X_1}, \ldots ,{X_n}\) form a random sample from the normal distribution with unknown mean \(\mu \) and unknown variance \({\sigma ^2}\).

It is also given that \(\widehat {\sigma _0^2} = \frac{1}{n}\sum\limits_{i = 1}^n {{{\left( {{X_i} - \overline X } \right)}^2}} \) and \(\widehat {\sigma _1^2} = \frac{1}{{n - 1}}\sum\limits_{i = 1}^n {{{\left( {{X_i} - \overline X } \right)}^2}} \).

02

Define MSE

\(M.S.E.\left( {\widehat \theta } \right) = E\left[ {{{\left( {\widehat \theta - \theta } \right)}^2}} \right]\)

\(\widehat \theta \): the estimator

\(\theta \): the true parameter value

Here the parameter being estimated is the true variance \({\sigma ^2}\). Both estimators are multiples of the single statistic \(T = \sum\limits_{i = 1}^n {{{\left( {{X_i} - \overline X } \right)}^2}} \), and for a normal sample \(T/{\sigma ^2}\) has the \(\chi _{n - 1}^2\) distribution, so \(E\left( T \right) = \left( {n - 1} \right){\sigma ^2}\) and \({\rm{Var}}\left( T \right) = 2\left( {n - 1} \right){\sigma ^4}\).
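Both estimators have the form \(cT\) for a constant \(c\), so the next step uses the standard bias-variance decomposition of the M.S.E. together with the moments of \(T\) recorded above:

\(\begin{aligned} E\left[ {{{\left( {cT - {\sigma ^2}} \right)}^2}} \right] &= {\rm{Var}}\left( {cT} \right) + {\left[ {E\left( {cT} \right) - {\sigma ^2}} \right]^2}\\ &= 2{c^2}\left( {n - 1} \right){\sigma ^4} + {\left[ {c\left( {n - 1} \right) - 1} \right]^2}{\sigma ^4}\end{aligned}\)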

03

Calculate the M.S.E.

M.S.E. of \(\widehat {\sigma _0^2}\), i.e., \(c = 1/n\):

\(\begin{aligned} E\left[ {{{\left( {\widehat {\sigma _0^2} - {\sigma ^2}} \right)}^2}} \right] &= \frac{{2\left( {n - 1} \right)}}{{{n^2}}}{\sigma ^4} + {\left( {\frac{{n - 1}}{n} - 1} \right)^2}{\sigma ^4}\\ &= \frac{{2\left( {n - 1} \right){\sigma ^4} + {\sigma ^4}}}{{{n^2}}}\\ &= \frac{{\left( {2n - 1} \right){\sigma ^4}}}{{{n^2}}}\end{aligned}\)

M.S.E. of \(\widehat {\sigma _1^2}\), i.e., \(c = 1/\left( {n - 1} \right)\) (this estimator is unbiased, so only the variance term remains):

\(\begin{aligned} E\left[ {{{\left( {\widehat {\sigma _1^2} - {\sigma ^2}} \right)}^2}} \right] &= \frac{2}{{n - 1}}{\sigma ^4} + {\left( {\frac{{n - 1}}{{n - 1}} - 1} \right)^2}{\sigma ^4}\\ &= \frac{{2{\sigma ^4}}}{{n - 1}}\end{aligned}\)

Finally, \(\frac{{2n - 1}}{{{n^2}}} < \frac{2}{{n - 1}}\), since \(\left( {2n - 1} \right)\left( {n - 1} \right) = 2{n^2} - 3n + 1 < 2{n^2}\) for every \(n \ge 2\); for example, with \(n = 10\) the two M.S.E.s are \(0.19{\sigma ^4}\) and roughly \(0.222{\sigma ^4}\). Hence, the M.S.E. of \(\widehat {\sigma _0^2}\) is smaller than the M.S.E. of \(\widehat {\sigma _1^2}\) for all possible values of \(\mu \) and \({\sigma ^2}\).
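As a numerical sanity check, a minimal simulation sketch (assuming NumPy is available; the values of n, mu, and sigma are arbitrary illustrative choices, not part of the exercise):

    import numpy as np

    rng = np.random.default_rng(0)
    n, mu, sigma = 10, 3.0, 2.0        # arbitrary illustrative values
    trials = 200_000

    # Draw many samples of size n; compute both estimators for each sample.
    x = rng.normal(mu, sigma, size=(trials, n))
    ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)
    mse0 = np.mean((ss / n - sigma**2) ** 2)         # divisor n
    mse1 = np.mean((ss / (n - 1) - sigma**2) ** 2)   # divisor n - 1

    print(mse0, (2 * n - 1) * sigma**4 / n**2)   # empirical vs closed form
    print(mse1, 2 * sigma**4 / (n - 1))          # empirical vs closed form

Changing mu leaves both values essentially unchanged, as the closed forms predict, since neither M.S.E. depends on \(\mu \).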

Most popular questions from this chapter

Question: Let \(\theta \) denote the proportion of registered voters in a large city who are in favor of a certain proposition. Suppose that the value of \(\theta \) is unknown, and two statisticians A and B assign to \(\theta \) the following different prior p.d.f.'s \({\xi _A}\left( \theta \right)\) and \({\xi _B}\left( \theta \right)\), respectively:

\({\xi _A}\left( \theta \right) = 2\theta ,\quad 0 < \theta < 1\)

\({\xi _B}\left( \theta \right) = 4{\theta ^3},\quad 0 < \theta < 1\)

In a random sample of 1000 registered voters from the city, it is found that 710 are in favor of the proposition.

a. Find the posterior distribution that each statistician assigns to θ.

b. Find the Bayes estimate for each statistician based on the squared error loss function.

c. Show that after the opinions of the 1000 registered voters in the random sample had been obtained, the Bayes estimates for the two statisticians could not possibly differ by more than 0.002, regardless of the number in the sample who were in favor of the proposition.
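For parts (a) and (b), a minimal sketch using only the standard Beta-binomial conjugacy (the priors above are the Beta(2, 1) and Beta(4, 1) densities, and under squared error loss the Bayes estimate is the posterior mean; the function name is illustrative):

    y, n = 710, 1000                        # voters in favor, sample size

    def posterior_mean(a, b, y, n):
        # A Beta(a, b) prior with y successes in n trials gives a
        # Beta(a + y, b + n - y) posterior, whose mean is the
        # Bayes estimate under squared error loss.
        return (a + y) / (a + b + n)

    est_A = posterior_mean(2, 1, y, n)      # prior 2*theta   -> Beta(712, 291)
    est_B = posterior_mean(4, 1, y, n)      # prior 4*theta^3 -> Beta(714, 291)
    print(est_A, est_B, abs(est_A - est_B)) # difference ~0.0006, under 0.002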

Prove Theorem 1.5.8. Hint: Use Exercise 12.

\(\begin{aligned}{}\Pr \left( {\bigcup\limits_{i = 1}^n {{A_i}} } \right) \le \sum\limits_{i = 1}^n {\Pr \left( {{A_i}} \right)} \\\Pr \left( {\bigcap\limits_{i = 1}^n {{A_i}} } \right) \ge 1 - \sum\limits_{i = 1}^n {\Pr \left( {{A_i}^C} \right)} \end{aligned}\)

Suppose that X has the lognormal distribution with parameters 4.1 and 8. Find the distribution of \(3{X^{1/2}}\).
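A small simulation sketch for this one (assuming the textbook parameterization in which \(\ln X\) is normal with mean 4.1 and variance 8; then \(\ln \left( {3{X^{1/2}}} \right) = \ln 3 + \frac{1}{2}\ln X\), so \(3{X^{1/2}}\) should again be lognormal, with parameters \(\ln 3 + 2.05 \approx 3.149\) and 2):

    import numpy as np

    rng = np.random.default_rng(1)
    log_x = rng.normal(4.1, np.sqrt(8), size=500_000)  # ln X ~ N(4.1, 8)
    log_y = np.log(3) + 0.5 * log_x                    # ln(3 * X**0.5)

    # ln Y should be N(ln 3 + 2.05, 2): compare sample moments.
    print(log_y.mean(), np.log(3) + 2.05)
    print(log_y.var(), 0.25 * 8)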

A box contains 24 light bulbs, of which four are defective. If one person selects 10 bulbs from the box in a random manner, and a second person then takes the remaining 14 bulbs, what is the probability that all four defective bulbs will be obtained by the same person?
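The count for this question can be checked directly (a sketch using only math.comb from the Python standard library; the first person's 10 bulbs must contain either all four defectives or none of them):

    from math import comb

    # P(all 4 defectives among the 10) + P(all 4 among the remaining 14)
    p = (comb(4, 4) * comb(20, 6) + comb(4, 0) * comb(20, 10)) / comb(24, 10)
    print(p)   # approximately 0.1140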

Let \({A_1},{A_2}, \ldots \) be an arbitrary infinite sequence of events, and let \({B_1},{B_2}, \ldots \) be another infinite sequence of events defined as follows: \({B_1} = {A_1}\), \({B_2} = {A_1}^c \cap {A_2}\), \({B_3} = {A_1}^c \cap {A_2}^c \cap {A_3}\), \({B_4} = {A_1}^c \cap {A_2}^c \cap {A_3}^c \cap {A_4}\), and so on. Prove that

\(\Pr \left( {\bigcup\limits_{i = 1}^n {{A_i}} } \right) = \sum\limits_{i = 1}^n {\Pr \left( {{B_i}} \right)} \)for \(n = 1,2,3, \ldots \)

and that

\(\Pr \left( {\bigcup\limits_{i = 1}^\infty {{A_i}} } \right) = \sum\limits_{i = 1}^\infty {\Pr \left( {{B_i}} \right)} \)
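A quick exhaustive check of the finite version of this identity (a sketch over a toy equiprobable space; the particular events are arbitrary illustrative choices):

    from fractions import Fraction

    omega = set(range(12))                    # toy equiprobable sample space
    A = [{0, 1, 2}, {2, 3}, {1, 4, 5}, {5, 6, 7}]

    def pr(event):
        return Fraction(len(event), len(omega))

    # B_i = A_i with everything already covered removed, so the B_i are
    # disjoint and their union equals the union of the A_i.
    union_so_far, total = set(), Fraction(0)
    for a in A:
        total += pr(a - union_so_far)         # Pr(B_i)
        union_so_far |= a
        print(pr(union_so_far) == total)      # True at every n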
