Q12E

Suppose a certain type of fertilizer has an expected yield per acre of \({\mu _1}\) with variance \({\sigma ^2}\), whereas the expected yield for a second type of fertilizer is \({\mu _2}\) with the same variance \({\sigma ^2}\). Let \(S_1^2\) and \(S_2^2\) denote the sample variances of yields based on sample sizes \({n_1}\) and \({n_2}\), respectively, of the two fertilizers. Show that the pooled (combined) estimator

\({\hat \sigma ^2} = \frac{{\left( {{n_1} - 1} \right)S_1^2 + \left( {{n_2} - 1} \right)S_2^2}}{{{n_1} + {n_2} - 2}}\)

is an unbiased estimator of \({\sigma ^2}\).

Short Answer

The pooled estimator is unbiased: \(E\left( {{{\hat \sigma }^2}} \right) = {\sigma ^2}\).

Step by step solution

Step 1: Concept introduction

An unbiased estimator is a statistic whose expected value equals the population parameter it estimates. In this context, "accurate" means that, on average, the estimator neither overestimates nor underestimates the parameter; the difference between the estimator's expected value and the true parameter is called the bias.
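The definition above can be illustrated with a quick simulation (an illustrative sketch only, with an assumed normal population and assumed values of \(n\) and \({\sigma ^2}\)): averaging the sample variance \(S^2\) (the \(n - 1\) divisor form) over many samples recovers the true variance.

```python
import random
import statistics

random.seed(0)

# Illustrative check of unbiasedness: the average of many sample
# variances S^2 (computed with the n-1 divisor) should be close to
# the true population variance. Population and sizes are assumptions
# for the demo, not taken from the exercise.
true_var = 4.0          # population is N(0, sigma^2) with sigma^2 = 4
n, reps = 10, 20000

acc = 0.0
for _ in range(reps):
    sample = [random.gauss(0.0, 2.0) for _ in range(n)]
    acc += statistics.variance(sample)  # divides by n-1 (unbiased form)

mean_s2 = acc / reps
print(round(mean_s2, 2))  # close to sigma^2 = 4
```

The same experiment with `statistics.pvariance` (the \(n\) divisor) would come out systematically below 4, which is exactly the bias the \(n - 1\) divisor removes.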

Step 2: Showing that the estimator is unbiased

The estimator is unbiased if its expected value equals \({\sigma ^2}\). By linearity of expectation, the following holds:

\(\begin{array}{c}E\left( {\frac{{\left( {{n_1} - 1} \right)S_1^2 + \left( {{n_2} - 1} \right)S_2^2}}{{{n_1} + {n_2} - 2}}} \right)\\ = \frac{{{n_1} - 1}}{{{n_1} + {n_2} - 2}}E\left( {S_1^2} \right) + \frac{{{n_2} - 1}}{{{n_1} + {n_2} - 2}}E\left( {S_2^2} \right)\\\mathop = \limits^{(1)} \frac{{{n_1} - 1}}{{{n_1} + {n_2} - 2}}{\sigma ^2} + \frac{{{n_2} - 1}}{{{n_1} + {n_2} - 2}}{\sigma ^2}\\ = {\sigma ^2}\frac{{{n_1} + {n_2} - 2}}{{{n_1} + {n_2} - 2}}\\ = {\sigma ^2},\end{array}\)

(1): \(S_1^2\) and \(S_2^2\) are unbiased estimators of the variance, i.e. \(E\left( {S_1^2} \right) = E\left( {S_2^2} \right) = {\sigma ^2}\).

Hence, it is unbiased.
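The algebra above can also be checked by simulation. The sketch below (illustrative only: the common variance, means, and sample sizes are assumed values, not from the exercise) averages the pooled estimate over many replications and compares it with \({\sigma ^2}\).

```python
import random
import statistics

random.seed(1)

# Monte Carlo check of the pooled estimator's unbiasedness.
# sigma2, the two means, n1, and n2 are assumed demo values.
sigma2 = 9.0
n1, n2, reps = 8, 12, 20000

def pooled_var(s1_sq, s2_sq, n1, n2):
    """Pooled estimator ((n1-1)*S1^2 + (n2-1)*S2^2) / (n1 + n2 - 2)."""
    return ((n1 - 1) * s1_sq + (n2 - 1) * s2_sq) / (n1 + n2 - 2)

acc = 0.0
for _ in range(reps):
    x = [random.gauss(50.0, 3.0) for _ in range(n1)]  # fertilizer 1 yields
    y = [random.gauss(55.0, 3.0) for _ in range(n2)]  # fertilizer 2 yields
    acc += pooled_var(statistics.variance(x), statistics.variance(y), n1, n2)

print(round(acc / reps, 2))  # close to sigma^2 = 9
```

Note that the two samples have different means; the pooled estimator only requires the variances to be equal, which the simulation respects.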


Most popular questions from this chapter

Each of \(n\) specimens is to be weighed twice on the same scale. Let \({X_i}\) and \({Y_i}\) denote the two observed weights for the \(i\)th specimen. Suppose \({X_i}\) and \({Y_i}\) are independent of one another, each normally distributed with mean value \({\mu _i}\) (the true weight of specimen \(i\)) and variance \({\sigma ^2}\).

a. Show that the maximum likelihood estimator of \({\sigma ^2}\) is \({\hat \sigma ^2} = \Sigma {\left( {{X_i} - {Y_i}} \right)^2}/(4n)\). (Hint: If \(\bar z = \left( {{z_1} + {z_2}} \right)/2\), then \(\Sigma {\left( {{z_i} - \bar z} \right)^2} = {\left( {{z_1} - {z_2}} \right)^2}/2\).)

b. Is the mle \({\hat \sigma ^2}\) an unbiased estimator of \({\sigma ^2}\)? Find an unbiased estimator of \({\sigma ^2}\). (Hint: For any rv \(Z\), \(E\left( {{Z^2}} \right) = V(Z) + {\left( {E(Z)} \right)^2}\). Apply this to \(Z = {X_i} - {Y_i}\).)
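For part (b), the hint gives \(E\left[ {{{\left( {{X_i} - {Y_i}} \right)}^2}} \right] = 2{\sigma ^2}\), so the mle averages to \({\sigma ^2}/2\) and dividing by \(2n\) instead of \(4n\) removes the bias. A simulation sketch (the true weights \({\mu _i}\), \({\sigma ^2}\), and \(n\) below are assumed demo values):

```python
import random

random.seed(4)

# Since E[(X_i - Y_i)^2] = 2*sigma^2, the MLE sum((x-y)^2)/(4n)
# averages to sigma^2/2 (biased); dividing by 2n is unbiased.
# sigma2 and the true weights mus are hypothetical demo values.
sigma2, n, reps = 4.0, 5, 20000
sigma = sigma2 ** 0.5
mus = [10.0, 12.0, 9.0, 11.0, 8.0]  # hypothetical true weights mu_i

mle_sum = unbiased_sum = 0.0
for _ in range(reps):
    ss = 0.0
    for mu in mus:
        xi = random.gauss(mu, sigma)  # first weighing
        yi = random.gauss(mu, sigma)  # second weighing
        ss += (xi - yi) ** 2
    mle_sum += ss / (4 * n)
    unbiased_sum += ss / (2 * n)

print(round(mle_sum / reps, 2))       # near sigma^2 / 2 = 2
print(round(unbiased_sum / reps, 2))  # near sigma^2 = 4
```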

a. Let \({X_1}, \ldots ,{X_n}\) be a random sample from a uniform distribution on \((0,\theta )\). Then the mle of \(\theta \) is \(\hat \theta = Y = \max \left( {{X_i}} \right)\). Use the fact that \(Y \le y\) if each \({X_i} \le y\) to derive the cdf of \(Y\). Then show that the pdf of \(Y = \max \left( {{X_i}} \right)\) is \({f_Y}(y) = \left\{ {\begin{array}{*{20}{c}}{\frac{{n{y^{n - 1}}}}{{{\theta ^n}}}}&{0 \le y \le \theta }\\0&{{\rm{otherwise}}}\end{array}} \right.\)

b. Use the result of part (a) to show that the mle is biased but that \((n + 1)\max \left( {{X_i}} \right)/n\) is unbiased.
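From the pdf in part (a), \(E(Y) = n\theta /(n + 1) < \theta \), which is the bias the factor \((n + 1)/n\) corrects. A simulation sketch of both estimators (the values of \(\theta \) and \(n\) are assumptions for the demo):

```python
import random

random.seed(2)

# Compare the biased mle max(X_i) against the corrected estimator
# (n+1)*max(X_i)/n. theta and n are hypothetical demo values.
theta, n, reps = 5.0, 4, 20000

mle_sum = corrected_sum = 0.0
for _ in range(reps):
    y = max(random.uniform(0.0, theta) for _ in range(n))
    mle_sum += y
    corrected_sum += (n + 1) * y / n

print(round(mle_sum / reps, 2))        # near n*theta/(n+1) = 4, biased low
print(round(corrected_sum / reps, 2))  # near theta = 5, unbiased
```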

A diagnostic test for a certain disease is applied to \(n\) individuals known to not have the disease. Let \(X = \) the number among the \(n\) test results that are positive (indicating presence of the disease, so \(X\) is the number of false positives) and \(p = \) the probability that a disease-free individual's test result is positive (i.e., \(p\) is the true proportion of test results from disease-free individuals that are positive). Assume that only \(X\) is available rather than the actual sequence of test results.

a. Derive the maximum likelihood estimator of \(p\). If \(n = 20\) and \(x = 3\), what is the estimate?

b. Is the estimator of part (a) unbiased?

c. If \(n = 20\) and \(x = 3\), what is the mle of the probability \({(1 - p)^5}\) that none of the next five tests done on disease-free individuals are positive?
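For a binomial model the mle is \(\hat p = x/n\), and by the invariance property the mle of \({(1 - p)^5}\) is \({(1 - \hat p)^5}\). The arithmetic for the given \(n = 20\), \(x = 3\) can be sketched as:

```python
# MLE for a binomial proportion and, by invariance, for (1-p)^5.
# n and x are the values given in the question.
n, x = 20, 3

p_hat = x / n                        # mle of p
prob_five_negative = (1 - p_hat) ** 5  # mle of (1-p)^5

print(p_hat)                         # 0.15
print(round(prob_five_negative, 4))  # 0.4437
```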

Consider a random sample \({{\rm{X}}_{\rm{1}}}{\rm{, \ldots ,}}{{\rm{X}}_{\rm{n}}}\) from the pdf

\(f(x;\theta ) = 0.5(1 + \theta x)\quad - 1 \le x \le 1\)

where \( - 1 \le \theta \le 1\) (this distribution arises in particle physics). Show that \(\hat \theta = 3\bar X\) is an unbiased estimator of \(\theta \). (Hint: First determine \(\mu = E(X) = E(\bar X)\).)
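Following the hint, \(E(X) = \int_{ - 1}^1 {x \cdot 0.5(1 + \theta x)\,dx} = \theta /3\), so \(E(3\bar X) = \theta \). This can be verified numerically with a rejection sampler for the pdf (a sketch; the value of \(\theta \) and the sample size are assumptions for the demo):

```python
import random

random.seed(3)

# Draw from f(x; theta) = 0.5*(1 + theta*x) on [-1, 1] by rejection
# sampling, then check that 3 * (sample mean) tracks theta.
# theta and reps are hypothetical demo values.
theta, reps = 0.6, 20000

def draw(theta):
    fmax = 0.5 * (1 + abs(theta))  # maximum of the density on [-1, 1]
    while True:
        x = random.uniform(-1.0, 1.0)
        if random.uniform(0.0, fmax) <= 0.5 * (1 + theta * x):
            return x

total = 0.0
for _ in range(reps):
    total += draw(theta)

theta_hat = 3 * total / reps  # the estimator 3 * Xbar
print(round(theta_hat, 2))    # close to theta = 0.6
```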

The article from which the data in Exercise 1 was extracted also gave the accompanying strength observations for cylinders:

\(\begin{array}{*{20}{r}}{6.1}&{5.8}&{7.8}&{7.1}&{7.2}&{9.2}&{6.6}&{8.3}&{7.0}&{8.3}\\{7.8}&{8.1}&{7.4}&{8.5}&{8.9}&{9.8}&{9.7}&{14.1}&{12.6}&{11.2}\end{array}\)

Prior to obtaining data, denote the beam strengths by \({X_1}, \ldots ,{X_m}\) and the cylinder strengths by \({Y_1}, \ldots ,{Y_n}\). Suppose that the \({X_i}\)'s constitute a random sample from a distribution with mean \({\mu _1}\) and standard deviation \({\sigma _1}\) and that the \({Y_i}\)'s form a random sample (independent of the \({X_i}\)'s) from another distribution with mean \({\mu _2}\) and standard deviation \({\sigma _2}\).

a. Use rules of expected value to show that \({\rm{\bar X - \bar Y}}\)is an unbiased estimator of \({{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}\). Calculate the estimate for the given data.

b. Use rules of variance from Chapter 5 to obtain an expression for the variance and standard deviation (standard error) of the estimator in part (a), and then compute the estimated standard error.

c. Calculate a point estimate of the ratio \({{\rm{\sigma }}_{\rm{1}}}{\rm{/}}{{\rm{\sigma }}_{\rm{2}}}\)of the two standard deviations.

d. Suppose a single beam and a single cylinder are randomly selected. Calculate a point estimate of the variance of the difference \({\rm{X - Y}}\) between beam strength and cylinder strength.
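For part (b), the variance rules give \(V(\bar X - \bar Y) = \sigma _1^2/m + \sigma _2^2/n\), so the estimated standard error is \(\sqrt {S_1^2/m + S_2^2/n} \). A reusable sketch of that computation follows; the beam sample here is a hypothetical placeholder (the actual beam data come from Exercise 1 and are not reproduced on this page), while the second list is the first six cylinder values above.

```python
import statistics

def diff_and_se(xs, ys):
    """Point estimate of mu1 - mu2 and its estimated standard error
    sqrt(S1^2/m + S2^2/n), per the variance rules from Chapter 5."""
    m, n = len(xs), len(ys)
    diff = statistics.mean(xs) - statistics.mean(ys)
    se = (statistics.variance(xs) / m + statistics.variance(ys) / n) ** 0.5
    return diff, se

beams = [5.9, 7.2, 7.3, 6.3, 8.1, 6.8]      # hypothetical placeholder data
cylinders = [6.1, 5.8, 7.8, 7.1, 7.2, 9.2]  # first six cylinder values

d, se = diff_and_se(beams, cylinders)
print(round(d, 3), round(se, 3))
```

With the real beam data from Exercise 1 substituted in, the same function yields the estimates asked for in parts (a) and (b).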
