Q15SE

Let \(X_1, \ldots, X_n\) be i.i.d. random variables having the normal distribution with mean \(\mu\) and variance \(\sigma^2\). Define \(\overline{X_n} = \frac{1}{n}\sum_{i=1}^{n} X_i\), the sample mean. In this problem, we shall find the conditional distribution of each \(X_i\) given \(\overline{X_n}\).

a. Show that \(X_i\) and \(\overline{X_n}\) have the bivariate normal distribution with both means \(\mu\), variances \(\sigma^2\) and \(\frac{\sigma^2}{n}\), and correlation \(\frac{1}{\sqrt{n}}\).

Hint: Let \(Y = \sum_{j \ne i} X_j\).

Now show that \(Y\) and \(X_i\) are independent normal variables and that \(\overline{X_n}\) and \(X_i\) are both linear combinations of \(Y\) and \(X_i\).

b. Show that the conditional distribution of \(X_i\) given \(\overline{X_n} = \overline{x_n}\) is normal with mean \(\overline{x_n}\) and variance \(\sigma^2\left(1 - \frac{1}{n}\right)\).

Short Answer

Expert verified

a. The proof has been established.

b. The proof has been established.

Step by step solution

01

Given information

\(X_1, \ldots, X_n\) are i.i.d. normal variables with mean \(\mu\) and variance \(\sigma^2\). The sample mean is defined as \(\overline{X_n} = \frac{1}{n}\sum_{i=1}^{n} X_i\).

02

Define a new variable and find the bivariate distribution.

a.

Let \(Y = \sum_{j \ne i} X_j\).

We know that a sum of independent normal variables is again normal. Since \(Y\) is the sum of the \(n-1\) variables \(X_j\) with \(j \ne i\), it follows that \(Y \sim N\left((n-1)\mu, (n-1)\sigma^2\right)\).
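A quick simulation can confirm that \(Y\), as the sum of the \(n-1\) variables \(X_j\) with \(j \ne i\), has mean \((n-1)\mu\) and variance \((n-1)\sigma^2\). The values of \(\mu\), \(\sigma\), \(n\), and the replication count below are illustrative choices, not from the text:

```python
import numpy as np

# Monte Carlo check: Y = sum over j != i of X_j should be
# approximately N((n-1)*mu, (n-1)*sigma^2).
rng = np.random.default_rng(2)
mu, sigma, n = 1.0, 2.0, 6
reps = 100_000

samples = rng.normal(mu, sigma, size=(reps, n))  # each row: X_1, ..., X_n
y = samples[:, 1:].sum(axis=1)                   # sum over j != i, taking i = 1

print(y.mean())   # close to (n-1)*mu     = 5.0
print(y.var())    # close to (n-1)*sigma^2 = 20.0
```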

Moreover, \(Y\) and \(X_i\) are independent normal variables, because \(Y\) involves only the \(X_j\) with \(j \ne i\), which are independent of \(X_i\).

Since \(\overline{X_n} = \frac{1}{n}\left(X_i + Y\right)\), both \(\overline{X_n}\) and \(X_i\) are linear combinations of the independent normal pair \(\left(X_i, Y\right)\). Hence \(\left(X_i, \overline{X_n}\right)\) has a bivariate normal distribution.

Now,

\(\begin{aligned}{}E\left( {{X_i}} \right)& = \mu \\E\left( {\overline {{X_n}} } \right)& = \mu \end{aligned}\)

because the expected value of the sample mean equals the population mean. Also, by the variance property of the sample mean:

\(\begin{aligned}{}Var\left( {{X_i}} \right) &= {\sigma ^2}\\Var\left( {\overline {{X_n}} } \right) &= \frac{{{\sigma ^2}}}{n}\end{aligned}\)

For the correlation, note that \(\mathrm{Cov}\left(X_i, \overline{X_n}\right) = \frac{1}{n}\sum_{j=1}^{n} \mathrm{Cov}\left(X_i, X_j\right) = \frac{\sigma^2}{n}\), since only the \(j = i\) term is nonzero. Hence

\(\rho = \frac{\sigma^2 / n}{\sigma \cdot \left(\sigma / \sqrt{n}\right)} = \frac{1}{\sqrt{n}}\)

Therefore, \(X_i\) and \(\overline{X_n}\) have the bivariate normal distribution with both means \(\mu\), variances \(\sigma^2\) and \(\frac{\sigma^2}{n}\), and correlation \(\frac{1}{\sqrt{n}}\).
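A Monte Carlo check of part (a): the empirical correlation between \(X_1\) and \(\overline{X_n}\) should be close to \(1/\sqrt{n}\). The values of \(\mu\), \(\sigma\), \(n\), and the number of replications are illustrative choices:

```python
import numpy as np

# Simulate many i.i.d. normal samples and estimate corr(X_1, X_n-bar).
rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 3.0, 5
reps = 200_000

samples = rng.normal(mu, sigma, size=(reps, n))  # each row: X_1, ..., X_n
x1 = samples[:, 0]             # X_1 from each sample
xbar = samples.mean(axis=1)    # sample mean from each sample

corr = np.corrcoef(x1, xbar)[0, 1]
print(corr, 1 / np.sqrt(n))    # both close to 0.4472
```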

03

To find the conditional distribution

Suppose \(X\) and \(Y\) have a bivariate normal distribution with means \(\mu_x, \mu_y\), variances \(\sigma_x^2, \sigma_y^2\), and correlation \(\rho\). Then the conditional distribution of \(X\) given \(Y = y\) is normal with

\(\begin{aligned}{}E\left[ {X|Y = y} \right] &= {\mu _x} + \rho \frac{{{\sigma _x}}}{{{\sigma _y}}}\left( {y - {\mu _y}} \right)\\Var\left( {X|Y = y} \right) &= \sigma _x^2\left( {1 - {\rho ^2}} \right)\end{aligned}\)

Applying this with \(X = X_i\), \(Y = \overline{X_n}\), \(\mu_x = \mu_y = \mu\), \(\sigma_x = \sigma\), \(\sigma_y = \sigma/\sqrt{n}\), and \(\rho = 1/\sqrt{n}\):

\(\begin{aligned}{}E\left[ {{X_i}|\overline {{X_n}} = \overline {{x_n}} } \right] &= \mu + \frac{1}{{\sqrt n }} \cdot \frac{\sigma }{{\sigma /\sqrt n }}\left( {\overline {{x_n}} - \mu } \right) = \overline {{x_n}} \\Var\left( {{X_i}|\overline {{X_n}} = \overline {{x_n}} } \right) &= {\sigma ^2}\left( {1 - \frac{1}{n}} \right)\end{aligned}\)

Therefore, by the conditional distribution property of the bivariate normal distribution, we can conclude that

\({X_i}\,|\,\overline {{X_n}} = \overline {{x_n}} \; \sim \; N\left( {\overline {{x_n}} ,\;{\sigma ^2}\left( {1 - \frac{1}{n}} \right)} \right)\)
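Part (b) can also be checked by simulation: keep only those samples whose mean falls near a chosen \(\overline{x_n}\), then inspect the retained values of \(X_i\). All constants below (the distribution parameters, the conditioning value, and the tolerance) are illustrative choices:

```python
import numpy as np

# Monte Carlo check of part (b): given X_n-bar ~ xbar0, X_i should be
# approximately N(xbar0, sigma^2 * (1 - 1/n)).
rng = np.random.default_rng(1)
mu, sigma, n = 0.0, 1.0, 4
xbar0, eps = 0.5, 0.02     # condition on the sample mean being near 0.5
reps = 1_000_000

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)
keep = np.abs(xbar - xbar0) < eps   # samples whose mean is near xbar0
x1_given = samples[keep, 0]         # X_1 values, conditioned on the mean

print(x1_given.mean())   # close to xbar0 = 0.5
print(x1_given.var())    # close to sigma^2 * (1 - 1/n) = 0.75
```

Note the conditional variance \(\sigma^2(1 - 1/n)\) is smaller than the unconditional variance \(\sigma^2\): knowing the sample mean pins down each observation somewhat.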


Most popular questions from this chapter

Suppose that X has the normal distribution for which the mean is 1 and the variance is 4. Find the value of each of the following probabilities:

(a). \({\rm P}\left( {X \le 3} \right)\)

(b). \({\rm P}\left( {X > 1.5} \right)\)

(c). \({\rm P}\left( {X = 1} \right)\)

(d). \({\rm P}\left( {2 < X < 5} \right)\)

(e). \({\rm P}\left( {X \ge 0} \right)\)

(f). \({\rm P}\left( { - 1 < X < 0.5} \right)\)

(g). \({\rm P}\left( {\left| X \right| \le 2} \right)\)

(h). \({\rm P}\left( {1 \le - 2X + 3 \le 8} \right)\)
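Probabilities like these reduce to the standard normal CDF \(\Phi\) after standardizing. As a sketch, parts (a) and (d) for \(X \sim N(1, 4)\) (mean 1, standard deviation 2) can be computed as follows; the helper `phi` is our own, built from `math.erf`:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu, sd = 1.0, 2.0   # X ~ N(1, 4), so the standard deviation is 2

p_a = phi((3 - mu) / sd)                        # P(X <= 3)    ~ 0.8413
p_d = phi((5 - mu) / sd) - phi((2 - mu) / sd)   # P(2 < X < 5) ~ 0.2858
print(p_a, p_d)
```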

Suppose that on a certain examination in advanced mathematics, students from university A achieve scores normally distributed with a mean of 625 and a variance of 100, and students from university B achieve scores normally distributed with a mean of 600 and a variance of 150. If two students from university A and three students from university B take this examination, what is the probability that the average of the scores of the two students from university A will be greater than the average of the scores of the three students from university B? Hint: Determine the distribution of the difference between the two averages.
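Following the hint, one way to sketch the computation numerically: the average of two A-scores is \(N(625, 100/2)\), the average of three B-scores is \(N(600, 150/3)\), and their difference is normal with the means subtracted and the variances added. The helper `phi` below is our own construction from `math.erf`:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mean_d = 625 - 600           # mean of D = A-bar - B-bar
var_d = 100 / 2 + 150 / 3    # variances of independent averages add: 50 + 50

# P(D > 0) = 1 - Phi((0 - mean_d) / sd_d) = Phi(2.5)
p = 1 - phi((0 - mean_d) / sqrt(var_d))
print(p)   # ~ 0.9938
```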

Prove that the p.f. of the negative binomial distribution can be written in the following alternative form:

\[f\left( x \mid r, p \right) = \begin{cases} \binom{-r}{x} \, p^r \left( -\left[ 1 - p \right] \right)^x & \text{for } x = 0, 1, 2, \ldots \\ 0 & \text{otherwise.} \end{cases}\]

Hint: Use Exercise 10 in Sec. 5.3.

Suppose that the two measurements from flea beetles in Example 5.10.2 have the bivariate normal distribution with \(\mu_1 = 201, \mu_2 = 118, \sigma_1 = 15.2, \sigma_2 = 6.6,\) and \(\rho = 0.64\). Suppose that the same two measurements from a second species also have the bivariate normal distribution with \(\mu_1 = 187, \mu_2 = 131, \sigma_1 = 15.2, \sigma_2 = 6.6,\) and \(\rho = 0.64\). Let \(\left(X_1, X_2\right)\) be a pair of measurements on a flea beetle from one of these two species. Let \(a_1, a_2\) be constants.

a. For each of the two species, find the mean and standard deviation of \(a_1 X_1 + a_2 X_2\). (Note that the variances for the two species will be the same. How do you know that?)

b. Find \(a_1, a_2\) to maximize the ratio of the difference between the two means found in part (a) to the standard deviation found in part (a). There is a sense in which this linear combination \(a_1 X_1 + a_2 X_2\) does the best job of distinguishing the two species among all possible linear combinations.

Let \(f\left(x_1, x_2\right)\) denote the p.d.f. of the bivariate normal distribution specified by Eq. (5.10.2). Show that the maximum value of \(f\left(x_1, x_2\right)\) is attained at the point at which \(x_1 = \mu_1\) and \(x_2 = \mu_2\).
