Q 5E

Suppose that the vectors \(\left( {{X_1},{Y_1}} \right),\left( {{X_2},{Y_2}} \right),...,\left( {{X_n},{Y_n}} \right)\) form a random sample of two-dimensional vectors from a bivariate normal distribution for which the means, the variances, and the correlation are unknown. Show that the following five statistics are jointly sufficient:

\(\sum\limits_{i = 1}^n {{X_i}} ,\sum\limits_{i = 1}^n {{Y_i}} ,\sum\limits_{i = 1}^n {X_i^2} ,\sum\limits_{i = 1}^n {Y_i^2} \,\,\,\,{\rm{and}}\,\,\,\,\sum\limits_{i = 1}^n {{X_i}{Y_i}} \)

Short Answer


\(\sum\limits_{i = 1}^n {{X_i}} ,\sum\limits_{i = 1}^n {{Y_i}} ,\sum\limits_{i = 1}^n {X_i^2} ,\sum\limits_{i = 1}^n {Y_i^2} \,\,\,\,{\rm{and}}\,\,\,\,\sum\limits_{i = 1}^n {{X_i}{Y_i}} \) are jointly sufficient statistics, by the Fisher–Neyman factorization theorem.

Step by step solution

Step 01: Given information

The vectors \(\left( {{X_1},{Y_1}} \right),\left( {{X_2},{Y_2}} \right),...,\left( {{X_n},{Y_n}} \right)\) form a random sample of two-dimensional vectors from a bivariate normal distribution whose means, variances, and correlation are all unknown. We need to prove that

\(\sum\limits_{i = 1}^n {{X_i}} ,\sum\limits_{i = 1}^n {{Y_i}} ,\sum\limits_{i = 1}^n {X_i^2} ,\sum\limits_{i = 1}^n {Y_i^2} \,\,\,\,{\rm{and}}\,\,\,\,\sum\limits_{i = 1}^n {{X_i}{Y_i}} \) are jointly sufficient statistics.

Step 02: Proof that the five statistics are jointly sufficient

Fisher-Neyman Factorization Theorem

Let \({X_1},...,{X_n}\) be i.i.d. random variables with pdf \(f\left( {x;\theta } \right)\), and let \(T = r\left( {{X_1},...,{X_n}} \right)\) be a statistic. \(T\) is a sufficient statistic for \(\theta \) if and only if

\(f\left( {x;\theta } \right) = u\left( x \right)\nu \left( {r\left( x \right);\theta } \right)\) where \(u\,\,\,{\rm{and}}\,\,\,\nu \) are non-negative functions.
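As a quick numerical illustration of the theorem (an aside, not part of the original solution; the Bernoulli example is an assumption chosen for simplicity): for a Bernoulli(\(\theta\)) sample, the likelihood is \(\prod_i \theta^{x_i}(1-\theta)^{1-x_i} = \theta^t (1-\theta)^{n-t}\) with \(t = \sum x_i\), so we may take \(u(x)=1\) and \(\nu(t;\theta) = \theta^t(1-\theta)^{n-t}\). Consequently, any two samples with the same value of \(t\) have identical likelihoods at every \(\theta\):

```python
# Sketch: two Bernoulli samples with the same sufficient statistic t = sum(x)
# yield the same likelihood for every value of theta.
def bernoulli_lik(sample, theta):
    """Likelihood of an i.i.d. Bernoulli(theta) sample, computed point by point."""
    p = 1.0
    for x in sample:
        p *= theta**x * (1 - theta) ** (1 - x)
    return p

s1 = [1, 0, 1, 1, 0]  # t = 3
s2 = [0, 1, 1, 0, 1]  # t = 3, but the successes occur at different positions

for theta in (0.2, 0.5, 0.9):
    assert abs(bernoulli_lik(s1, theta) - bernoulli_lik(s2, theta)) < 1e-12
```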

The joint pdf of the bivariate normal distribution with unknown parameters \({\mu _1},{\mu _2},{\sigma _1},{\sigma _2},\rho \) is

\(f\left( {x,y} \right) = \frac{1}{{2\pi \sqrt {1 - {\rho ^2}} {\sigma _1}{\sigma _2}}}\exp \left\{ { - \frac{1}{{2\left( {1 - {\rho ^2}} \right)}}\left( {\frac{{{{\left( {x - \mu } \right)}^2}}}{{{\sigma _1}^2}} - 2\rho \frac{{\left( {x - \mu } \right)}}{{{\sigma _1}}}\frac{{\left( {y - \mu } \right)}}{{{\sigma _2}}} + \frac{{{{\left( {y - \mu } \right)}^2}}}{{{\sigma _2}^2}}} \right)} \right\}\)

The joint pdf of the whole sample is therefore

\(\begin{align}\prod\limits_{i = 1}^n {f\left( {{x_i},{y_i}} \right)} &= {\left( {\frac{1}{{2\pi {\sigma _1}{\sigma _2}\sqrt {1 - {\rho ^2}} }}} \right)^n}\exp \left( { - \frac{1}{{2\left( {1 - {\rho ^2}} \right)}}\left( {\frac{{\sum\limits_{i = 1}^n {{{\left( {{x_i} - {\mu _1}} \right)}^2}} }}{{\sigma _1^2}} - 2\rho \frac{{\sum\limits_{i = 1}^n {\left( {{x_i} - {\mu _1}} \right)\left( {{y_i} - {\mu _2}} \right)} }}{{{\sigma _1}{\sigma _2}}} + \frac{{\sum\limits_{i = 1}^n {{{\left( {{y_i} - {\mu _2}} \right)}^2}} }}{{\sigma _2^2}}} \right)} \right)\\ &= {\left( {\frac{1}{{2\pi {\sigma _1}{\sigma _2}\sqrt {1 - {\rho ^2}} }}} \right)^n}\exp \left( { - \frac{1}{{2\left( {1 - {\rho ^2}} \right)}}\left( {\frac{{\sum\limits_{i = 1}^n {x_i^2} - 2{\mu _1}\sum\limits_{i = 1}^n {{x_i}} + n\mu _1^2}}{{\sigma _1^2}} - 2\rho \frac{{\sum\limits_{i = 1}^n {{x_i}{y_i}} - {\mu _1}\sum\limits_{i = 1}^n {{y_i}} - {\mu _2}\sum\limits_{i = 1}^n {{x_i}} + n{\mu _1}{\mu _2}}}{{{\sigma _1}{\sigma _2}}} + \frac{{\sum\limits_{i = 1}^n {y_i^2} - 2{\mu _2}\sum\limits_{i = 1}^n {{y_i}} + n\mu _2^2}}{{\sigma _2^2}}} \right)} \right)\end{align}\)

The joint pdf depends on the observed values only through the following three sums:

\(\sum\limits_{i = 1}^n {{{\left( {{x_i} - {\mu _1}} \right)}^2}} ,\sum\limits_{i = 1}^n {{{\left( {{y_i} - {\mu _2}} \right)}^2},} \sum\limits_{i = 1}^n {\left( {{x_i} - {\mu _1}} \right)\left( {{y_i} - {\mu _2}} \right)} \)

Expanding each of these sums shows that each one depends on the data only through the five statistics in question:

\(\begin{align}\sum\limits_{i = 1}^n {{{\left( {{x_i} - {\mu _1}} \right)}^2}} &= \sum\limits_{i = 1}^n {x_i^2} - 2{\mu _1}\sum\limits_{i = 1}^n {{x_i}} + n\mu _1^2\\\sum\limits_{i = 1}^n {{{\left( {{y_i} - {\mu _2}} \right)}^2}} &= \sum\limits_{i = 1}^n {y_i^2} - 2{\mu _2}\sum\limits_{i = 1}^n {{y_i}} + n\mu _2^2\\\sum\limits_{i = 1}^n {\left( {{x_i} - {\mu _1}} \right)\left( {{y_i} - {\mu _2}} \right)} &= \sum\limits_{i = 1}^n {{x_i}{y_i}} - {\mu _2}\sum\limits_{i = 1}^n {{x_i}} - {\mu _1}\sum\limits_{i = 1}^n {{y_i}} + n{\mu _1}{\mu _2}\end{align}\)
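Substituting these expansions back into the joint pdf gives one explicit factorization (a sketch using the theorem's notation, with the shorthand \(t_1 = \sum x_i\), \(t_2 = \sum y_i\), \(t_3 = \sum x_i^2\), \(t_4 = \sum y_i^2\), \(t_5 = \sum x_i y_i\) introduced here for readability):

```latex
u(\mathbf{x},\mathbf{y}) = 1, \qquad
\nu(T;\theta) = \left(\frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\right)^{\!n}
\exp\!\left\{-\frac{1}{2(1-\rho^2)}\left(
\frac{t_3 - 2\mu_1 t_1 + n\mu_1^2}{\sigma_1^2}
- 2\rho\,\frac{t_5 - \mu_1 t_2 - \mu_2 t_1 + n\mu_1\mu_2}{\sigma_1\sigma_2}
+ \frac{t_4 - 2\mu_2 t_2 + n\mu_2^2}{\sigma_2^2}\right)\right\}
```

Every data-dependent quantity inside \(\nu\) is one of \(t_1,\dots,t_5\), which is exactly what the factorization theorem requires.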

Hence the joint pdf factors as \(u\left( x \right)\nu \left( {r\left( x \right);\theta } \right)\) with \(u \equiv 1\) and \(\nu \) equal to the entire expression above, which depends on the data only through

\(\sum\limits_{i = 1}^n {{X_i}} ,\sum\limits_{i = 1}^n {{Y_i}} ,\sum\limits_{i = 1}^n {X_i^2} ,\sum\limits_{i = 1}^n {Y_i^2} \,\,\,\,{\rm{and}}\,\,\,\,\sum\limits_{i = 1}^n {{X_i}{Y_i}} \)

By the Fisher–Neyman factorization theorem, these five statistics are jointly sufficient.
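The factorization above can be checked numerically: the log-likelihood computed point by point from the raw sample must equal the log-likelihood computed from \(n\) and the five statistics alone. This is a minimal sketch (function names and parameter values are illustrative assumptions, not from the original solution):

```python
import math
import random

def loglik_full(data, mu1, mu2, s1, s2, rho):
    """Bivariate normal log-likelihood computed directly from the raw pairs."""
    n = len(data)
    c = -n * math.log(2 * math.pi * s1 * s2 * math.sqrt(1 - rho**2))
    q = 0.0
    for x, y in data:
        zx = (x - mu1) / s1
        zy = (y - mu2) / s2
        q += zx**2 - 2 * rho * zx * zy + zy**2
    return c - q / (2 * (1 - rho**2))

def loglik_from_stats(n, sx, sy, sxx, syy, sxy, mu1, mu2, s1, s2, rho):
    """Same log-likelihood, using only n and the five sufficient statistics."""
    c = -n * math.log(2 * math.pi * s1 * s2 * math.sqrt(1 - rho**2))
    a = (sxx - 2 * mu1 * sx + n * mu1**2) / s1**2          # sum (x - mu1)^2 / s1^2
    b = (sxy - mu1 * sy - mu2 * sx + n * mu1 * mu2) / (s1 * s2)  # cross term
    d = (syy - 2 * mu2 * sy + n * mu2**2) / s2**2          # sum (y - mu2)^2 / s2^2
    return c - (a - 2 * rho * b + d) / (2 * (1 - rho**2))

random.seed(0)
data = [(random.gauss(1, 2), random.gauss(-1, 3)) for _ in range(50)]
sx = sum(x for x, _ in data)
sy = sum(y for _, y in data)
sxx = sum(x * x for x, _ in data)
syy = sum(y * y for _, y in data)
sxy = sum(x * y for x, y in data)

full = loglik_full(data, 0.5, -0.5, 2.0, 3.0, 0.4)
reduced = loglik_from_stats(len(data), sx, sy, sxx, syy, sxy, 0.5, -0.5, 2.0, 3.0, 0.4)
assert abs(full - reduced) < 1e-9  # the five statistics carry all the information
```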


