Problem 9


Here \(Q_{1}\) and \(Q_{2}\) are quadratic forms in observations of a random sample from \(N(0,1)\). If \(Q_{1}\) and \(Q_{2}\) are independent and if \(Q_{1}+Q_{2}\) has a chi-square distribution, prove that \(Q_{1}\) and \(Q_{2}\) are chi-square variables.

Short Answer

Because \(Q_{1}\) and \(Q_{2}\) are independent, the moment-generating function of \(Q_{1}+Q_{2}\) factors as \(M_{Q_{1}}(t) M_{Q_{2}}(t)=(1-2t)^{-r/2}\). The mgf of a quadratic form in \(N(0,1)\) observations is \(\prod_{i}(1-2\lambda_{i}t)^{-1/2}\) over the eigenvalues of its matrix, and matching the two sides forces every eigenvalue of each matrix to be \(0\) or \(1\). Hence \(Q_{1}\) and \(Q_{2}\) are chi-square variables whose degrees of freedom sum to \(r\).

Step by step solution

01

Understanding the Chi-square Distribution

A chi-square distribution with \(k\) degrees of freedom, \(\chi^{2}(k)\), is the distribution of a sum of squares of \(k\) independent standard normal variables. Its moment-generating function is \(M(t)=(1-2t)^{-k/2}\) for \(t<1/2\), a fact used repeatedly below.
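As a quick empirical illustration of this definition (a minimal sketch assuming NumPy is available; the sample size and seed are arbitrary choices), a sum of \(k\) squared standard normals should have mean \(k\) and variance \(2k\):

```python
import numpy as np

rng = np.random.default_rng(42)
k, n = 5, 200_000

# Sum of squares of k independent N(0,1) draws, repeated n times.
q = (rng.standard_normal((n, k)) ** 2).sum(axis=1)

# A chi-square(k) variable has mean k and variance 2k.
print(q.mean(), q.var())   # should be close to 5 and 10
```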
02

Writing \(Q_{1}\) and \(Q_{2}\) as Quadratic Forms

Let \(\mathbf{X}=(X_{1}, X_{2}, \ldots, X_{n})'\) collect the sample observations, so that \(\mathbf{X} \sim N(\mathbf{0}, \mathbf{I})\), and write \(Q_{1}=\mathbf{X}' \mathbf{A}_{1} \mathbf{X}\) and \(Q_{2}=\mathbf{X}' \mathbf{A}_{2} \mathbf{X}\) with \(\mathbf{A}_{1}\) and \(\mathbf{A}_{2}\) symmetric. We may not assume that \(Q_{1}\) and \(Q_{2}\) are sums of squares of disjoint subsets of the \(X_{i}\): that would already make them chi-square, which is the very thing to be proved. What we do know is that the moment-generating function of a quadratic form \(\mathbf{X}' \mathbf{A} \mathbf{X}\) is $$ M(t)=\operatorname{det}(\mathbf{I}-2t\mathbf{A})^{-1/2}=\prod_{i=1}^{n}\left(1-2 \lambda_{i} t\right)^{-1/2}, $$ where \(\lambda_{1}, \ldots, \lambda_{n}\) are the eigenvalues of \(\mathbf{A}\) and \(t\) is near enough to \(0\) that every factor is positive.
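This mgf formula is worth a one-line derivation, since the rest of the argument rests on it. Diagonalize \(\mathbf{A}=\mathbf{P} \boldsymbol{\Lambda} \mathbf{P}'\) with \(\mathbf{P}\) orthogonal; then \(\mathbf{Y}=\mathbf{P}' \mathbf{X} \sim N(\mathbf{0}, \mathbf{I})\) and $$ Q=\mathbf{X}' \mathbf{A} \mathbf{X}=\sum_{i=1}^{n} \lambda_{i} Y_{i}^{2}, \qquad M(t)=\prod_{i=1}^{n} E\left[e^{t \lambda_{i} Y_{i}^{2}}\right]=\prod_{i=1}^{n}\left(1-2 \lambda_{i} t\right)^{-1/2}, $$ the last step using the \(\chi^{2}(1)\) mgf \((1-2s)^{-1/2}\) evaluated at \(s=\lambda_{i} t\).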
03

Factoring the Moment-Generating Function

Because \(Q_{1}\) and \(Q_{2}\) are independent, the mgf of their sum factors: \(M_{Q_{1}+Q_{2}}(t)=M_{Q_{1}}(t) M_{Q_{2}}(t)\). The hypothesis that \(Q_{1}+Q_{2}\) is chi-square, say with \(r\) degrees of freedom, then gives \(M_{Q_{1}}(t) M_{Q_{2}}(t)=(1-2t)^{-r/2}\). Writing each mgf in its eigenvalue form, squaring, and taking reciprocals yields $$ \prod_{i=1}^{n}\left(1-2 \lambda_{i} t\right) \prod_{j=1}^{n}\left(1-2 \mu_{j} t\right)=(1-2t)^{r}, $$ where \(\lambda_{i}\) and \(\mu_{j}\) are the eigenvalues of \(\mathbf{A}_{1}\) and \(\mathbf{A}_{2}\). Both sides are polynomials in \(t\) that agree on an interval around \(0\), hence agree identically. The right side vanishes only at \(t=1/2\), so every nonzero \(\lambda_{i}\) and \(\mu_{j}\) must equal \(1\), and counting multiplicities there are exactly \(r\) unit eigenvalues in all. A symmetric matrix whose eigenvalues are all \(0\) or \(1\) is idempotent, and a quadratic form with a symmetric idempotent matrix of rank \(r_{i}\) is \(\chi^{2}(r_{i})\). Hence \(Q_{1} \sim \chi^{2}(r_{1})\) and \(Q_{2} \sim \chi^{2}(r_{2})\) with \(r_{1}+r_{2}=r\).
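A numerical sanity check of this factorization (a minimal sketch assuming NumPy; the rank-2 and rank-3 projection matrices \(\mathbf{A}_{1}\) and \(\mathbf{A}_{2}\) below are hypothetical choices, giving \(r=5\)):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Hypothetical symmetric idempotent matrices acting on disjoint
# coordinates, so the two forms are independent and r = 2 + 3.
A1 = np.zeros((n, n)); A1[:2, :2] = np.eye(2)   # rank 2
A2 = np.zeros((n, n)); A2[2:, 2:] = np.eye(3)   # rank 3

X = rng.standard_normal((200_000, n))
Q1 = np.einsum('bi,ij,bj->b', X, A1, X)
Q2 = np.einsum('bi,ij,bj->b', X, A2, X)

t = 0.2  # any t < 1/2 keeps the mgfs finite
lhs = np.mean(np.exp(t * Q1)) * np.mean(np.exp(t * Q2))
rhs = (1 - 2 * t) ** (-5 / 2)   # (1 - 2t)^{-r/2} with r = 5
print(lhs, rhs)                 # the two should nearly agree
```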
04

The Role of the Independence Hypothesis

Note that the independence of \(Q_{1}\) and \(Q_{2}\) is part of the hypothesis, not something to be derived: it is precisely what licenses the factorization \(M_{Q_{1}+Q_{2}}(t)=M_{Q_{1}}(t) M_{Q_{2}}(t)\) in Step 03. (In this chapter's setting such independence is usually established through Craig's theorem: quadratic forms \(\mathbf{X}' \mathbf{A}_{1} \mathbf{X}\) and \(\mathbf{X}' \mathbf{A}_{2} \mathbf{X}\) in \(N(0,1)\) observations are independent if and only if \(\mathbf{A}_{1} \mathbf{A}_{2}=\mathbf{0}\).)
05

Conclusion

In conclusion, if \(Q_{1}\) and \(Q_{2}\) are independent quadratic forms in the observations of a random sample from \(N(0,1)\) and \(Q_{1}+Q_{2}\) has a chi-square distribution with \(r\) degrees of freedom, then the mgf factorization forces the matrix of each form to be symmetric and idempotent. Therefore \(Q_{1} \sim \chi^{2}(r_{1})\) and \(Q_{2} \sim \chi^{2}(r_{2})\), where \(r_{1}\) and \(r_{2}\) are the ranks of the two matrices and \(r_{1}+r_{2}=r\).
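To close the loop empirically (again a sketch assuming NumPy and SciPy; the projection built here via a QR decomposition is an arbitrary illustrative choice), a quadratic form with a symmetric idempotent matrix of rank \(r_{1}\) should pass a goodness-of-fit test against \(\chi^{2}(r_{1})\):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, r1 = 6, 2

# An arbitrary rank-2 orthogonal projection: symmetric, idempotent.
B, _ = np.linalg.qr(rng.standard_normal((n, r1)))
A1 = B @ B.T                     # A1 @ A1 == A1 and rank(A1) == r1

X = rng.standard_normal((100_000, n))
Q1 = np.einsum('bi,ij,bj->b', X, A1, X)

# Kolmogorov-Smirnov test: Q1 should look exactly chi-square(r1).
print(stats.kstest(Q1, stats.chi2(df=r1).cdf))
```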


Most popular questions from this chapter

Using the notation of Section 9.2, assume that the means \(\mu_{j}\) satisfy a linear function of \(j\), namely, \(\mu_{j}=c+d[j-(b+1)/2]\). Let independent random samples of size \(a\) be taken from the \(b\) normal distributions having means \(\mu_{1}, \mu_{2}, \ldots, \mu_{b}\), respectively, and common unknown variance \(\sigma^{2}\). (a) Show that the maximum likelihood estimators of \(c\) and \(d\) are, respectively, \(\hat{c}=\bar{X}_{..}\) and $$ \hat{d}=\frac{\sum_{j=1}^{b}[j-(b+1)/2]\left(\bar{X}_{.j}-\bar{X}_{..}\right)}{\sum_{j=1}^{b}[j-(b+1)/2]^{2}} $$ (b) Show that $$ \sum_{i=1}^{a} \sum_{j=1}^{b}\left(X_{ij}-\bar{X}_{..}\right)^{2}=\sum_{i=1}^{a} \sum_{j=1}^{b}\left[X_{ij}-\bar{X}_{..}-\hat{d}\left(j-\frac{b+1}{2}\right)\right]^{2}+\hat{d}^{2} \sum_{j=1}^{b} a\left(j-\frac{b+1}{2}\right)^{2} $$ (c) Argue that the two terms in the right-hand member of part (b), once divided by \(\sigma^{2}\), are independent random variables with \(\chi^{2}\) distributions, provided that \(d=0\). (d) What \(F\)-statistic would be used to test the equality of the means, that is, \(H_{0}: d=0\)?

Show that the square of a noncentral \(T\) random variable is a noncentral \(F\) random variable.

Let the \(4 \times 1\) matrix \(\boldsymbol{Y}\) be multivariate normal \(N\left(\boldsymbol{X} \boldsymbol{\beta}, \sigma^{2} \boldsymbol{I}\right)\), where the \(4 \times 3\) matrix \(\boldsymbol{X}\) equals $$ \boldsymbol{X}=\left[\begin{array}{rrr} 1 & 1 & 2 \\ 1 & -1 & 2 \\ 1 & 0 & -3 \\ 1 & 0 & -1 \end{array}\right] $$ and \(\boldsymbol{\beta}\) is the \(3 \times 1\) regression coefficient matrix. (a) Find the mean matrix and the covariance matrix of \(\hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}\). (b) If we observe \(\boldsymbol{Y}^{\prime}\) to be equal to \((6,1,11,3)\), compute \(\hat{\boldsymbol{\beta}}\).
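For part (b), the arithmetic can be checked numerically; a minimal sketch assuming NumPy that simply evaluates the stated formula \(\hat{\boldsymbol{\beta}}=\left(\boldsymbol{X}^{\prime} \boldsymbol{X}\right)^{-1} \boldsymbol{X}^{\prime} \boldsymbol{Y}\) with the given matrices:

```python
import numpy as np

X = np.array([[1,  1,  2],
              [1, -1,  2],
              [1,  0, -3],
              [1,  0, -1]], dtype=float)
y = np.array([6.0, 1.0, 11.0, 3.0])

# Least-squares estimate: solve (X'X) beta = X'y rather than
# forming the inverse explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)
```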

The driver of a diesel-powered automobile decided to test the quality of three types of diesel fuel sold in the area based on mpg. Test the null hypothesis that the three means are equal using the following data. Make the usual assumptions and take \(\alpha=0.05\).

Brand A: 38.7 39.2 40.1 38.9
Brand B: 41.9 42.3 41.3
Brand C: 40.8 41.2 39.5 38.9 40.3
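One way to check the hand computation (a sketch assuming SciPy; its f_oneway routine carries out exactly this one-way ANOVA \(F\)-test, and \(H_{0}\) is rejected at \(\alpha=0.05\) when the p-value falls below 0.05):

```python
from scipy import stats

brand_a = [38.7, 39.2, 40.1, 38.9]
brand_b = [41.9, 42.3, 41.3]
brand_c = [40.8, 41.2, 39.5, 38.9, 40.3]

# One-way ANOVA F-test of equal mean mpg across the three fuels.
f_stat, p_value = stats.f_oneway(brand_a, brand_b, brand_c)
print(f_stat, p_value)
```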

If \(A_{1}, A_{2}, \ldots, A_{k}\) are events, prove, by induction, Boole's inequality $$ P\left(A_{1} \cup A_{2} \cup \cdots \cup A_{k}\right) \leq \sum_{1}^{k} P\left(A_{i}\right) $$ Then show that $$ P\left(A_{1}^{c} \cap A_{2}^{c} \cap \cdots \cap A_{k}^{c}\right) \geq 1-\sum_{1}^{k} P\left(A_{i}\right) $$
