Chapter 9: Problem 5
Let \(X_{1}\) and \(X_{2}\) be two independent random variables. Let \(X_{1}\) and \(Y=X_{1}+X_{2}\) be \(\chi^{2}\left(r_{1}, \theta_{1}\right)\) and \(\chi^{2}(r, \theta)\), respectively. Here \(r_{1}<r\) and \(\theta_{1} \leq \theta\). Show that \(X_{2}\) is \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\).
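The factorization of mgfs gives a quick sketch, assuming \(\chi^{2}(r, \theta)\) denotes the noncentral chi-square distribution with \(r\) degrees of freedom and noncentrality \(\theta\), whose mgf is \(M(t)=(1-2 t)^{-r / 2} \exp \{\theta t /(1-2 t)\}\) for \(t<\frac{1}{2}\). By independence, \(M_{Y}(t)=M_{X_{1}}(t) M_{X_{2}}(t)\), so
$$ M_{X_{2}}(t)=\frac{M_{Y}(t)}{M_{X_{1}}(t)}=\frac{(1-2 t)^{-r / 2} \exp \{\theta t /(1-2 t)\}}{(1-2 t)^{-r_{1} / 2} \exp \left\{\theta_{1} t /(1-2 t)\right\}}=(1-2 t)^{-\left(r-r_{1}\right) / 2} \exp \left\{\frac{\left(\theta-\theta_{1}\right) t}{1-2 t}\right\}, $$
which is the mgf of a \(\chi^{2}\left(r-r_{1}, \theta-\theta_{1}\right)\) random variable; uniqueness of mgfs completes the argument.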
Let \(\mathbf{A}=\left[a_{i j}\right]\) be a real symmetric matrix. Prove that \(\sum_{i} \sum_{j} a_{i j}^{2}\) is equal to the sum of the squares of the eigenvalues of \(\mathbf{A}\). Hint: If \(\boldsymbol{\Gamma}\) is an orthogonal matrix, show that \(\sum_{j} \sum_{i} a_{i j}^{2}=\operatorname{tr}\left(\mathbf{A}^{2}\right)=\operatorname{tr}\left(\mathbf{\Gamma}^{\prime} \mathbf{A}^{2} \mathbf{\Gamma}\right)=\) \(\operatorname{tr}\left[\left(\mathbf{\Gamma}^{\prime} \mathbf{A} \mathbf{\Gamma}\right)\left(\mathbf{\Gamma}^{\prime} \mathbf{A} \boldsymbol{\Gamma}\right)\right]\)
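The hint's chain can be sketched as follows, assuming the spectral decomposition \(\mathbf{A}=\boldsymbol{\Gamma} \boldsymbol{\Lambda} \boldsymbol{\Gamma}^{\prime}\), where \(\boldsymbol{\Gamma}\) is orthogonal and \(\boldsymbol{\Lambda}=\operatorname{diag}\left(\lambda_{1}, \ldots, \lambda_{n}\right)\) holds the eigenvalues of \(\mathbf{A}\):
$$ \sum_{i} \sum_{j} a_{i j}^{2}=\operatorname{tr}\left(\mathbf{A} \mathbf{A}^{\prime}\right)=\operatorname{tr}\left(\mathbf{A}^{2}\right)=\operatorname{tr}\left(\boldsymbol{\Gamma}^{\prime} \mathbf{A}^{2} \boldsymbol{\Gamma}\right)=\operatorname{tr}\left[\left(\boldsymbol{\Gamma}^{\prime} \mathbf{A} \boldsymbol{\Gamma}\right)\left(\boldsymbol{\Gamma}^{\prime} \mathbf{A} \boldsymbol{\Gamma}\right)\right]=\operatorname{tr}\left(\boldsymbol{\Lambda}^{2}\right)=\sum_{i} \lambda_{i}^{2}, $$
using \(\mathbf{A}^{\prime}=\mathbf{A}\), the cyclic property \(\operatorname{tr}\left(\boldsymbol{\Gamma}^{\prime} \mathbf{A}^{2} \boldsymbol{\Gamma}\right)=\operatorname{tr}\left(\mathbf{A}^{2} \boldsymbol{\Gamma} \boldsymbol{\Gamma}^{\prime}\right)=\operatorname{tr}\left(\mathbf{A}^{2}\right)\), and \(\boldsymbol{\Gamma}^{\prime} \mathbf{A} \boldsymbol{\Gamma}=\boldsymbol{\Lambda}\) for the diagonalizing \(\boldsymbol{\Gamma}\).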
If \(A_{1}, A_{2}, \ldots, A_{k}\) are events, prove, by induction, Boole's inequality $$ P\left(A_{1} \cup A_{2} \cup \cdots \cup A_{k}\right) \leq \sum_{1}^{k} P\left(A_{i}\right) $$ Then show that $$ P\left(A_{1}^{c} \cap A_{2}^{c} \cap \cdots \cap A_{k}^{c}\right) \geq 1-\sum_{1}^{k} P\left(A_{i}\right) $$
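The induction step needs only the two-event bound \(P(A \cup B) \leq P(A)+P(B)\), which follows from \(P(A \cup B)=P(A)+P(B)-P(A \cap B)\). Writing \(B_{k}=A_{1} \cup \cdots \cup A_{k-1}\) and assuming the inequality holds for \(k-1\) events,
$$ P\left(B_{k} \cup A_{k}\right) \leq P\left(B_{k}\right)+P\left(A_{k}\right) \leq \sum_{1}^{k-1} P\left(A_{i}\right)+P\left(A_{k}\right)=\sum_{1}^{k} P\left(A_{i}\right). $$
The second inequality is then De Morgan's law plus complementation: \(P\left(A_{1}^{c} \cap \cdots \cap A_{k}^{c}\right)=1-P\left(A_{1} \cup \cdots \cup A_{k}\right) \geq 1-\sum_{1}^{k} P\left(A_{i}\right)\).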
Let \(Q=X_{1} X_{2}-X_{3} X_{4}\), where \(X_{1}, X_{2}, X_{3}, X_{4}\) is a random sample of size 4 from a distribution which is \(N\left(0, \sigma^{2}\right) .\) Show that \(Q / \sigma^{2}\) does not have a chi-square distribution. Find the mgf of \(Q / \sigma^{2}\).
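A sketch of the mgf computation, conditioning on one factor and using \(E\left[e^{s Z^{2}}\right]=(1-2 s)^{-1 / 2}\) for \(Z\) standard normal: for \(|t|<1\),
$$ E\left[e^{t X_{1} X_{2} / \sigma^{2}}\right]=E\left[E\left[e^{t X_{1} X_{2} / \sigma^{2}} \mid X_{2}\right]\right]=E\left[e^{t^{2} X_{2}^{2} /\left(2 \sigma^{2}\right)}\right]=\left(1-t^{2}\right)^{-1 / 2}. $$
Since \(-X_{3} X_{4}\) is independent of \(X_{1} X_{2}\) and has the same (symmetric) distribution, \(M_{Q / \sigma^{2}}(t)=\left(1-t^{2}\right)^{-1}\) for \(|t|<1\), which is not of the chi-square form \((1-2 t)^{-r / 2}\).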
Assume that the sample \(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n}, Y_{n}\right)\) follows the linear model \((9.6.1)\). Suppose \(Y_{0}\) is a future observation at \(x=x_{0}-\bar{x}\) and we want to determine a predictive interval for it. Assume that the model \((9.6.1)\) holds for \(Y_{0}\); i.e., \(Y_{0}\) has a \(N\left(\alpha+\beta\left(x_{0}-\bar{x}\right), \sigma^{2}\right)\) distribution. We will use \(\hat{\eta}_{0}\) of Exercise \(9.6.4\) as our prediction of \(Y_{0}\).
(a) Obtain the distribution of \(Y_{0}-\hat{\eta}_{0}\). Use the fact that the future observation \(Y_{0}\) is independent of the sample \(\left(x_{1}, Y_{1}\right), \ldots,\left(x_{n}, Y_{n}\right)\).
(b) Determine a \(t\)-statistic with numerator \(Y_{0}-\hat{\eta}_{0}\).
(c) Now beginning with \(1-\alpha=P\left(-t_{\alpha / 2, n-2}<T<t_{\alpha / 2, n-2}\right)\), where \(T\) is the \(t\)-statistic of part (b), determine a \((1-\alpha) 100 \%\) predictive interval for \(Y_{0}\).
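A sketch for part (a), assuming model \((9.6.1)\) is \(Y_{i}=\alpha+\beta\left(x_{i}-\bar{x}\right)+e_{i}\) with independent \(N\left(0, \sigma^{2}\right)\) errors and \(\hat{\eta}_{0}=\hat{\alpha}+\hat{\beta}\left(x_{0}-\bar{x}\right)\): because \(Y_{0}\) is independent of \(\hat{\eta}_{0}\) and both have mean \(\alpha+\beta\left(x_{0}-\bar{x}\right)\), the variances add, giving
$$ Y_{0}-\hat{\eta}_{0} \sim N\left(0, \sigma^{2}\left[1+\frac{1}{n}+\frac{\left(x_{0}-\bar{x}\right)^{2}}{\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}}\right]\right). $$
Replacing \(\sigma^{2}\) by the usual estimator on \(n-2\) degrees of freedom in the standardized version then yields the \(t\)-statistic of part (b).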
The driver of a diesel-powered automobile decided to test the quality of three types of diesel fuel sold in the area based on mpg. Test the null hypothesis that the three means are equal using the following data. Make the usual assumptions and take \(\alpha=0.05\). $$ \begin{array}{llllll} \text { Brand A: } & 38.7 & 39.2 & 40.1 & 38.9 & \\ \text { Brand B: } & 41.9 & 42.3 & 41.3 & & \\ \text { Brand C: } & 40.8 & 41.2 & 39.5 & 38.9 & 40.3 \end{array} $$
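The arithmetic can be checked numerically. Below is a minimal sketch of the one-way ANOVA \(F\)-statistic for these data (the function name `anova_f` is mine, not from the text):

```python
def anova_f(groups):
    """One-way ANOVA: return (F, df_between, df_within)."""
    k = len(groups)                      # number of treatments
    n = sum(len(g) for g in groups)      # total sample size
    grand_mean = sum(sum(g) for g in groups) / n
    # between-treatment and within-treatment sums of squares
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n - k
    return (ssb / df_b) / (ssw / df_w), df_b, df_w

brands = [
    [38.7, 39.2, 40.1, 38.9],            # Brand A
    [41.9, 42.3, 41.3],                  # Brand B
    [40.8, 41.2, 39.5, 38.9, 40.3],      # Brand C
]
f_stat, df1, df2 = anova_f(brands)
print(f"F({df1},{df2}) = {f_stat:.2f}")  # compare with F_{0.05}(2, 9) = 4.26
```

With these data the statistic works out to about 10.22, which exceeds the 5% critical value \(F_{0.05}(2,9) \approx 4.26\), so the null hypothesis of equal means is rejected at \(\alpha=0.05\).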