Chapter 3: Problem 17
Find the uniform distribution of the continuous type on the interval \((b, c)\) that has the same mean and the same variance as those of a chi-square distribution with 8 degrees of freedom. That is, find \(b\) and \(c\).
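A chi-square distribution with \(r\) degrees of freedom has mean \(r\) and variance \(2r\), while a uniform distribution on \((b, c)\) has mean \((b+c)/2\) and variance \((c-b)^2/12\). Matching these two pairs of moments for \(r = 8\) gives \(b = 8 - 4\sqrt{3}\) and \(c = 8 + 4\sqrt{3}\). A short numerical check of this matching (a sketch, using only the standard moment formulas above):

```python
import math

# Chi-square with r = 8 degrees of freedom: mean = r, variance = 2r.
r = 8
mean, var = r, 2 * r

# Uniform(b, c): mean = (b + c)/2, variance = (c - b)^2 / 12.
# Matching moments: b + c = 2*mean and c - b = sqrt(12*var).
half_width = math.sqrt(12 * var) / 2   # = 4*sqrt(3)
b = mean - half_width                  # 8 - 4*sqrt(3) ≈ 1.0718
c = mean + half_width                  # 8 + 4*sqrt(3) ≈ 14.9282

print(b, c)
print((b + c) / 2, (c - b) ** 2 / 12)  # recovers mean 8.0 and variance 16.0
```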
A certain job is completed in three steps in series. The means and standard deviations for the steps are (in minutes): $$ \begin{array}{ccc} \hline \text { Step } & \text { Mean } & \text { Standard Deviation } \\ \hline 1 & 17 & 2 \\ 2 & 13 & 1 \\ 3 & 13 & 2 \\ \hline \end{array} $$Assuming independent steps and normal distributions, compute the probability that the job will take less than 40 minutes to complete.
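Since the steps are independent and normal, the total time is normal with mean \(17+13+13 = 43\) and variance \(2^2+1^2+2^2 = 9\). The required probability is \(\Phi((40-43)/3) = \Phi(-1)\). A quick check of this computation (a sketch using the standard normal cdf via the error function):

```python
from math import erf, sqrt

# Total time T = X1 + X2 + X3; independent normal steps add:
mu = 17 + 13 + 13                  # mean of T: 43 minutes
sigma = sqrt(2**2 + 1**2 + 2**2)   # sd of T: sqrt(9) = 3 minutes

# P(T < 40) = Phi((40 - 43)/3) = Phi(-1)
z = (40 - mu) / sigma
prob = 0.5 * (1 + erf(z / sqrt(2)))
print(prob)                        # ≈ 0.1587
```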
Determine the constant \(c\) so that \(f(x)=c x(3-x)^{4}\), \(0<x<3\), zero elsewhere, is a pdf.
Let \(X\) have a conditional Burr distribution with fixed parameters \(\beta\) and \(\tau\), given parameter \(\alpha\). (a) If \(\alpha\) has the geometric pdf \(p(1-p)^{\alpha}, \alpha=0,1,2, \ldots\), show that the unconditional distribution of \(X\) is a Burr distribution. (b) If \(\alpha\) has the exponential pdf \(\beta^{-1} e^{-\alpha / \beta}, \alpha>0\), find the unconditional pdf of \(X\).
Let $$ p\left(x_{1}, x_{2}\right)=\left(\begin{array}{l} x_{1} \\ x_{2} \end{array}\right)\left(\frac{1}{2}\right)^{x_{1}}\left(\frac{x_{1}}{15}\right), \begin{array}{r} x_{2}=0,1, \ldots, x_{1} \\ x_{1}=1,2,3,4,5 \end{array} $$ zero elsewhere, be the joint pmf of \(X_{1}\) and \(X_{2}\). Determine: (a) \(E\left(X_{2}\right)\). (b) \(u\left(x_{1}\right)=E\left(X_{2} \mid x_{1}\right)\). (c) \(E\left[u\left(X_{1}\right)\right]\). Compare the answers of Parts (a) and (c).
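Because the pmf has finite support, all three expectations can be checked by direct summation. Conditionally on \(X_1 = x_1\), \(X_2\) is binomial\((x_1, 1/2)\), so \(u(x_1) = x_1/2\), and parts (a) and (c) agree by the law of total expectation. A brute-force verification sketch:

```python
from math import comb

def p(x1, x2):
    """Joint pmf: binom(x1, x2) * (1/2)^x1 * (x1/15)."""
    return comb(x1, x2) * 0.5 ** x1 * (x1 / 15)

# (a) E(X2) by direct summation over the finite support.
E_x2 = sum(x2 * p(x1, x2) for x1 in range(1, 6) for x2 in range(x1 + 1))

# (b) u(x1) = E(X2 | x1); numerically this recovers x1/2.
def u(x1):
    marg = sum(p(x1, x2) for x2 in range(x1 + 1))  # marginal pmf of X1: x1/15
    return sum(x2 * p(x1, x2) for x2 in range(x1 + 1)) / marg

# (c) E[u(X1)], which equals E(X2) by the law of total expectation.
E_u = sum(u(x1) * (x1 / 15) for x1 in range(1, 6))

print(E_x2, E_u)   # both equal 11/6 ≈ 1.8333
```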
Readers may have encountered the multiple regression model in a previous course in statistics. We can briefly write it as follows. Suppose we have a vector of \(n\) observations \(\mathbf{Y}\) which has the distribution \(N_{n}\left(\mathbf{X} \boldsymbol{\beta}, \sigma^{2} \mathbf{I}\right)\), where \(\mathbf{X}\) is an \(n \times p\) matrix of known values, which has full column rank \(p\), and \(\boldsymbol{\beta}\) is a \(p \times 1\) vector of unknown parameters. The least squares estimator of \(\boldsymbol{\beta}\) is $$ \widehat{\boldsymbol{\beta}}=\left(\mathbf{X}^{\prime} \mathbf{X}\right)^{-1} \mathbf{X}^{\prime} \mathbf{Y} $$
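The estimator \(\widehat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y}\) can be illustrated numerically. The data below (sample size, predictors, and true coefficients) are assumed purely for illustration; any full-column-rank \(\mathbf{X}\) works:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed): n = 50 observations, p = 3 columns
# (an intercept column plus two random predictors).
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([2.0, -1.0, 0.5])
Y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Least squares estimator: beta_hat = (X'X)^{-1} X'Y,
# computed by solving the normal equations (X'X) beta = X'Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)   # close to beta_true
```

Solving the normal equations rather than explicitly inverting \(\mathbf{X}'\mathbf{X}\) is the standard numerically stabler route to the same estimator.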