
Suppose that X has the t distribution with m degrees of freedom (m > 2). Show that Var(X) = m/(m − 2).

Hint: To evaluate \(E\left( {{X^2}} \right)\), restrict the integral to the positive half of the real line and change the variable from x to

\(y = \frac{{\frac{{{x^2}}}{m}}}{{1 + \frac{{{x^2}}}{m}}}\)

Compare the integral with the p.d.f. of a beta distribution. Alternatively, use Exercise 21 in Sec. 5.7.

Short Answer

Expert verified

\(Var\left( X \right) = \frac{m}{{m - 2}}\), as required.

Step by step solution

01

Given information

A random variable X has the t-distribution with m degrees of freedom, where m is greater than 2. Its probability density function is

\(f\left( x \right) = \frac{{\Gamma \left( {\frac{{m + 1}}{2}} \right)}}{{{{\left( {m\pi } \right)}^{\frac{1}{2}}}\Gamma \left( {\frac{m}{2}} \right)}}{\left( {1 + \frac{{{x^2}}}{m}} \right)^{ - \frac{{\left( {m + 1} \right)}}{2}}}\;for\; - \infty < x < \infty \) .
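As a sanity check on this density, one can verify numerically that it integrates to 1. The following is a minimal sketch assuming SciPy is available; the choice m = 5 is an arbitrary example.

```python
import math
from scipy.integrate import quad

def t_pdf(x, m):
    """p.d.f. of the t distribution with m degrees of freedom."""
    c = math.gamma((m + 1) / 2) / (math.sqrt(m * math.pi) * math.gamma(m / 2))
    return c * (1 + x ** 2 / m) ** (-(m + 1) / 2)

m = 5  # arbitrary example with m > 2
total, _ = quad(lambda x: t_pdf(x, m), -math.inf, math.inf)
# total should be very close to 1
```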

02

Evaluate the expectation of the random variable

Since the t distribution is symmetric about zero and its mean exists for \(m > 1\), \(E\left( X \right) = 0\).

Next, consider the expectation of the square of the random variable, \(E\left( {{X^2}} \right)\).

So,

\(\begin{align}E\left( {{X^2}} \right) &= c\int_{ - \infty }^\infty {{x^2}{{\left( {1 + \frac{{{x^2}}}{m}} \right)}^{ - \frac{{m + 1}}{2}}}dx} \\ &= 2c\int_0^\infty {{x^2}{{\left( {1 + \frac{{{x^2}}}{m}} \right)}^{ - \frac{{m + 1}}{2}}}dx} \end{align}\)

where \(c = \frac{{\Gamma \left( {\frac{{m + 1}}{2}} \right)}}{{{{\left( {m\pi } \right)}^{\frac{1}{2}}}\Gamma \left( {\frac{m}{2}} \right)}}\), and the second equality holds because the integrand is an even function of x.

Following the hint, make the substitution

\(\begin{align}y &= \frac{{\frac{{{x^2}}}{m}}}{{1 + \frac{{{x^2}}}{m}}}\\x &= {\left( {\frac{{my}}{{1 - y}}} \right)^{\frac{1}{2}}}\\\frac{{dx}}{{dy}} &= \frac{{\sqrt m }}{2}{y^{ - \frac{1}{2}}}{\left( {1 - y} \right)^{ - \frac{3}{2}}}\end{align}\)
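The derivative dx/dy claimed above can be checked against a central finite difference. This is only a numerical sketch; the evaluation point y = 0.3 and the value m = 5 are arbitrary choices.

```python
import math

def x_of_y(y, m):
    # the inverse substitution x = (m*y / (1 - y))**(1/2)
    return math.sqrt(m * y / (1 - y))

def dx_dy(y, m):
    # claimed derivative: (sqrt(m)/2) * y**(-1/2) * (1 - y)**(-3/2)
    return math.sqrt(m) / 2 * y ** -0.5 * (1 - y) ** -1.5

m, y0, h = 5, 0.3, 1e-6
# central difference approximation of dx/dy at y0
numeric = (x_of_y(y0 + h, m) - x_of_y(y0 - h, m)) / (2 * h)
```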

Now substitute x and dx into \(E\left( {{X^2}} \right)\), using \(1 + \frac{y}{{1 - y}} = \frac{1}{{1 - y}}\). The resulting integral is the beta function \(B\left( {\frac{3}{2},\frac{{m - 2}}{2}} \right)\), and the last steps use \(\Gamma \left( {\frac{3}{2}} \right) = \frac{{\sqrt \pi }}{2}\) and \(\Gamma \left( {\frac{m}{2}} \right) = \frac{{m - 2}}{2}\Gamma \left( {\frac{{m - 2}}{2}} \right)\):

\(\begin{align}E\left( {{X^2}} \right) &= 2c\int_0^\infty {{x^2}{{\left( {1 + \frac{{{x^2}}}{m}} \right)}^{ - \frac{{m + 1}}{2}}}dx} \\ &= \sqrt m \,c\int_0^1 {\frac{{my}}{{1 - y}}{{\left( {1 - y} \right)}^{\frac{{m + 1}}{2}}}{y^{ - \frac{1}{2}}}{{\left( {1 - y} \right)}^{ - \frac{3}{2}}}dy} \\ &= {m^{\frac{3}{2}}}c\int_0^1 {{y^{\frac{1}{2}}}{{\left( {1 - y} \right)}^{\frac{{m - 4}}{2}}}dy} \\ &= {m^{\frac{3}{2}}}c\,B\left( {\frac{3}{2},\frac{{m - 2}}{2}} \right)\\ &= {m^{\frac{3}{2}}}c\frac{{\Gamma \left( {\frac{3}{2}} \right)\Gamma \left( {\frac{{m - 2}}{2}} \right)}}{{\Gamma \left( {\frac{{m + 1}}{2}} \right)}}\\ &= m{\pi ^{ - \frac{1}{2}}}\Gamma \left( {\frac{3}{2}} \right)\frac{{\Gamma \left( {\frac{{m - 2}}{2}} \right)}}{{\Gamma \left( {\frac{m}{2}} \right)}}\\ &= m{\pi ^{ - \frac{1}{2}}} \cdot \frac{{\sqrt \pi }}{2} \cdot \frac{2}{{m - 2}}\\ &= \frac{m}{{m - 2}}\end{align}\)
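The whole chain can be confirmed by integrating \(x^2 f(x)\) directly against the t density and comparing with m/(m − 2). This is a numerical sketch assuming SciPy is available; m = 5 is an arbitrary example, for which m/(m − 2) = 5/3.

```python
import math
from scipy.integrate import quad

m = 5  # arbitrary example with m > 2
c = math.gamma((m + 1) / 2) / (math.sqrt(m * math.pi) * math.gamma(m / 2))

# E(X^2) computed directly from the t density
ex2, _ = quad(lambda x: x ** 2 * c * (1 + x ** 2 / m) ** (-(m + 1) / 2),
              -math.inf, math.inf)
# ex2 should be very close to m / (m - 2)
```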

03

Calculate the variance

The variance of the variable is,

\(\begin{align}Var\left( X \right) &= E\left( {{X^2}} \right) - {\left[ {E\left( X \right)} \right]^2}\\ &= \frac{m}{{m - 2}} - 0\\ &= \frac{m}{{m - 2}}\end{align}\)

Therefore, \(Var\left( X \right) = \frac{m}{{m - 2}}\), as required.
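For readers with SciPy installed, its built-in t distribution reports the same variance; this is a quick cross-check, not part of the proof, and m = 5 is again an arbitrary example.

```python
from scipy.stats import t

m = 5  # arbitrary example with m > 2
var = t(df=m).var()  # SciPy's variance of the t distribution with m d.f.
# var should equal m / (m - 2)
```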


Most popular questions from this chapter

Suppose that a random sample X1, . . . , Xn is to be taken from the uniform distribution on the interval (0, θ) and that θ is unknown. How large must a random sample be in order that \(P\left( {\left| {\max \left\{ {{X_1},...,{X_n}} \right\} - \theta } \right| \le 0.10} \right) \ge 0.95\) for all possible θ?

Question: Suppose that \({X_1},...,{X_n}\) form a random sample from a distribution for which the p.d.f. or the p.f. is f(x|θ), where the value of the parameter θ is unknown. Let \(X = \left( {{X_1},...,{X_n}} \right)\) and let T be a statistic. Assume that δ(X) is an unbiased estimator of θ and that its conditional distribution given T does not depend on θ. (If T is a sufficient statistic as defined in Sec. 7.7, then this will be true for every estimator δ. The condition also holds in other examples.) Let \({\delta _0}\left( T \right)\) denote the conditional mean of δ(X) given T.

a. Show that\({{\bf{\delta }}_{\bf{0}}}\left( {\bf{T}} \right)\)is also an unbiased estimator of θ.

b. Show that\({\bf{Va}}{{\bf{r}}_{\bf{\theta }}}\left( {{{\bf{\delta }}_{\bf{0}}}} \right) \le {\bf{Va}}{{\bf{r}}_{\bf{\theta }}}\left( {\bf{\delta }} \right)\)for every possible value of θ. Hint: Use the result of Exercise 11 in Sec. 4.7.

Suppose that \({X_1}, \ldots ,{X_n}\) form a random sample from the normal distribution with unknown mean μ and unknown variance \({\sigma ^2}\), and let the random variable L denote the length of the shortest confidence interval for μ that can be constructed from the observed values in the sample. Find the value of \(E\left( {{L^2}} \right)\) for the following values of the sample size n and the confidence coefficient \(\gamma \):

a. n = 5, γ = 0.95
b. n = 10, γ = 0.95
c. n = 30, γ = 0.95
d. n = 8, γ = 0.90
e. n = 8, γ = 0.95
f. n = 8, γ = 0.99

Suppose that a point (X, Y, Z) is to be chosen at random in three-dimensional space, where X, Y, and Z are independent random variables, each having the standard normal distribution. What is the probability that the distance from the origin to the point will be less than 1 unit?

Question: Suppose that a random variable X has a normal distribution for which the mean μ is unknown (−∞ < μ < ∞) and the variance σ² is known. Let \(f\left( {x\left| \mu \right.} \right)\) denote the p.d.f. of X, and let \(f'\left( {x\left| \mu \right.} \right)\) and \(f''\left( {x\left| \mu \right.} \right)\) denote the first and second partial derivatives with respect to μ. Show that

\(\int_{ - \infty }^\infty {f'\left( {x\left| \mu \right.} \right)} \,dx = 0\,\,{\rm{and}}\,\,\int_{ - \infty }^\infty {f''\left( {x\left| \mu \right.} \right)} \,dx = 0.\)
