

Assume that the random variables \({X_1}, \ldots ,{X_n}\) form a random sample of size \(n\) from the distribution specified in that exercise, and show that the statistic \(T\) specified in the exercise is a sufficient statistic for the parameter.

6. The gamma distribution with parameters \(\alpha \) and \(\beta \), where the value of \(\beta \) is known and the value of \(\alpha \) is unknown \(\left( {\alpha > 0} \right)\); \(T = \prod\limits_{i = 1}^n {{X_i}} \).

Short Answer


The statistic \(T = \prod\limits_{i = 1}^n {{X_i}} \) is a sufficient statistic for the parameter \(\alpha \).

Step by step solution

01

Defining a sufficient statistic

Let \({X_1}, \ldots ,{X_n}\) be a random sample of size \(n\) from the specified distribution with density function \(f\left( {x,\theta } \right)\), where the parameter \(\theta \) is unknown.

A statistic \(T = T\left( {{X_1}, \ldots ,{X_n}} \right)\) is said to be a sufficient statistic for \(\theta \) if the conditional joint distribution of \({X_1}, \ldots ,{X_n}\), given any value \(t\) of \(T\), does not depend on \(\theta \).

02

Stating the factorization theorem

Let \({X_1}, \ldots ,{X_n}\) be a random sample of size \(n\) from the specified distribution with density function \(f\left( {x,\theta } \right)\), where \(\theta \) is unknown. A statistic \(T = T\left( {{X_1}, \ldots ,{X_n}} \right)\) is a sufficient statistic for \(\theta \) if and only if the likelihood factors as \(L\left( {x,\theta } \right) = g\left( {t,\theta } \right)h\left( x \right)\).

Here \(L\left( {x,\theta } \right)\) is the likelihood function of \({X_1}, \ldots ,{X_n}\),

\(g\left( {t,\theta } \right)\) is a function that depends on the data only through \(t = T\left( x \right)\) and may depend on \(\theta \), and

\(h\left( x \right)\) is a function that does not depend on \(\theta \).
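As a concrete illustration of the theorem (a standard example, not part of this exercise), consider a Bernoulli(\(p\)) sample:

```latex
L(x, p) = \prod_{i=1}^{n} p^{x_i}(1-p)^{1-x_i}
        = \underbrace{p^{\,t}(1-p)^{\,n-t}}_{g(t,\,p)} \cdot \underbrace{1}_{h(x)},
\qquad t = \sum_{i=1}^{n} x_i,
```

so \(T = \sum\limits_{i = 1}^n {{X_i}} \) is a sufficient statistic for \(p\). The same pattern is applied to the gamma distribution below.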

03

Verifying that the statistic T is a sufficient statistic

Let a continuous random variable \(X\) have the gamma distribution with parameters \(\alpha > 0\) and \(\beta > 0\), where \(\beta \) is known.

The pdf of the gamma distribution is given as:

\(f\left( {x|\alpha } \right) = \frac{{{\beta ^\alpha }}}{{\Gamma \left( \alpha \right)}}{x^{\alpha - 1}}{e^{ - \beta x}},\quad x > 0\)

Using the factorization theorem, the likelihood function is the joint pdf of the sample:

\(\begin{align}L\left( \alpha \right) &= \prod\limits_{i = 1}^n {f\left( {{x_i}|\alpha } \right)} \\ &= \prod\limits_{i = 1}^n {\frac{{{\beta ^\alpha }}}{{\Gamma \left( \alpha \right)}}x_i^{\alpha - 1}{e^{ - \beta {x_i}}}} \\ &= \left\{ {\frac{{{\beta ^{n\alpha }}}}{{\Gamma {{\left( \alpha \right)}^n}}}{{\left( {\prod\limits_{i = 1}^n {{x_i}} } \right)}^{\alpha - 1}}} \right\}\left\{ {{e^{ - \beta \sum\limits_{i = 1}^n {{x_i}} }}} \right\}\end{align}\)

One can observe that the first factor depends on the data only through \(t = \prod\limits_{i = 1}^n {{x_i}} \) and on \(\alpha \); take it as \(g\left( {t,\alpha } \right)\). The second factor, \({e^{ - \beta \sum\limits_{i = 1}^n {{x_i}} }}\), does not involve \(\alpha \), since \(\beta \) is known; take it as \(h\left( x \right)\).

Hence, by the factorization theorem, the statistic \(T = \prod\limits_{i = 1}^n {{X_i}} \) is a sufficient statistic for the parameter \(\alpha \).
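The cancellation that drives this conclusion can be checked numerically. The sketch below (an illustration added here, not part of the textbook solution; the function name `gamma_log_likelihood` is hypothetical) verifies that, with \(\beta \) known, the log-likelihood difference between two values of \(\alpha \) depends on the data only through \(\prod {x_i}\): two samples with the same product give identical differences, because the \(h\left( x \right) = {e^{ - \beta \sum {x_i} }}\) factor cancels.

```python
import math

def gamma_log_likelihood(xs, alpha, beta):
    """Log-likelihood of a Gamma(alpha, beta) sample (rate parametrization)."""
    n = len(xs)
    return (n * alpha * math.log(beta)
            - n * math.lgamma(alpha)          # lgamma = log Gamma(alpha)
            + (alpha - 1) * sum(math.log(x) for x in xs)
            - beta * sum(xs))                 # this term cancels in ratios over alpha

beta = 2.0  # known, per the exercise
# Two different samples of the same size with the same product T = 8
# (their sums differ: 7 vs 6).
xs_a = [1.0, 2.0, 4.0]
xs_b = [2.0, 2.0, 2.0]

for a1, a2 in [(0.5, 1.5), (1.0, 3.0), (2.5, 4.0)]:
    diff_a = gamma_log_likelihood(xs_a, a1, beta) - gamma_log_likelihood(xs_a, a2, beta)
    diff_b = gamma_log_likelihood(xs_b, a1, beta) - gamma_log_likelihood(xs_b, a2, beta)
    # Equal log-likelihood differences in alpha, because prod(x_i) is equal.
    assert abs(diff_a - diff_b) < 1e-9
print("likelihood ratios in alpha depend on the data only through prod(x_i)")
```

Note that the comparison holds across any pair of \(\alpha \) values, which is exactly what sufficiency of \(T = \prod {X_i}\) requires when \(\beta \) is fixed.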


