Q 1E: A gamma distribution for which both parameters α and β are unknown

A gamma distribution for which both parameters α and β are unknown (α > 0 and β > 0); \({T_1} = \prod\limits_{i = 1}^n {{X_i}} \) and \({T_2} = \sum\limits_{i = 1}^n {{X_i}} \).

Short Answer


\({T_1} = \prod\limits_{i = 1}^n {{X_i}} \) and \({T_2} = \sum\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics.

Step by step solution

01

Given information

Given a gamma distribution for which both parameters α and β are unknown (α > 0 and β > 0), we need to check whether \({T_1} = \prod\limits_{i = 1}^n {{X_i}} \) and \({T_2} = \sum\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics.

02

Checking whether \({T_1} = \prod\limits_{i = 1}^n {{X_i}} \) and \({T_2} = \sum\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics

By the Fisher–Neyman factorization theorem:

Let \({X_1}, \ldots ,{X_n}\) be an i.i.d. random sample with pdf \(f\left( {x;\theta } \right)\), and let \(T = r\left( {{X_1}, \ldots ,{X_n}} \right)\) be a statistic. Then T is a sufficient statistic for \(\theta \) if and only if

\(f\left( {x;\theta } \right) = u\left( x \right)\,\nu \left( {r\left( x \right);\theta } \right),\) where \(u\) and \(\nu \) are non-negative functions.

The pdf of the gamma distribution is

\(f\left( {x;\alpha ,\beta } \right) = \frac{{{\beta ^\alpha }}}{{\Gamma \left( \alpha \right)}}{x^{\alpha - 1}}{e^{ - \beta x}},\quad x > 0,\;\alpha > 0,\;\beta > 0.\)

The joint pdf of a random sample \({X_1}, \ldots ,{X_n}\) from the gamma distribution with unknown parameters \(\alpha \) and \(\beta \) is

\(\begin{align}f\left( {x;\alpha ,\beta } \right) &= \prod\limits_{i = 1}^n {\left( {\frac{{{\beta ^\alpha }}}{{\Gamma \left( \alpha \right)}}x_i^{\alpha - 1}{e^{ - \beta {x_i}}}} \right)} \\ &= \frac{{{\beta ^{n\alpha }}}}{{{{\left( {\Gamma \left( \alpha \right)} \right)}^n}}}{\left( {\prod\limits_{i = 1}^n {{x_i}} } \right)^{\alpha - 1}}{e^{ - \beta \sum\limits_{i = 1}^n {{x_i}} }}\\ &= \frac{{{\beta ^{n\alpha }}}}{{{{\left( {\Gamma \left( \alpha \right)} \right)}^n}}}t_1^{\alpha - 1}{e^{ - \beta {t_2}}},\end{align}\)

where \({t_1} = \prod\limits_{i = 1}^n {{x_i}} \) and \({t_2} = \sum\limits_{i = 1}^n {{x_i}} \).

Now by Fisher-Neyman factorization theorem,

\(\nu \left( {{r_1}\left( x \right),{r_2}\left( x \right);\alpha ,\beta } \right) = \frac{{{\beta ^{n\alpha }}}}{{{{\left( {\Gamma \left( \alpha \right)} \right)}^n}}}t_1^{\alpha - 1}{e^{ - \beta {t_2}}}\) and \(u\left( x \right) = 1.\)

It follows that \({T_1} = \prod\limits_{i = 1}^n {{X_i}} \) and \({T_2} = \sum\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics.
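As a numerical sanity check (not part of the textbook solution), the factorization implies that the gamma likelihood depends on the data only through \({t_1}\) and \({t_2}\): two different samples with the same product and sum must give identical likelihoods for every \(\left( {\alpha ,\beta } \right)\). The sketch below constructs two such samples and compares log-likelihoods; the helper name `gamma_loglik` is ours, not from any library.

```python
import math

def gamma_loglik(xs, alpha, beta):
    """Log-likelihood of an i.i.d. gamma sample under the
    beta^alpha / Gamma(alpha) * x^(alpha-1) * exp(-beta*x) parameterization."""
    n = len(xs)
    return (n * (alpha * math.log(beta) - math.lgamma(alpha))
            + (alpha - 1) * sum(math.log(x) for x in xs)
            - beta * sum(xs))

# Two distinct samples sharing t1 = prod(x_i) = 12 and t2 = sum(x_i) = 9:
# [1, 2, 6] versus [4, (5 + sqrt(13))/2, (5 - sqrt(13))/2].
a = [1.0, 2.0, 6.0]
b = [4.0, (5 + math.sqrt(13)) / 2, (5 - math.sqrt(13)) / 2]
assert abs(math.prod(a) - math.prod(b)) < 1e-9
assert abs(sum(a) - sum(b)) < 1e-9

# Same (t1, t2) => same likelihood at every (alpha, beta).
for alpha, beta in [(0.5, 1.0), (2.0, 3.0), (5.0, 0.7)]:
    assert abs(gamma_loglik(a, alpha, beta) - gamma_loglik(b, alpha, beta)) < 1e-9
```

Because the factorization has \(u\left( x \right) = 1\), the entire likelihood is carried by \(\nu \), so any two samples agreeing on \(\left( {{t_1},{t_2}} \right)\) are indistinguishable for inference about \(\left( {\alpha ,\beta } \right)\).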


