Q 15E


Suppose that \(X_1, \ldots, X_n\) form a random sample from the beta distribution with parameters \(\alpha\) and \(\beta\), where the value of \(\alpha\) is known and the value of \(\beta\) is unknown (\(\beta > 0\)). Show that the following statistic \(T\) is a sufficient statistic for \(\beta\):

\(T = \frac{1}{n}\left( \sum\limits_{i = 1}^n \log \frac{1}{1 - X_i} \right)\)

Short Answer


\(T = \frac{1}{n}\left( \sum\limits_{i = 1}^n \log \frac{1}{1 - X_i} \right)\) is a sufficient statistic for \(\beta\).

Step by step solution

01

Given information

\({X_1},...,{X_n}\) form a random sample from the beta distribution with parameters α and β.

We need to prove that the following statistic \(T\) is a sufficient statistic for \(\beta\):

\(T = \frac{1}{n}\left( {\sum\limits_{i = 1}^n {\log \frac{1}{{1 - {X_i}}}} } \right)\)

02

Proof that T is a sufficient statistic for β

\(T = \frac{1}{n}\left( \sum\limits_{i = 1}^n \log \frac{1}{1 - X_i} \right)\)

Fisher-Neyman Factorization Theorem

Let \(X_1, \ldots, X_n\) be i.i.d. random variables with p.d.f. \(f(x;\theta)\), and let \(T = r(X_1, \ldots, X_n)\) be a statistic. \(T\) is a sufficient statistic for \(\theta\) if and only if

\(f(x;\theta) = u(x)\,\nu(r(x);\theta)\), where \(u\) and \(\nu\) are non-negative functions.

If the distribution belongs to an exponential family, the joint p.d.f. can be written as

\(f(x;\theta) = a(\theta)^n e^{c(\theta)t}\prod\limits_{i = 1}^n b(x_i), \qquad t = \sum\limits_{i = 1}^n d(x_i)\)

By the Fisher–Neyman factorization theorem we can take

\(\begin{align}\nu(r(x);\theta) &= a(\theta)^n e^{c(\theta)t},\\ u(x) &= \prod\limits_{i = 1}^n b(x_i).\end{align}\)

It follows that the statistic \(\sum\limits_{i = 1}^n d(X_i)\) is sufficient for \(\theta\).

For the beta distribution with known parameter \(\alpha\) and unknown parameter \(\beta\), the p.d.f. is

\(\begin{align}f(x;\beta) &= \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\,x^{\alpha - 1}(1 - x)^{\beta - 1}\\ &= \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\,x^{\alpha - 1}\exp\left\{(\beta - 1)\ln(1 - x)\right\}\end{align}\)
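Matching this with the exponential-family form (with \(\theta = \beta\)), the components can be read off as

\(a(\beta) = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}, \qquad b(x) = x^{\alpha - 1}, \qquad c(\beta) = \beta - 1, \qquad d(x) = \ln(1 - x)\)

Note that \(b(x) = x^{\alpha - 1}\) does not depend on \(\beta\) precisely because \(\alpha\) is known; only the factor involving \(\ln(1 - x)\) carries information about \(\beta\).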

Hence the statistic

\(T' = \sum\limits_{i = 1}^n \log(1 - X_i)\) is a sufficient statistic for \(\beta\).

Now \(T = \frac{1}{n}\left( \sum\limits_{i = 1}^n \log \frac{1}{1 - X_i} \right) = -\frac{1}{n}T'\) is a one-to-one transformation of \(T'\), and a one-to-one function of a sufficient statistic is itself sufficient. We can therefore conclude that \(T\) is a sufficient statistic for \(\beta\).
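As a quick numerical sanity check, one can simulate a beta sample and compute \(T\). This is a minimal sketch, not part of the textbook solution; the parameter values and sample size are illustrative. It uses the fact that \(1 - X \sim \text{Beta}(\beta, \alpha)\), so \(E\left[\log \frac{1}{1 - X}\right] = \psi(\alpha + \beta) - \psi(\beta)\), where \(\psi\) is the digamma function.

```python
import numpy as np

# Minimal sketch (parameter values are illustrative, not from the text):
# draw a Beta(alpha, beta) sample and compute
#   T = (1/n) * sum_i log(1 / (1 - X_i)) = -(1/n) * sum_i log(1 - X_i).
rng = np.random.default_rng(0)
alpha, beta, n = 2.0, 3.0, 10_000

x = rng.beta(alpha, beta, size=n)
T = np.mean(np.log(1.0 / (1.0 - x)))

# Since 1 - X ~ Beta(beta, alpha), E[log(1/(1-X))] = psi(alpha+beta) - psi(beta).
# For alpha = 2, beta = 3 this is psi(5) - psi(3) = 1/3 + 1/4 = 7/12,
# using psi(k+1) = psi(k) + 1/k, so T should be close to 7/12.
print(f"T = {T:.4f}  (theoretical mean = {7/12:.4f})")
```

Because \(T\) is an average of the strictly positive terms \(\log \frac{1}{1 - X_i}\), the computed value is always positive, and by the law of large numbers it concentrates around \(\psi(\alpha + \beta) - \psi(\beta)\) as \(n\) grows.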

