Q3E A Pareto distribution (see Exercise 16 of Sec. 5.7) [FREE SOLUTION] | 91影视


A Pareto distribution (see Exercise 16 of Sec. 5.7) for which both parameters \({x_0}\) and \(\alpha\) are unknown \(\left( {{x_0} > 0\,\,\,{\rm{and}}\,\,\,\alpha > 0} \right)\); \({T_1} = \min \left\{ {{X_1},...,{X_n}} \right\}\) and \({T_2} = \prod\limits_{i = 1}^n {{X_i}} \).

Short Answer

Expert verified

\({T_1} = \min \left\{ {{X_1},...,{X_n}} \right\}\) and \({T_2} = \prod\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics for \({x_0}\) and \(\alpha\).

Step by step solution

01

Given information

A Pareto distribution for which both parameters \({x_0}\) and \(\alpha \) are unknown. We need to show that \({T_1} = \min \left\{ {{X_1},...,{X_n}} \right\}\) and \({T_2} = \prod\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics for \({x_0}\) and \(\alpha\).

02

Proof that \({T_1} = \min \left\{ {{X_1},...,{X_n}} \right\}\) and \({T_2} = \prod\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics

Fisher-Neyman Factorization Theorem

Let \({X_1},...,{X_n}\) be i.i.d. random variables with pdf \(f\left( {x;\theta } \right)\), and let \(T = r\left( {{X_1},...,{X_n}} \right)\) be a statistic. T is a sufficient statistic for \(\theta \) if and only if the joint pdf factors as

\(f\left( {x;\theta } \right) = u\left( x \right)\nu \left[ {r\left( x \right);\theta } \right]\), where \(u\) and \(\nu \) are non-negative functions, \(u\) does not depend on \(\theta \), and \(\nu \) depends on the data only through \(r\left( x \right)\).

The pdf of the Pareto distribution is

\(\begin{array}{c}f\left( {x;{x_0},\alpha } \right) = \frac{{\alpha x_0^\alpha }}{{{x^{\alpha + 1}}}}\,\,\,\,,x \ge {x_0}\\ = 0\,\,\,\,\,,x < {x_0}\end{array}\)
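As a quick sanity check of this density (the parameter values and sample size below are chosen arbitrarily for the sketch), one can draw from the Pareto distribution by inverse transform, \(X = {x_0}{\left( {1 - U} \right)^{ - 1/\alpha }}\) for \(U \sim {\rm{Uniform}}\left( {0,1} \right)\), and compare the empirical mean to the theoretical mean \(\alpha {x_0}/\left( {\alpha - 1} \right)\), which exists when \(\alpha > 1\):

```python
import random

random.seed(0)
x0, alpha = 2.0, 3.0   # hypothetical parameter values, for illustration only

# Inverse-transform sampling: the Pareto CDF is F(x) = 1 - (x0/x)^alpha for x >= x0,
# so F^{-1}(u) = x0 * (1 - u)^(-1/alpha); 1 - random.random() lies in (0, 1].
sample = [x0 * (1.0 - random.random()) ** (-1.0 / alpha) for _ in range(200_000)]

empirical_mean = sum(sample) / len(sample)
theoretical_mean = alpha * x0 / (alpha - 1)   # = 3.0 here; exists when alpha > 1
print(abs(empirical_mean - theoretical_mean) < 0.05)
```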

The joint pdf of the Pareto distribution, for \({x_i} \ge {x_0}\) for all \(i\), is

\(\begin{array}{c}f\left( {x;{x_0},\alpha } \right) = \prod\limits_{i = 1}^n {\left[ {\frac{{\alpha x_0^\alpha }}{{x_i^{\alpha + 1}}}} \right]} \\ = \frac{{{{\left( {\alpha x_0^\alpha } \right)}^n}}}{{{{\left( {\prod\limits_{i = 1}^n {{x_i}} } \right)}^{\alpha + 1}}}}\end{array}\)

\(f\left( {x;{x_0},\alpha } \right) = t_2^{ - \left( {\alpha + 1} \right)}{\alpha ^n}x_0^{\alpha n}\)
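The algebra collapsing the product into \(t_2^{ - \left( {\alpha + 1} \right)}{\alpha ^n}x_0^{\alpha n}\) can be checked numerically; the sketch below uses invented values for \({x_0}\), \(\alpha\), and the sample, purely for illustration:

```python
import math

# Hypothetical parameters and sample (all x_i >= x0), for illustration only.
x0, alpha = 2.0, 3.0
xs = [2.5, 4.1, 3.3, 2.0, 6.7]
n = len(xs)

# Product form of the joint pdf: prod_i alpha * x0^alpha / x_i^(alpha+1)
product_form = math.prod(alpha * x0**alpha / x**(alpha + 1) for x in xs)

# Collapsed form: alpha^n * x0^(alpha*n) * t2^-(alpha+1), with t2 = prod_i x_i
t2 = math.prod(xs)
collapsed_form = alpha**n * x0**(alpha * n) * t2 ** (-(alpha + 1))

print(math.isclose(product_form, collapsed_form, rel_tol=1e-9))  # the two forms agree
```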

With \({t_1} = \min \left\{ {{x_1},...,{x_n}} \right\}\) and \({t_2} = \prod\limits_{i = 1}^n {{x_i}} \), the Fisher-Neyman factorization theorem gives

\(\begin{array}{c}f\left( {x;{x_0},\alpha } \right) = \frac{{{{\left( {\alpha x_0^\alpha } \right)}^n}}}{{{{\left( {\prod\limits_{i = 1}^n {{x_i}} } \right)}^{\alpha + 1}}}}g\left( {{x_0},\min \left\{ {{x_1},{x_2},...,{x_n}} \right\}} \right)\\ = \frac{{{{\left( {\alpha x_0^\alpha } \right)}^n}}}{{t_2^{\alpha + 1}}}g\left( {{x_0},{t_1}} \right)\end{array}\)

Where g is the indicator function such that

\(\begin{array}{c}g\left( {x,y} \right) = 1\,\,\,\,\,,x \le y\\ = 0\,\,\,\,\,,x > y\end{array}\)

Now,

\(\nu \left[ {{r_1}\left( x \right),{r_2}\left( x \right);{x_0},\alpha } \right] = t_2^{ - \left( {\alpha + 1} \right)}{\alpha ^n}x_0^{\alpha n}g\left( {{x_0},{t_1}} \right)\) and \(u\left( x \right) = 1\), where

\(\begin{array}{c}g\left( {{x_0},{t_1}} \right) = 1\,\,\,\,\,,{x_0} \le {t_1}\\ = 0\,\,\,\,\,,{x_0} > {t_1}\end{array}\)

It follows from the factorization theorem that \({T_1} = \min \left\{ {{X_1},...,{X_n}} \right\}\) and \({T_2} = \prod\limits_{i = 1}^n {{X_i}} \) are jointly sufficient statistics for \({x_0}\) and \(\alpha \).
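The full factorization can be illustrated concretely. The sketch below (with invented parameters and sample values) computes \({t_1}\) and \({t_2}\), then checks that \(u\left( x \right)\nu \left[ {{t_1},{t_2};{x_0},\alpha } \right]\) reproduces the joint pdf, with the indicator \(g\) enforcing \({x_0} \le {t_1}\):

```python
import math

def joint_pdf(xs, x0, alpha):
    """Joint Pareto pdf: product of alpha*x0^alpha / x^(alpha+1); zero if any x < x0."""
    if min(xs) < x0:
        return 0.0
    return math.prod(alpha * x0**alpha / x**(alpha + 1) for x in xs)

def nu(t1, t2, x0, alpha, n):
    """nu factor: depends on the data only through (t1, t2)."""
    g = 1.0 if x0 <= t1 else 0.0   # indicator g(x0, t1)
    return alpha**n * x0**(alpha * n) * t2 ** (-(alpha + 1)) * g

# Hypothetical sample and parameters, for illustration only.
xs = [2.5, 4.1, 3.3, 2.0, 6.7]
x0, alpha = 2.0, 3.0
t1, t2 = min(xs), math.prod(xs)   # the jointly sufficient statistics

u = 1.0                           # u(x) = 1 in this factorization
print(math.isclose(joint_pdf(xs, x0, alpha), u * nu(t1, t2, x0, alpha, len(xs))))
# With x0 larger than t1 the indicator kicks in and both sides vanish:
print(joint_pdf(xs, 2.2, alpha) == nu(t1, t2, 2.2, alpha, len(xs)) == 0.0)
```

Note how \(\nu \) sees the data only through \(\left( {{t_1},{t_2}} \right)\): any two samples with the same minimum and the same product give the same value, which is exactly what joint sufficiency asserts.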


