Q11E

Suppose that a random sample of size n is taken from an exponential distribution for which the value of the parameter θ is unknown, the prior distribution of θ is a specified gamma distribution, and the value of θ must be estimated by using the squared error loss function. Show that the Bayes estimators, for n = 1, 2,..., form a consistent sequence of estimators of θ.

Short Answer

Expert verified

It is proved that the Bayes estimators, for n = 1, 2,..., form a consistent sequence of estimators of θ.

Step by step solution

01

Given information

A random sample \({X_1},{X_2},...,{X_n}\) of size n is taken from an exponential distribution with unknown parameter \(\theta \).

02

Calculating the Bayes estimator

Let the prior distribution of \(\theta \) be the gamma distribution with parameters \(\alpha > 0\) and \(\beta > 0\).

Then the posterior distribution of \(\theta \), \(\xi \left( {\theta |{x_1},{x_2},...{x_n}} \right)\), is also a gamma distribution, with parameters \(\alpha + n\) and \(\beta + \sum\limits_{i = 1}^n {{x_i}} \).

So, the Bayes estimator \(\delta *\left( x \right)\) satisfies

\(E\left[ {L\left( {\theta ,\delta *\left( x \right)} \right)|x} \right] = \mathop {\min }\limits_a E\left[ {L\left( {\theta ,a} \right)|x} \right]\)

Since the loss function is the squared error loss, \(\delta *\left( x \right)\) is the mean of the posterior distribution.

Consider a random sample of size n taken from an exponential distribution with unknown parameter \(\theta \), and suppose the prior distribution of \(\theta \) is gamma with parameters \(\alpha \) and \(\beta \).

Thus, the mean of the prior distribution is:

\({\mu _0} = \frac{\alpha }{\beta }\) ………………………………….. (1)

Because the gamma prior is conjugate to the exponential likelihood, the posterior distribution of \(\theta \), \(\xi \left( {\theta |{x_1},{x_2},...{x_n}} \right)\), is also gamma, with parameters \(\alpha + n\) and \(\beta + \sum\limits_{i = 1}^n {{x_i}} \).
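As a quick numerical sketch of this conjugate update (the data values here are hypothetical, chosen only for illustration): starting from a Gamma\((\alpha ,\beta )\) prior and observing exponential data \({x_1},...,{x_n}\), the posterior parameters are \(\alpha + n\) and \(\beta + \sum {x_i} \).

```python
# Sketch of the conjugate update stated above, assuming a
# Gamma(alpha, beta) prior and exponential observations.
# The sample values below are hypothetical.

def posterior_params(alpha, beta, xs):
    """Posterior parameters: (alpha + n, beta + sum of observations)."""
    return alpha + len(xs), beta + sum(xs)

a, b = posterior_params(2, 1, [0.5, 1.5, 1.0])  # prior Gamma(2, 1), n = 3
print(a, b)   # 5 4.0
print(a / b)  # posterior mean: 1.25
```

The posterior mean `a / b` is exactly the Bayes estimator under squared error loss used in the rest of the derivation.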

So, the mean of the posterior distribution is:

\(\begin{array}{l}{\mu _1} = \frac{{\alpha + n}}{{\beta + \sum\limits_{i = 1}^n {{x_i}} }}\\ = \frac{{\left( {\alpha + n} \right)/n}}{{\left( {\beta + \sum\limits_{i = 1}^n {{x_i}} } \right)/n}}\\ = \frac{{1 + \alpha /n}}{{{{\overline x }_n} + \beta /n}}\end{array}\) …………………………………….. (2)

Let’s apply the limit \(n \to \infty \) to the Bayes estimator of \(\theta \) computed in equation (2).

As \(n \to \infty \), \(\alpha /n \to 0\) and \(\beta /n \to 0\).

So,

\(\mathop {\lim }\limits_{n \to \infty } \left( {{\mu _1} - \frac{1}{{{{\overline x }_n}}}} \right) = 0\) …………………………………….. (3)

By the law of large numbers, \({\overline x _n}\) converges in probability to the mean of the exponential distribution, \(\frac{1}{\theta }\), so \(\frac{1}{{{{\overline x }_n}}}\) converges in probability to \(\theta \).

So, from equation (3) it follows that the Bayes estimators, for \(n = 1,2,...\), form a consistent sequence of estimators of \(\theta \) when the loss function is the squared error loss function.
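The convergence argument above can be illustrated with a small simulation (not part of the textbook solution; the true \(\theta = 2\) and the Gamma(3, 1) prior are hypothetical choices): as \(n\) grows, the Bayes estimator \(\frac{{\alpha + n}}{{\beta + \sum {x_i} }}\) should settle near the true \(\theta \).

```python
import random

# Simulation sketch of consistency, assuming a hypothetical true
# theta = 2 and a Gamma(3, 1) prior (prior mean 3).  The Bayes
# estimator (alpha + n) / (beta + sum(x_i)) is the posterior mean.

def bayes_estimate(xs, alpha, beta):
    """Posterior mean: (alpha + n) / (beta + sum of observations)."""
    return (alpha + len(xs)) / (beta + sum(xs))

random.seed(0)
theta = 2.0             # true rate parameter (assumed)
alpha, beta = 3.0, 1.0  # hypothetical prior parameters

# random.expovariate takes the rate, so each draw has mean 1/theta.
data = [random.expovariate(theta) for _ in range(100_000)]

for n in (10, 1_000, 100_000):
    print(n, round(bayes_estimate(data[:n], alpha, beta), 3))
```

For small \(n\) the prior mean \(\alpha /\beta = 3\) pulls the estimate away from \(\theta \); for large \(n\) the printed estimates cluster near 2, matching the limit in equation (3).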

