Q 23E

Suppose that \({X_1}, \ldots ,{X_n}\) form a random sample from the beta distribution with parameters \(\alpha \) and \(\beta \). Let \(\theta = \left( {\alpha ,\beta } \right)\) be the vector parameter.

a. Find the method of moments estimator for \(\theta \).

b. Show that the method of moments estimator is not the M.L.E.

Short Answer


a. The method of moments estimators are \(\hat \alpha = \frac{{{m_1}\left( {{m_1} - {m_2}} \right)}}{{{m_2} - m_1^2}}\) and \(\hat \beta = \frac{{\left( {1 - {m_1}} \right)\left( {{m_1} - {m_2}} \right)}}{{{m_2} - m_1^2}}\).

b. The M.L.E. must satisfy the likelihood equations \(\psi \left( \alpha \right) - \psi \left( {\alpha + \beta } \right) = \frac{1}{n}\sum {\log {X_i}} \) and \(\psi \left( \beta \right) - \psi \left( {\alpha + \beta } \right) = \frac{1}{n}\sum {\log \left( {1 - {X_i}} \right)} \), where \(\psi \) is the digamma function. These depend on the data through \(\sum {\log {X_i}} \) and \(\sum {\log \left( {1 - {X_i}} \right)} \), whereas the method of moments estimator depends only on \({m_1}\) and \({m_2}\), so the two estimators are not the same.

Step by step solution

01

Define the pdf

Let \({X_1}, \ldots ,{X_n}\) be a random sample from the beta distribution with parameters \(\alpha \) and \(\beta \). Its p.d.f. is

\(f\left( {x\mid\alpha ,\beta } \right) = \frac{{{x^{\alpha - 1}}{{\left( {1 - x} \right)}^{\beta - 1}}}}{{B\left( {\alpha ,\beta } \right)}},\quad 0 < x < 1,\)

where \(B\left( {\alpha ,\beta } \right) = \frac{{\Gamma \left( \alpha \right)\Gamma \left( \beta \right)}}{{\Gamma \left( {\alpha + \beta } \right)}}\) is the beta function.

02

Finding the method of moments estimator

a. In the method of moments, the theoretical moments

\({\mu _j}\left( \theta \right) = E\left( {X_i^j} \right) \ldots \left( 1 \right)\)

are equated with the sample moments

\({m_j} = \frac{1}{n}\sum\limits_{i = 1}^n {X_i^j} \ldots \left( 2 \right)\)

that is, we set \({\mu _j} = {m_j}\).

By properties of the beta distribution,

\(\begin{align}{\mu _1} = E\left( X \right) &= \frac{\alpha }{{\alpha + \beta }}\\{\mu _2} = E\left( {{X^2}} \right) &= \frac{{\alpha \left( {\alpha + 1} \right)}}{{\left( {\alpha + \beta } \right)\left( {\alpha + \beta + 1} \right)}}\end{align}\)
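As a quick numerical check (an illustration, not part of the textbook solution; the parameter values are arbitrary), these moment formulas agree with the moments SciPy computes for the beta distribution:

```python
from scipy import stats

a, b = 2.5, 4.0  # arbitrary illustrative parameter values
dist = stats.beta(a, b)

mu1 = a / (a + b)                            # E(X)
mu2 = a * (a + 1) / ((a + b) * (a + b + 1))  # E(X^2)

# Each pair of printed values agrees.
print(mu1, dist.moment(1))
print(mu2, dist.moment(2))
```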

Equating these first- and second-order theoretical moments with the corresponding sample moments gives

\(\begin{align}{m_1} &= \frac{\alpha }{{\alpha + \beta }} \ldots \left( 3 \right)\\{m_2} &= \frac{{\alpha \left( {\alpha + 1} \right)}}{{\left( {\alpha + \beta } \right)\left( {\alpha + \beta + 1} \right)}} \ldots \left( 4 \right)\end{align}\)

By solving (3),

\(\begin{align}{m_1} &= \frac{\alpha }{{\alpha + \beta }}\\\alpha &= {m_1}\left( {\alpha + \beta } \right)\\\beta &= \frac{{\alpha \left( {1 - {m_1}} \right)}}{{{m_1}}} \ldots \left( 5 \right)\end{align}\)

Substituting in (4),

\(\begin{align}{m_2} &= \frac{{\alpha \left( {\alpha + 1} \right)}}{{\left( {\alpha + \frac{{\alpha \left( {1 - {m_1}} \right)}}{{{m_1}}}} \right)\left( {\alpha + \frac{{\alpha \left( {1 - {m_1}} \right)}}{{{m_1}}} + 1} \right)}}\\ &= \frac{{\alpha \left( {\alpha + 1} \right)}}{{\left( {\frac{{\alpha {m_1} + \alpha \left( {1 - {m_1}} \right)}}{{{m_1}}}} \right)\left( {\frac{{\alpha {m_1} + \alpha \left( {1 - {m_1}} \right) + {m_1}}}{{{m_1}}}} \right)}}\\ &= \frac{{\alpha \left( {\alpha + 1} \right)}}{{\left( {\frac{\alpha }{{{m_1}}}} \right)\left( {\frac{{\alpha + {m_1}}}{{{m_1}}}} \right)}}\\ &= \frac{{{m_1}^2\left( {\alpha + 1} \right)}}{{\left( {\alpha + {m_1}} \right)}}\end{align}\)

\(\begin{align}{m_2}\left( {\alpha + {m_1}} \right) &= {m_1}^2\left( {\alpha + 1} \right)\\\alpha &= \frac{{{m_1}\left( {{m_1} - {m_2}} \right)}}{{{m_2} - {m_1}^2}}\end{align}\)

Substituting this value of \(\alpha \) into (5):

\(\begin{align}\beta &= \frac{{\left( {\frac{{{m_1}\left( {{m_1} - {m_2}} \right)}}{{{m_2} - {m_1}^2}}} \right)\left( {1 - {m_1}} \right)}}{{{m_1}}}\\ &= \frac{{\left( {1 - {m_1}} \right)\left( {{m_1} - {m_2}} \right)}}{{{m_2} - {m_1}^2}}\end{align}\)
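A minimal sketch (assuming the sample is stored in a NumPy array; the function name mom_beta is an illustrative choice, not from the text) that computes these method of moments estimates:

```python
import numpy as np

def mom_beta(x):
    """Method of moments estimates (alpha, beta) for a beta sample.

    Implements alpha = m1(m1 - m2)/(m2 - m1^2) and
    beta = (1 - m1)(m1 - m2)/(m2 - m1^2) from the derivation above.
    """
    m1 = np.mean(x)       # first sample moment
    m2 = np.mean(x**2)    # second sample moment
    denom = m2 - m1**2    # positive for any non-constant sample
    alpha_hat = m1 * (m1 - m2) / denom
    beta_hat = (1 - m1) * (m1 - m2) / denom
    return alpha_hat, beta_hat

# Illustrative usage on simulated data.
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=1000)
print(mom_beta(x))  # close to the true values (2.0, 5.0)
```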

03

Finding the M.L.E.

b.

Defining the likelihood function:

\(\begin{align}L\left( \theta \right) &= \prod\limits_{i = 1}^n {f\left( {{x_i}\mid\alpha ,\beta } \right)} \\ &= \prod\limits_{i = 1}^n {\frac{{x_i^{\alpha - 1}{{\left( {1 - {x_i}} \right)}^{\beta - 1}}}}{{B\left( {\alpha ,\beta } \right)}}} \\ &= \frac{{\prod\limits_{i = 1}^n {x_i^{\alpha - 1}} \prod\limits_{i = 1}^n {{{\left( {1 - {x_i}} \right)}^{\beta - 1}}} }}{{B{{\left( {\alpha ,\beta } \right)}^n}}}\end{align}\)

The log-likelihood is:

\(\ln L\left( \theta \right) = \left( {\alpha - 1} \right)\sum {\log {x_i}} + \left( {\beta - 1} \right)\sum {\log \left( {1 - {x_i}} \right)} - n\log B\left( {\alpha ,\beta } \right)\)
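For reference (an illustration only, not part of the original solution), this log-likelihood can be written directly with scipy.special.betaln, which evaluates \(\log B\left( {\alpha ,\beta } \right)\) stably:

```python
import numpy as np
from scipy.special import betaln  # log of the beta function B(a, b)

def beta_loglik(a, b, x):
    """Log-likelihood of a beta(a, b) sample, matching the formula above."""
    return ((a - 1) * np.sum(np.log(x))
            + (b - 1) * np.sum(np.log(1 - x))
            - len(x) * betaln(a, b))
```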

Differentiating with respect to each parameter, write \(\psi \) for the digamma function, \(\psi \left( z \right) = \frac{d}{{dz}}\log \Gamma \left( z \right)\). Since \(B\left( {\alpha ,\beta } \right) = \frac{{\Gamma \left( \alpha \right)\Gamma \left( \beta \right)}}{{\Gamma \left( {\alpha + \beta } \right)}}\),

\(\begin{align}\frac{\partial }{{\partial \alpha }}\log B\left( {\alpha ,\beta } \right) &= \psi \left( \alpha \right) - \psi \left( {\alpha + \beta } \right)\\\frac{\partial }{{\partial \beta }}\log B\left( {\alpha ,\beta } \right) &= \psi \left( \beta \right) - \psi \left( {\alpha + \beta } \right)\end{align}\)

Hence

\(\begin{align}\frac{\partial }{{\partial \alpha }}\ln L\left( \theta \right) &= \sum {\log {x_i}} - n\left[ {\psi \left( \alpha \right) - \psi \left( {\alpha + \beta } \right)} \right]\\\frac{\partial }{{\partial \beta }}\ln L\left( \theta \right) &= \sum {\log \left( {1 - {x_i}} \right)} - n\left[ {\psi \left( \beta \right) - \psi \left( {\alpha + \beta } \right)} \right]\end{align}\)

Setting both derivatives equal to 0 gives the likelihood equations

\(\begin{align}\psi \left( \alpha \right) - \psi \left( {\alpha + \beta } \right) &= \frac{1}{n}\sum {\log {x_i}} \\\psi \left( \beta \right) - \psi \left( {\alpha + \beta } \right) &= \frac{1}{n}\sum {\log \left( {1 - {x_i}} \right)} \end{align}\)

These equations have no closed-form solution; they determine the M.L.E. \(\left( {\hat \alpha ,\hat \beta } \right)\) only implicitly and must be solved numerically.
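A minimal numerical sketch for solving these equations (the function name mle_beta, the log-parameterization, and the starting point are illustrative choices, not from the text), using scipy.special.digamma and scipy.optimize.fsolve:

```python
import numpy as np
from scipy.special import digamma
from scipy.optimize import fsolve

def mle_beta(x):
    """Solve the beta likelihood equations numerically."""
    s1 = np.mean(np.log(x))      # (1/n) * sum of log x_i
    s2 = np.mean(np.log(1 - x))  # (1/n) * sum of log(1 - x_i)

    def equations(log_params):
        a, b = np.exp(log_params)  # keeps alpha, beta > 0
        return [digamma(a) - digamma(a + b) - s1,
                digamma(b) - digamma(a + b) - s2]

    sol = fsolve(equations, x0=np.log([1.0, 1.0]))
    return tuple(np.exp(sol))
```

In practice the method of moments estimate is a natural starting point for the solver.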

The key observation is that the M.L.E. depends on the data only through the statistics \(\sum {\log {X_i}} \) and \(\sum {\log \left( {1 - {X_i}} \right)} \) (equivalently, through the geometric means of the \({X_i}\) and of the \(1 - {X_i}\)), whereas the method of moments estimator depends on the data only through \({m_1}\) and \({m_2}\). Two samples can have the same values of \({m_1}\) and \({m_2}\) but different values of \(\sum {\log {X_i}} \), so the method of moments estimator would be the same for both samples while the M.L.E. would differ. Hence the method of moments estimator cannot satisfy the likelihood equations for every sample, and it is not the M.L.E.

Therefore, the method of moments estimator is \(\hat \alpha = \frac{{{m_1}\left( {{m_1} - {m_2}} \right)}}{{{m_2} - m_1^2}}\), \(\hat \beta = \frac{{\left( {1 - {m_1}} \right)\left( {{m_1} - {m_2}} \right)}}{{{m_2} - m_1^2}}\), while the M.L.E. solves \(\psi \left( {\hat \alpha } \right) - \psi \left( {\hat \alpha + \hat \beta } \right) = \frac{1}{n}\sum {\log {X_i}} \) and \(\psi \left( {\hat \beta } \right) - \psi \left( {\hat \alpha + \hat \beta } \right) = \frac{1}{n}\sum {\log \left( {1 - {X_i}} \right)} \).

They are not equal.
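For concreteness (an illustration beyond what the exercise requires, reusing the mom_beta and mle_beta sketches above), the two estimators can be compared on simulated data:

```python
import numpy as np

# Simulate a beta(2, 5) sample and compare the two estimators.
rng = np.random.default_rng(42)
x = rng.beta(2.0, 5.0, size=200)

print("MOM:", mom_beta(x))  # the two estimates are close...
print("MLE:", mle_beta(x))  # ...but not identical
```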

