Q 3E Assume that the random variables...


Assume that the random variables \({X_1},...{X_n}\) form a random sample of size n from the distribution specified in that exercise, and show that the statistic T specified in the exercise is a sufficient statistic for the parameter.

1. The negative binomial distribution with parameters r and p, where r is known and p is unknown \(\left( {0 < p < 1} \right)\); \(T = \sum\limits_{i = 1}^n {{X_i}} \).

Short Answer

Expert verified

The statistic \(T = \sum\limits_{i = 1}^n {{X_i}} \) is a sufficient statistic for p.

Step by step solution

01

Computing the p.m.f. of the negative binomial distribution

The probability mass function is given as

\({\rm P}\left( {X = x} \right) = \binom{x - 1}{r - 1}{p^r}{\left( {1 - p} \right)^{x - r}}\)

for \(x = r,r + 1,r + 2, \ldots \)

Let \({X_1}, \ldots ,{X_n}\) be a random sample from this distribution with parameters r and p.
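As a quick sanity check, this parametrization of the negative binomial p.m.f. (the probability that the r-th success occurs on trial x) can be sketched in Python. The function name and the parameter values are illustrative assumptions, not from the text; note that this counts total trials, unlike some library parametrizations that count failures:

```python
from math import comb

def neg_binom_pmf(x, r, p):
    """P(X = x): probability that the r-th success occurs on trial x."""
    if x < r:
        return 0.0  # at least r trials are needed for r successes
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

# The p.m.f. should sum to (approximately) 1 over its support.
r, p = 3, 0.4
total = sum(neg_binom_pmf(x, r, p) for x in range(r, 500))
assert abs(total - 1.0) < 1e-9
```

With r = 3 and p = 0.5, for example, \({\rm P}(X = 3) = p^3 = 0.125\), which the function reproduces.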

02

Verifying that the statistic T is sufficient

The likelihood function for \({X_1}...{X_n}\) is,

\(\begin{array}{l}L\left( {{x_1}, \ldots ,{x_n}|p} \right) = f\left( {{x_1}|p} \right) \times f\left( {{x_2}|p} \right) \times \cdots \times f\left( {{x_n}|p} \right)\\ = \left[ {\binom{{x_1} - 1}{r - 1}{p^r}{{\left( {1 - p} \right)}^{{x_1} - r}}} \right] \times \cdots \times \left[ {\binom{{x_n} - 1}{r - 1}{p^r}{{\left( {1 - p} \right)}^{{x_n} - r}}} \right]\\ = {p^{nr}}{\left( {1 - p} \right)^{\sum\limits_{i = 1}^n {{x_i}} - nr}}\prod\limits_{i = 1}^n {\binom{{x_i} - 1}{r - 1}} \end{array}\)

The likelihood is therefore of the form

\(g\left( {T,p} \right) \times h\left( {{x_1},...{x_n}} \right)\)

Here

\(g\left( {T,p} \right) = {p^{nr}}{\left( {1 - p} \right)^{T - nr}}\), where \(T = \sum\limits_{i = 1}^n {{x_i}} \), and

\(h\left( {{x_1}, \ldots ,{x_n}} \right) = \prod\limits_{i = 1}^n {\binom{{x_i} - 1}{r - 1}} \)
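The product \(g(T,p) \times h(x_1, \ldots ,x_n)\) should reproduce the full likelihood exactly. A minimal numeric check of the factorization, with the sample values and the values of r and p chosen arbitrarily for illustration:

```python
from math import comb, prod

def neg_binom_pmf(x, r, p):
    # P(X = x) for the negative binomial counting total trials
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

r, p = 4, 0.3
xs = [6, 9, 5, 11]          # a hypothetical observed sample
n, T = len(xs), sum(xs)

likelihood = prod(neg_binom_pmf(x, r, p) for x in xs)
g = p**(n * r) * (1 - p)**(T - n * r)   # depends on the data only through T
h = prod(comb(x - 1, r - 1) for x in xs)  # does not depend on p

assert abs(likelihood - g * h) < 1e-9 * likelihood
```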

Hence, by the factorization criterion, \(T = \sum\limits_{i = 1}^n {{X_i}} \) is a sufficient statistic for p.
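Sufficiency also has a concrete numerical consequence: two samples with the same value of T yield likelihood functions whose ratio does not depend on p. A small illustration of this, with hypothetical samples chosen so that both sums equal 21:

```python
from math import comb, prod

def neg_binom_pmf(x, r, p):
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

def likelihood(xs, r, p):
    return prod(neg_binom_pmf(x, r, p) for x in xs)

r = 3
xs, ys = [4, 7, 10], [5, 5, 11]   # two hypothetical samples, both with T = 21
assert sum(xs) == sum(ys)

ratios = [likelihood(xs, r, p) / likelihood(ys, r, p) for p in (0.2, 0.5, 0.8)]
# Because T is sufficient, the ratio is the same for every p:
# it equals h(xs) / h(ys), which involves no p at all.
assert max(ratios) - min(ratios) < 1e-9
```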

