Q 2E


Assume that the random variables \({X_1}, \ldots ,{X_n}\) form a random sample of size n from the distribution specified in the exercise, and show that the statistic T specified in the exercise is a sufficient statistic for the parameter.

1. The geometric distribution with parameter \(p\), which is unknown \(\left( {0 < p < 1} \right)\); \(T = \sum\limits_{i = 1}^n {{X_i}} \)

Short Answer

Expert verified

The statistic \(T = \sum\limits_{i = 1}^n {{X_i}} \) is a sufficient statistic for \(p\).

Step by step solution

01

Explaining the geometric distribution:

Consider an experiment whose trials each have only two possible outcomes, success and failure.

Let X be the number of failures observed before the first success is obtained.

Let p be the probability of success in a single trial.

Then X is said to follow the geometric distribution, and its probability mass function is \(P\left( {X = x} \right) = p{\left( {1 - p} \right)^x}\;;\;x = 0,1,2, \ldots \)
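As a quick sanity check on this pmf (a minimal sketch; here X counts failures before the first success, so the support starts at 0, and the parameter value p = 0.3 is an arbitrary choice for illustration):

```python
# Geometric pmf in the "number of failures" convention:
# f(x | p) = p * (1 - p)**x for x = 0, 1, 2, ...

def geom_pmf(x, p):
    """P(X = x) when X counts failures before the first success."""
    return p * (1 - p) ** x

p = 0.3
support = range(2000)  # truncation; the tail beyond 2000 is negligible

# The pmf should sum to (approximately) 1 ...
total = sum(geom_pmf(x, p) for x in support)
# ... and the mean should be (1 - p) / p.
mean = sum(x * geom_pmf(x, p) for x in support)

print(round(total, 6))  # ≈ 1.0
print(round(mean, 4))   # ≈ (1 - 0.3) / 0.3 ≈ 2.3333
```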

02

Defining a sufficient statistic:

Let \({X_1},...{X_n}\) be the random sample from the specified distribution with parameter \(\theta \).

Let T be the statistic.

If, for every value \(t\) of \(T\), the conditional joint distribution of \({X_1}, \ldots ,{X_n}\) given \(T = t\) depends only on \(t\) and not on the parameter \(\theta \), and this holds for all \(\theta \), then T is a sufficient statistic for the parameter \(\theta \).
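This definition can be illustrated exactly for the geometric case (a hedged numeric sketch, not part of the exercise): with n = 2 geometric (number-of-failures) observations, \(T = X_1 + X_2\) has a negative binomial distribution, and the conditional pmf of \((X_1, X_2)\) given \(T = t\) comes out the same for every p.

```python
# For two geometric observations, check that
# P(X1 = x1, X2 = x2 | T = t) does not depend on p.
from math import comb

def joint_pmf(xs, p):
    n = len(xs)
    return p ** n * (1 - p) ** sum(xs)

def cond_pmf(xs, t, p):
    """P(X1 = x1, X2 = x2 | T = t) for two geometric observations."""
    if sum(xs) != t:
        return 0.0
    n = len(xs)
    # P(T = t) is negative binomial: C(t + n - 1, t) p^n (1 - p)^t
    p_t = comb(t + n - 1, t) * p ** n * (1 - p) ** t
    return joint_pmf(xs, p) / p_t

t = 4
for p in (0.2, 0.5, 0.8):
    print([round(cond_pmf((x1, t - x1), t, p), 4) for x1 in range(t + 1)])
# Every row is identical: each outcome has probability 1 / C(5, 4) = 0.2,
# whatever value p takes -- exactly the defining property of sufficiency.
```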

03

Defining the factorization criterion:

Let \({X_1}, \ldots ,{X_n}\) be a random sample from the specified distribution with unknown parameter \(\theta \in \Omega \) and pdf \(f\left( {x|\theta } \right)\).

\(T = r\left( {{X_1}, \ldots ,{X_n}} \right)\) is a sufficient statistic if and only if the joint pdf can be factored as \({f_n}\left( {x|\theta } \right) = u\left( x \right)v\left( {r\left( x \right),\theta } \right)\) for all values of x and all \(\theta \in \Omega \).

04

Verifying that the statistic T is a sufficient statistic:

Let \({X_1}, \ldots ,{X_n}\) be a random sample from the geometric distribution with parameter p.

The pmf is \(f\left( {x|p} \right) = \left\{ \begin{align}p{\left( {1 - p} \right)^x}\;\;\;for\;x = 0,1,2, \ldots \\0\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;\;otherwise\end{align} \right.\)

Let \(t\) be the value of the statistic T when the observed values of \({X_1}, \ldots ,{X_n}\) are \({x_1},{x_2}, \ldots ,{x_n}\).

The joint pdf is given as:

\(\begin{align}{f_n}\left( {x|p} \right) &= p{\left( {1 - p} \right)^{{x_1}}} \times p{\left( {1 - p} \right)^{{x_2}}} \times \cdots \times p{\left( {1 - p} \right)^{{x_n}}}\\ &= {p^n}{\left( {1 - p} \right)^{\sum\limits_{i = 1}^n {{x_i}} }}\\ &= {p^n}{\left( {1 - p} \right)^{r\left( x \right)}}\end{align}\)

Applying the factorization criterion to this joint pmf, it factors with

\(u\left( x \right) = 1\) and

\(v\left( {r\left( x \right),p} \right) = {p^n}{\left( {1 - p} \right)^{\sum\limits_{i = 1}^n {{x_i}} }}\),

where \(r\left( x \right) = \sum\limits_{i = 1}^n {{x_i}} \).
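The factorization can also be confirmed numerically (a minimal sketch; the sample values and p = 0.4 are arbitrary choices for illustration):

```python
# Check that u(x) * v(r(x), p) reproduces the joint geometric pmf
# for many randomly chosen samples.
import math
import random

def joint_pmf(xs, p):
    """Joint pmf: product of p * (1 - p)**x_i over the sample."""
    out = 1.0
    for x in xs:
        out *= p * (1 - p) ** x
    return out

def u(xs):
    return 1.0

def v(r, p, n):
    """v(r(x), p) = p^n * (1 - p)^r, with r = sum of the x_i."""
    return p ** n * (1 - p) ** r

random.seed(0)
p = 0.4
for _ in range(100):
    xs = [random.randrange(10) for _ in range(5)]  # arbitrary sample values
    r = sum(xs)                                    # r(x) = sum of the x_i
    assert math.isclose(joint_pmf(xs, p), u(xs) * v(r, p, len(xs)),
                        rel_tol=1e-9)
print("factorization holds on all trials")
```

The factor \(u(x) = 1\) carries no information about p, so the likelihood depends on the data only through \(r(x) = \sum x_i\).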

Therefore, the statistic \(T = \sum\limits_{i = 1}^n {{X_i}} \) is a sufficient statistic for \(p\).


