Q 17E


Let \(\theta\) be a parameter, and let X be discrete with p.f. \(f\left( {x|\theta } \right)\) conditional on \(\theta\). Let T = r(X) be a statistic. Prove that T is sufficient if and only if, for every t and every x such that t = r(x), the likelihood function from observing T = t is proportional to the likelihood function from observing X = x.

Short Answer

Expert verified

T is sufficient if and only if, for every t and every x such that t = r(x), the likelihood function from observing T = t is proportional to the likelihood function from observing X = x.

Step by step solution

01

Given information

Let \(\theta\) be a parameter, and let X be discrete with p.f. \(f\left( {x|\theta } \right)\) conditional on \(\theta\). Let T = r(X) be a statistic. We need to prove that T is sufficient if and only if, for every t and every x such that t = r(x), the likelihood function from observing T = t is proportional to the likelihood function from observing X = x.

02

Proof that T is sufficient if and only if, for every t and every x such that t = r(x), the likelihood function from observing T = t is proportional to the likelihood function from observing X = x

Fisher-Neyman Factorization Theorem

Let \({X_1},...,{X_n}\) be i.i.d. random variables with p.f. \(f\left( {x;\theta } \right)\), and let \(T = r\left( {{X_1},...,{X_n}} \right)\) be a statistic. Then T is a sufficient statistic for \(\theta \) if and only if

\(f\left( {x;\theta } \right) = u\left( x \right)\nu \left( {r\left( x \right);\theta } \right),\)

where \(u\,\,\,{\rm{and}}\,\,\,\nu \) are non-negative functions and \(u\) does not depend on \(\theta\).
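As a concrete illustration (not part of the original exercise), consider n i.i.d. Bernoulli(\(\theta\)) observations with \(T = \sum\nolimits_i {{X_i}}\); the joint p.f. factors immediately in the required form:

```latex
f(x;\theta) = \prod_{i=1}^{n} \theta^{x_i}(1-\theta)^{1-x_i}
            = \theta^{t}(1-\theta)^{n-t},
\qquad t = r(x) = \sum_{i=1}^{n} x_i,
```

so we may take \(u\left( x \right) = 1\) and \(\nu \left( {t;\theta } \right) = {\theta ^t}{\left( {1 - \theta } \right)^{n - t}}\), and T is sufficient by the theorem.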

First, suppose T is sufficient, so the factorization above holds. The p.f. of T, which is the likelihood function from observing T = t, is

\(\begin{align}g\left( {t;\theta } \right) &= \sum\limits_{x:\,r\left( x \right) = t} {u\left( x \right)\nu \left( {r\left( x \right);\theta } \right)} \\ &= \sum\limits_{x:\,r\left( x \right) = t} {u\left( x \right)\nu \left( {t;\theta } \right)} \\ &= \nu \left( {t;\theta } \right)\sum\limits_{x:\,r\left( x \right) = t} {u\left( x \right)} \end{align}\)

Since \(\sum\nolimits_{x:\,r\left( x \right) = t} {u\left( x \right)} \) does not depend on \(\theta\), the likelihood from observing T = t is proportional to \(\nu \left( {t;\theta } \right) = \nu \left( {r\left( x \right);\theta } \right)\). Likewise, \(f\left( {x;\theta } \right) = u\left( x \right)\nu \left( {r\left( x \right);\theta } \right)\) shows that the likelihood from observing X = x is proportional to the same function of \(\theta\). Hence the two likelihood functions are proportional.

Conversely, suppose that for every t and every x with t = r(x), the likelihood from observing T = t is proportional to the likelihood from observing X = x. Then there exists a function h, not depending on \(\theta\), such that

\(f\left( {x;\theta } \right) = h\left( x \right)g\left( {t;\theta } \right)\)

This is exactly the Fisher-Neyman factorization with

\(\begin{align}\nu \left( {r\left( x \right);\theta } \right) &= g\left( {t;\theta } \right)\\u\left( x \right) &= h\left( x \right)\end{align}\)

Hence, by the factorization theorem, T is sufficient. This completes the proof that T is sufficient if and only if, for every t and every x such that t = r(x), the likelihood function from observing T = t is proportional to the likelihood function from observing X = x.
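The proportionality claim can be checked numerically. The sketch below (my own illustration, not part of the textbook solution) uses an i.i.d. Bernoulli(\(\theta\)) sample with T = r(x) = the sample sum; the likelihood from X = x and the likelihood from T = t (a binomial p.f.) should differ only by a constant factor, whatever \(\theta\) is.

```python
from math import comb

# A fixed observed Bernoulli sample (illustrative values).
x = [1, 0, 1, 1, 0]
n, t = len(x), sum(x)  # here t = r(x) = 3


def lik_x(theta):
    """Likelihood from observing X = x: product of Bernoulli p.f.s."""
    p = 1.0
    for xi in x:
        p *= theta if xi == 1 else (1 - theta)
    return p


def lik_t(theta):
    """Likelihood from observing T = t: Binomial(n, theta) p.f. at t."""
    return comb(n, t) * theta**t * (1 - theta) ** (n - t)


# The ratio should be the same for every theta, i.e. free of theta.
ratios = [lik_t(th) / lik_x(th) for th in (0.2, 0.5, 0.8)]
print(ratios)  # each ratio equals comb(5, 3) = 10
```

Because the ratio \(g\left( {t;\theta } \right)/f\left( {x;\theta } \right)\) is constant in \(\theta\), the two likelihoods are proportional, exactly as the proof asserts.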


