Problem 5

Show that the first order statistic \(Y_{1}\) of a random sample of size \(n\) from the distribution having pdf \(f(x ; \theta)=e^{-(x-\theta)}\), \(\theta<x<\infty\), \(-\infty<\theta<\infty\), zero elsewhere, is a complete sufficient statistic for \(\theta\). Find the unique function of this statistic which is the MVUE of \(\theta\).

Short Answer

The first order statistic \(Y_{1}\) of the given random sample is a complete sufficient statistic for \(\theta\). The unique function of this statistic which is the MVUE of \(\theta\) is \(Y_{1} - \frac{1}{n}\).

Step by step solution

01

Compute the pdf of the first order statistic

The pdf of the first order statistic \(Y_1\) from a sample of size \(n\) is given by the formula \(f_{Y_1}(y) = n \cdot [1 - F(y)]^{n-1} \cdot f(y)\), \(y > \theta\), where \(F(y)\) is the cumulative distribution function of the original distribution. This cdf can be computed as \(F(y) = \int_{\theta}^{y} e^{-(x-\theta)}\, dx = 1 - e^{-(y-\theta)}\). Substituting \(F(y)\) and \(f(y)\) into the formula gives \(f_{Y_1}(y) = n \cdot e^{-(n-1)(y-\theta)} \cdot e^{-(y-\theta)} = n e^{-n(y-\theta)}\), \(y > \theta\).
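This can be sanity-checked by simulation: since each \(X_i - \theta\) is standard exponential, \(Y_1 - \theta\) is the minimum of \(n\) independent Exp(1) variables and hence Exp(\(n\)), so \(P(Y_1 > y) = e^{-n(y-\theta)}\) for \(y > \theta\). A minimal NumPy sketch (the values of \(\theta\), \(n\), and the test point are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000

# Draw `reps` samples of size n from the shifted exponential
# f(x; theta) = exp(-(x - theta)), x > theta, i.e. X = theta + Exp(1).
samples = theta + rng.exponential(1.0, size=(reps, n))
y1 = samples.min(axis=1)  # first order statistic of each sample

# The derived pdf implies the survival function
# P(Y1 > y) = exp(-n * (y - theta)) for y > theta.
y = theta + 0.3
empirical = (y1 > y).mean()
exact = np.exp(-n * (y - theta))
print(empirical, exact)  # the two should agree to about two decimal places
```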
02

Show \(Y_1\) is a sufficient statistic

The Neyman factorization theorem can now be used to show that \(Y_1\) is a sufficient statistic for \(\theta\). According to the theorem, a statistic \(T(X)\) is sufficient for \(\theta\) if the joint pdf \(f(x_1, x_2, \ldots, x_n;\theta)\) can be written as a product \(g(T(x),\theta) \cdot h(x)\), where the first factor depends on the data only through \(T(x)\) and the second does not depend on \(\theta\). Here the joint pdf is the product of the individual pdfs: \(f(x_1, \ldots, x_n;\theta) = \prod_{i=1}^n e^{-(x_i-\theta)} = e^{n\theta} e^{-\sum_{i=1}^n x_i}\), valid whenever every \(x_i > \theta\), that is, whenever \(y_1 > \theta\). Taking \(g(y_1,\theta) = e^{n\theta} I(y_1 > \theta)\) and \(h(x) = e^{-\sum_{i=1}^n x_i}\) gives the required form, and hence \(Y_1\) is a sufficient statistic.
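The factorization can be checked numerically on a single simulated sample; a minimal sketch (the sample size and \(\theta\) are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n = 1.5, 4
x = theta + rng.exponential(1.0, size=n)  # one sample; every x_i > theta
y1 = x.min()

# Joint pdf as the product of the marginal pdfs exp(-(x_i - theta)).
joint = np.prod(np.exp(-(x - theta)))

# Factored form: g depends on the data only through y1; h does not involve theta.
g = np.exp(n * theta) * (y1 > theta)  # g(y1; theta) = e^{n theta} I(y1 > theta)
h = np.exp(-x.sum())                  # h(x) = e^{-sum x_i}
print(np.isclose(joint, g * h))       # True
```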
03

Show \(Y_1\) is also a complete statistic

Completeness means that if \(E[g(Y_1)]=0\) for all \(\theta\), then \(g(Y_1)=0\) with probability 1. We can use the pdf of \(Y_1\) to compute the expectation. Let \(g(Y_1)\) be a function such that \(E[g(Y_1)]=0\) for all \(\theta\). Then \(0=\int_{\theta}^{\infty} g(y) \cdot n e^{-n(y-\theta)}\, dy\) for all \(\theta\), which is equivalent to \(\int_{\theta}^{\infty} g(y) e^{-ny}\, dy = 0\) for all \(\theta\). Differentiating with respect to \(\theta\) gives \(g(\theta) e^{-n\theta} = 0\), so \(g(\theta) = 0\) for every \(\theta\), i.e. \(g(y)=0\) for all \(y\). Hence \(Y_{1}\) is a complete statistic.
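The key step is differentiating under the integral sign; a short derivation, consistent with the pdf \(n e^{-n(y-\theta)}\) of \(Y_1\):

```latex
\[
0 = \int_{\theta}^{\infty} g(y)\, n e^{-n(y-\theta)}\, dy
\quad\Longleftrightarrow\quad
\int_{\theta}^{\infty} g(y)\, e^{-ny}\, dy = 0 \quad \text{for all } \theta .
\]
Differentiating both sides with respect to the lower limit \(\theta\) (Leibniz rule) gives
\[
-\,g(\theta)\, e^{-n\theta} = 0
\quad\Longrightarrow\quad
g(\theta) = 0 \quad \text{for all } \theta,
\]
so \(g(Y_1) = 0\) with probability 1.
```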
04

Find the unique function which is the MVUE of \(\theta\)

The Lehmann-Scheffé theorem states that if a statistic is complete and sufficient, then any unbiased estimator that is a function of that statistic is the unique MVUE. From the previous steps, we know that \(Y_{1}\) is complete and sufficient, so we need to find a function \(g(Y_{1})\) that is unbiased, i.e. one whose expected value equals \(\theta\). Computing the expectation with \(Y_{1}\)'s pdf gives \(E[Y_{1}] = \int_{\theta}^{\infty} y \cdot n e^{-n(y-\theta)}\, dy = \theta + \frac{1}{n}\), so \(Y_{1}\) itself is biased because its expectation does not equal \(\theta\). Correcting for this bias, the function \(g(Y_{1}) = Y_{1} - \frac{1}{n}\) is unbiased, since \(E[g(Y_{1})] = E[Y_{1}] - \frac{1}{n} = \theta\). This function of \(Y_{1}\) is therefore the MVUE of \(\theta\).
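Both the bias of \(Y_1\) and its correction can be verified by Monte Carlo; a minimal sketch (parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 10, 200_000

# Simulate `reps` samples of size n from the shifted exponential.
samples = theta + rng.exponential(1.0, size=(reps, n))
y1 = samples.min(axis=1)

# E[Y1] = theta + 1/n, so Y1 itself overestimates theta ...
print(y1.mean())       # approx theta + 1/n = 3.1
# ... while the bias-corrected estimator g(Y1) = Y1 - 1/n is unbiased.
mvue = y1 - 1 / n
print(mvue.mean())     # approx theta = 3.0
```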


Most popular questions from this chapter

The pdf depicted in Figure \(7.9.1\) is given by $$f_{m_{2}}(x)=e^{x}\left(1+m_{2}^{-1} e^{x}\right)^{-\left(m_{2}+1\right)}, \quad-\infty<x<\infty,$$ where \(m_{2}>0\) (the pdf graphed is for \(m_{2}=0.1\)). This is a member of a large family of pdfs, the log \(F\)-family, which are useful in survival (lifetime) analysis; see Chapter 3 of Hettmansperger and McKean (1998). (a) Let \(W\) be a random variable with pdf \((7.9.2)\). Show that \(W=\log Y\), where \(Y\) has an \(F\)-distribution with 2 and \(2 m_{2}\) degrees of freedom. (b) Show that the pdf becomes the logistic (6.1.8) if \(m_{2}=1\). (c) Consider the location model where $$X_{i}=\theta+W_{i}, \quad i=1, \ldots, n,$$ where \(W_{1}, \ldots, W_{n}\) are iid with pdf \((7.9.2)\). Similar to the logistic location model, the order statistics are minimal sufficient for this model. Show, similar to Example \(6.1.4\), that the mle of \(\theta\) exists.

Let a random sample of size \(n\) be taken from a distribution that has the pdf \(f(x ; \theta)=(1 / \theta) \exp (-x / \theta) I_{(0, \infty)}(x)\). Find the mle and the MVUE of \(P(X \leq 2)\).

Prove that the sum of the observations of a random sample of size \(n\) from a Poisson distribution having parameter \(\theta, 0<\theta<\infty\), is a sufficient statistic for \(\theta\).

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample with the common pdf \(f(x)=\theta^{-1} e^{-x / \theta}\), for \(x>0\), zero elsewhere; that is, \(f(x)\) is a \(\Gamma(1, \theta)\) pdf. (a) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\theta\). (b) Determine the MVUE of \(\theta\). (c) Determine the mle of \(\theta\). (d) Often, though, this pdf is written as \(f(x)=\tau e^{-\tau x}\), for \(x>0\), zero elsewhere. Thus \(\tau=1 / \theta\). Use Theorem \(6.1.2\) to determine the mle of \(\tau\). (e) Show that the statistic \(\bar{X}=n^{-1} \sum_{i=1}^{n} X_{i}\) is a complete and sufficient statistic for \(\tau\). Show that \((n-1) /(n \bar{X})\) is the MVUE of \(\tau=1 / \theta\). Hence, as usual, the reciprocal of the mle of \(\theta\) is the mle of \(1 / \theta\), but, in this situation, the reciprocal of the MVUE of \(\theta\) is not the MVUE of \(1 / \theta\). (f) Compute the variances of each of the unbiased estimators in Parts (b) and (e).

In a personal communication, LeRoy Folks noted that the inverse Gaussian pdf $$f\left(x ; \theta_{1}, \theta_{2}\right)=\left(\frac{\theta_{2}}{2 \pi x^{3}}\right)^{1 / 2} \exp \left[\frac{-\theta_{2}\left(x-\theta_{1}\right)^{2}}{2 \theta_{1}^{2} x}\right], \quad 0<x<\infty,$$ where \(\theta_{1}>0\) and \(\theta_{2}>0\), is often used to model lifetimes. Find the complete sufficient statistics for \(\left(\theta_{1}, \theta_{2}\right)\) if \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from the distribution having this pdf.
