Problem 5


Let \(Y_{1}, Y_{2}, \ldots, Y_{n}\) be a random sample from the uniform distribution on the interval \([\theta-\rho, \theta+\rho]\), where \(\rho > 0\). Find the maximum likelihood estimators of \(\theta\) and \(\rho\). Are these estimators unbiased?

Short Answer

Expert verified
The maximum likelihood estimators for \(\theta\) and \(\rho\) are \(\hat{\theta} = \frac{\min(Y_i)+\max(Y_i)}{2}\) and \(\hat{\rho} = \frac{\max(Y_i)-\min(Y_i)}{2}\) respectively. The estimator \(\hat{\theta}\) is unbiased, but \(\hat{\rho}\) is biased: \(E(\hat{\rho}) = \frac{n-1}{n+1}\rho\).

Step by step solution

01

Write down the pdf and likelihood function

The pdf of the uniform distribution on the interval \([\theta-\rho, \theta+\rho]\) is \(f(Y_{i} \mid \theta, \rho) = \frac{1}{2\rho}\) for \(\theta-\rho \leq Y_{i} \leq \theta+\rho\) and 0 otherwise. The likelihood function given the data \(Y_{1},\ldots,Y_{n}\) is therefore \(L(\theta, \rho \mid Y_1,\ldots,Y_n) = \left(\frac{1}{2\rho}\right)^n\) provided \(\theta - \rho \leq \min(Y_1,\ldots,Y_n)\) and \(\max(Y_1,\ldots,Y_n) \leq \theta + \rho\), and 0 otherwise.
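As a sanity check, the boxed likelihood can be evaluated directly. This is a minimal sketch; the function name and the toy data are illustrative, not from the text.

```python
def likelihood(theta, rho, sample):
    """Likelihood of a Uniform[theta - rho, theta + rho] sample:
    (1/(2*rho))**n inside the feasible region, 0 otherwise."""
    if rho <= 0:
        return 0.0
    # Zero as soon as any observation falls outside the interval.
    if min(sample) < theta - rho or max(sample) > theta + rho:
        return 0.0
    return (1.0 / (2.0 * rho)) ** len(sample)

y = [2.1, 3.7, 2.9]  # toy data (hypothetical)
```

With these data, `likelihood(2.9, 1.0, y)` covers all three points and returns \((1/2)^3 = 0.125\), while `likelihood(3.0, 0.5, y)` returns 0 because the interval \([2.5, 3.5]\) misses 2.1.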
02

Find the Maximum Likelihood Estimators (MLEs)

To find the MLEs we need to maximize the likelihood function. Maximizing \(L(\theta, \rho \mid Y_1,\ldots,Y_n)\) is equivalent to maximizing \(\ln L(\theta, \rho \mid Y_1,\ldots,Y_n) = -n \ln(2\rho)\), which is decreasing in \(\rho\); so \(L\) is maximized by making \(\rho\) as small as possible while every observation stays inside the interval. The constraints \(\theta - \rho \leq \min(Y_1,\ldots,Y_n)\) and \(\max(Y_1,\ldots,Y_n) \leq \theta + \rho\) require \(\rho \geq \max\{\theta - \min(Y_i),\, \max(Y_i) - \theta\}\), and this lower bound is smallest when the two terms are equal, i.e. when \(\theta\) is the midpoint of the extremes. This gives the MLEs \(\hat{\theta} = \frac{\min(Y_i)+\max(Y_i)}{2}\) and \(\hat{\rho} = \frac{\max(Y_i)-\min(Y_i)}{2}\).
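The resulting estimators can be sketched in a few lines. The helper name `uniform_mles` and the chosen values \(\theta = 5\), \(\rho = 2\) are arbitrary illustrations.

```python
import random

def uniform_mles(sample):
    """MLEs for the center theta and half-width rho of a
    Uniform[theta - rho, theta + rho] sample."""
    lo, hi = min(sample), max(sample)
    return (lo + hi) / 2, (hi - lo) / 2

# Simulate a sample with known parameters and recover them.
random.seed(0)
theta, rho = 5.0, 2.0
sample = [random.uniform(theta - rho, theta + rho) for _ in range(1000)]
theta_hat, rho_hat = uniform_mles(sample)
# theta_hat is close to 5; rho_hat sits just below 2, since the
# sample range can never exceed the true interval width.
```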
03

Check if the MLEs are unbiased

An estimator is unbiased if its expected value is equal to the parameter it is estimating, i.e. \(E(\hat{\theta}) = \theta\) and \(E(\hat{\rho}) = \rho\). For the uniform distribution on \([\theta-\rho, \theta+\rho]\), standard order-statistic results give \(E(\max(Y_i)) = \theta + \frac{n-1}{n+1}\rho\) and \(E(\min(Y_i)) = \theta - \frac{n-1}{n+1}\rho\). Hence \(E(\hat{\theta}) = \theta\), so \(\hat{\theta}\) is unbiased; but \(E(\hat{\rho}) = \frac{n-1}{n+1}\rho \neq \rho\), so \(\hat{\rho}\) is biased (though asymptotically unbiased, and \(\frac{n+1}{n-1}\hat{\rho}\) is an unbiased correction).
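The expectations can be checked by Monte Carlo. In this sketch the parameter values, sample size, replication count, and seed are arbitrary choices; with \(\theta = 0\), \(\rho = 1\), \(n = 10\), the simulated means should land near \(\theta = 0\) and \(\rho \frac{n-1}{n+1} = 9/11 \approx 0.818\).

```python
import random

random.seed(1)
theta, rho, n, reps = 0.0, 1.0, 10, 20000

sum_theta_hat = 0.0
sum_rho_hat = 0.0
for _ in range(reps):
    s = [random.uniform(theta - rho, theta + rho) for _ in range(n)]
    lo, hi = min(s), max(s)
    sum_theta_hat += (lo + hi) / 2   # theta_hat for this replicate
    sum_rho_hat += (hi - lo) / 2     # rho_hat for this replicate

mean_theta_hat = sum_theta_hat / reps  # ~ theta = 0 (unbiased)
mean_rho_hat = sum_rho_hat / reps      # ~ rho*(n-1)/(n+1) = 9/11, not rho
```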


Most popular questions from this chapter

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a \(N\left(\theta, \sigma^{2}\right)\) distribution, where \(\sigma^{2}\) is fixed but \(-\infty<\theta<\infty\). (a) Show that the mle of \(\theta\) is \(\bar{X}\). (b) If \(\theta\) is restricted by \(0 \leq \theta<\infty\), show that the mle of \(\theta\) is \(\widehat{\theta}=\max \{0, \bar{X}\}\).

Let \(X\) have a gamma distribution with \(\alpha=4\) and \(\beta=\theta>0\). (a) Find the Fisher information \(I(\theta)\). (b) If \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from this distribution, show that the mle of \(\theta\) is an efficient estimator of \(\theta\). (c) What is the asymptotic distribution of \(\sqrt{n}(\hat{\theta}-\theta) ?\)

Rao (1973, page 368) considers a problem in the estimation of linkages in genetics. McLachlan and Krishnan (1997) also discuss this problem and we present their model. For our purposes, it can be described as a multinomial model with the four categories \(C_{1}, C_{2}, C_{3}\), and \(C_{4}\). For a sample of size \(n\), let \(\mathbf{X}=\left(X_{1}, X_{2}, X_{3}, X_{4}\right)^{\prime}\) denote the observed frequencies of the four categories. Hence, \(n=\sum_{i=1}^{4} X_{i}\). The probability model is $$\begin{array}{|c|c|c|c|}\hline C_{1} & C_{2} & C_{3} & C_{4} \\ \hline \frac{1}{2}+\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4} \theta \\ \hline\end{array}$$ where the parameter \(\theta\) satisfies \(0 \leq \theta \leq 1\). In this exercise, we obtain the mle of \(\theta\). (a) Show that the likelihood function is given by $$L(\theta \mid \mathbf{x})=\frac{n !}{x_{1} ! x_{2} ! x_{3} ! x_{4} !}\left[\frac{1}{2}+\frac{1}{4} \theta\right]^{x_{1}}\left[\frac{1}{4}-\frac{1}{4} \theta\right]^{x_{2}+x_{3}}\left[\frac{1}{4} \theta\right]^{x_{4}}$$ (b) Show that the log of the likelihood function can be expressed as a constant (not involving parameters) plus the term $$x_{1} \log [2+\theta]+\left[x_{2}+x_{3}\right] \log [1-\theta]+x_{4} \log \theta$$ (c) Obtain the partial derivative with respect to \(\theta\) of the last expression, set the result to 0, and solve for the mle. (This will result in a quadratic equation which has one positive and one negative root.)
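Part (c) can be verified numerically. Clearing denominators in \(x_{1}/(2+\theta) - (x_{2}+x_{3})/(1-\theta) + x_{4}/\theta = 0\) gives the quadratic \(n\theta^{2} - \left[x_{1} - 2(x_{2}+x_{3}) - x_{4}\right]\theta - 2x_{4} = 0\), whose positive root is the mle. The sketch below uses the counts \((125, 18, 20, 34)\) often quoted with Rao's example; the function name is illustrative.

```python
import math

def linkage_mle(x1, x2, x3, x4):
    """Positive root of n*t^2 - b*t - 2*x4 = 0, obtained by setting the
    derivative of x1*log(2+t) + (x2+x3)*log(1-t) + x4*log(t) to zero
    and clearing denominators."""
    n = x1 + x2 + x3 + x4
    b = x1 - 2 * (x2 + x3) - x4
    return (b + math.sqrt(b * b + 8 * n * x4)) / (2 * n)

theta_hat = linkage_mle(125, 18, 20, 34)  # ≈ 0.6268
```

As a check, the derivative of the log-likelihood vanishes at the returned root, and the negative root of the quadratic lies outside \([0, 1]\).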

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be iid, each with the distribution having pdf \(f\left(x ; \theta_{1}, \theta_{2}\right)=\left(1 / \theta_{2}\right) e^{-\left(x-\theta_{1}\right) / \theta_{2}}\), \(\theta_{1} \leq x<\infty\), \(-\infty<\theta_{1}<\infty\), \(0<\theta_{2}<\infty\), zero elsewhere. Find the maximum likelihood estimators of \(\theta_{1}\) and \(\theta_{2}\).

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a \(N\left(\mu_{0}, \sigma^{2}=\theta\right)\) distribution, where \(0<\theta<\infty\) and \(\mu_{0}\) is known. Show that the likelihood ratio test of \(H_{0}: \theta=\theta_{0}\) versus \(H_{1}: \theta \neq \theta_{0}\) can be based upon the statistic \(W=\sum_{i=1}^{n}\left(X_{i}-\mu_{0}\right)^{2} / \theta_{0}\). Determine the null distribution of \(W\) and give, explicitly, the rejection rule for a level \(\alpha\) test.
