Chapter 6: Problem 2
Given \(f(x ; \theta)=1 / \theta,\ 0<x \leq \theta\), zero elsewhere.
Rao (1973, page 368) considers a problem in the estimation of linkages in genetics. McLachlan and Krishnan (1997) also discuss this problem, and we present their model. For our purposes it can be described as a multinomial model with the four categories \(C_{1}, C_{2}, C_{3}\), and \(C_{4}\). For a sample of size \(n\), let \(\mathbf{X}=\left(X_{1}, X_{2}, X_{3}, X_{4}\right)^{\prime}\) denote the observed frequencies of the four categories. Hence, \(n=\sum_{i=1}^{4} X_{i}\). The probability model is $$\begin{array}{|c|c|c|c|}\hline C_{1} & C_{2} & C_{3} & C_{4} \\ \hline \frac{1}{2}+\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4}-\frac{1}{4} \theta & \frac{1}{4} \theta \\ \hline\end{array}$$ where the parameter \(\theta\) satisfies \(0 \leq \theta \leq 1\). In this exercise, we obtain the mle of \(\theta\). (a) Show that the likelihood function is given by $$L(\theta \mid \mathbf{x})=\frac{n !}{x_{1} ! x_{2} ! x_{3} ! x_{4} !}\left[\frac{1}{2}+\frac{1}{4} \theta\right]^{x_{1}}\left[\frac{1}{4}-\frac{1}{4} \theta\right]^{x_{2}+x_{3}}\left[\frac{1}{4} \theta\right]^{x_{4}}$$ (b) Show that the log of the likelihood function can be expressed as a constant (not involving parameters) plus the term $$ x_{1} \log [2+\theta]+\left[x_{2}+x_{3}\right] \log [1-\theta]+x_{4} \log \theta $$ (c) Obtain the partial derivative of the last expression with respect to \(\theta\), set the result to 0, and solve for the mle. (This will result in a quadratic equation which has one positive and one negative root.)
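For part (c), differentiating the reduced log likelihood and clearing denominators gives the quadratic \(n\theta^{2}-\left(x_{1}-2x_{2}-2x_{3}-x_{4}\right)\theta-2x_{4}=0\); since the product of its roots is \(-2x_{4}/n<0\), one root is positive and one negative, and the positive root is the mle. A minimal numerical sketch follows; the counts \((125, 18, 20, 34)\) are the ones commonly quoted for Rao's linkage example and are an assumption here, not part of this exercise.

```python
import math

def mle_linkage(x1, x2, x3, x4):
    """Positive root of n*t^2 - (x1 - 2*x2 - 2*x3 - x4)*t - 2*x4 = 0, part (c)."""
    n = x1 + x2 + x3 + x4
    b = x1 - 2 * x2 - 2 * x3 - x4
    return (b + math.sqrt(b * b + 8 * n * x4)) / (2 * n)

def score(theta, x1, x2, x3, x4):
    """Derivative of the reduced log likelihood in part (b)."""
    return x1 / (2 + theta) - (x2 + x3) / (1 - theta) + x4 / theta

# Counts commonly quoted for Rao's linkage data (an assumption, not from
# this exercise): x = (125, 18, 20, 34), so n = 197.
theta_hat = mle_linkage(125, 18, 20, 34)
print(round(theta_hat, 6))  # roughly 0.626821
```

As a sanity check, the score function evaluated at the returned root is zero up to rounding, confirming it is a stationary point of the log likelihood.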
Let \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from a \(N\left(\mu_{0}, \sigma^{2}=\theta\right)\) distribution, where \(0<\theta<\infty\) and \(\mu_{0}\) is known. Show that the likelihood ratio test of \(H_{0}: \theta=\theta_{0}\) versus \(H_{1}: \theta \neq \theta_{0}\) can be based upon the statistic \(W=\sum_{i=1}^{n}\left(X_{i}-\mu_{0}\right)^{2} / \theta_{0}\). Determine the null distribution of \(W\) and give, explicitly, the rejection rule for a level \(\alpha\) test.
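Under \(H_{0}\), \(W\) is a sum of \(n\) squared standard normals, so \(W \sim \chi^{2}(n)\), and one common level-\(\alpha\) rule rejects when \(W \leq \chi_{\alpha/2}^{2}(n)\) or \(W \geq \chi_{1-\alpha/2}^{2}(n)\). A standard-library Monte Carlo sketch checking the null mean \(n\) and variance \(2n\) of \(W\); the sample size, parameter values, and replication count are illustrative assumptions:

```python
import random

def W(xs, mu0, theta0):
    """Test statistic: sum of squared deviations from mu0, scaled by theta0."""
    return sum((x - mu0) ** 2 for x in xs) / theta0

# Simulate W under H0 and check it behaves like chi-square(n):
# mean n and variance 2n. All numbers below are illustrative choices.
random.seed(0)
n, mu0, theta0, reps = 10, 5.0, 2.0, 20000
ws = [W([random.gauss(mu0, theta0 ** 0.5) for _ in range(n)], mu0, theta0)
      for _ in range(reps)]
mean_w = sum(ws) / reps
var_w = sum((w - mean_w) ** 2 for w in ws) / reps
print(mean_w, var_w)  # close to n = 10 and 2n = 20
```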
Prove that \(\bar{X}\), the mean of a random sample of size \(n\) from a distribution that is \(N\left(\theta, \sigma^{2}\right),-\infty<\theta<\infty\), is, for every known \(\sigma^{2}>0\), an efficient estimator of \(\theta\).
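One way to organize the proof, sketched here: compute the Fisher information for the \(N(\theta, \sigma^{2})\) model and compare \(\operatorname{Var}(\bar{X})\) with the Cramér–Rao lower bound.

```latex
% Log density and score for one observation from N(theta, sigma^2):
\log f(x ; \theta) = -\log\!\left(\sigma\sqrt{2\pi}\right)
  - \frac{(x-\theta)^{2}}{2\sigma^{2}},
\qquad
\frac{\partial}{\partial \theta} \log f(x ; \theta)
  = \frac{x-\theta}{\sigma^{2}}.
% Fisher information:
I(\theta) = E\!\left[\left(\frac{X-\theta}{\sigma^{2}}\right)^{2}\right]
  = \frac{\sigma^{2}}{\sigma^{4}} = \frac{1}{\sigma^{2}}.
% Cramer-Rao lower bound for unbiased estimators based on n observations:
\frac{1}{n I(\theta)} = \frac{\sigma^{2}}{n} = \operatorname{Var}(\bar{X}).
```

Since \(\bar{X}\) is unbiased for \(\theta\) and its variance attains the bound for every known \(\sigma^{2}>0\), it is efficient.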
Suppose \(X_{1}, X_{2}, \ldots, X_{n_{1}}\) are a random sample from a \(N(\theta, 1)\) distribution, and suppose \(Z_{1}, Z_{2}, \ldots, Z_{n_{2}}\) are \(n_{2}\) further observations from the same distribution that are missing. Show that the first-step EM estimate is $$\hat{\theta}^{(1)}=\frac{n_{1} \bar{x}+n_{2} \widehat{\theta}^{(0)}}{n},$$ where \(\widehat{\theta}^{(0)}\) is an initial estimate of \(\theta\) and \(n=n_{1}+n_{2}\). Note that if \(\widehat{\theta}^{(0)}=\bar{x}\), then \(\widehat{\theta}^{(k)}=\bar{x}\) for all \(k\).
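The update replaces each missing \(Z_{j}\) by its conditional mean, the current estimate, and re-averages the completed data. A short sketch of the iteration; the values of \(\bar{x}\), \(n_{1}\), and \(n_{2}\) are illustrative assumptions, not taken from the exercise:

```python
def em_step(theta_prev, xbar, n1, n2):
    """One EM update: each missing Z is imputed by theta_prev, then theta is
    re-estimated as the mean of the completed sample of size n = n1 + n2."""
    n = n1 + n2
    return (n1 * xbar + n2 * theta_prev) / n

# Illustrative values (assumptions): n1 = 8 observed, n2 = 4 missing.
xbar, n1, n2 = 3.0, 8, 4
theta = 0.0                      # arbitrary starting value
for _ in range(100):
    theta = em_step(theta, xbar, n1, n2)
print(round(theta, 6))           # iterates converge to xbar = 3.0

# Starting at theta^(0) = xbar is a fixed point of the update.
print(em_step(xbar, xbar, n1, n2) == xbar)
```

Each step contracts the error toward \(\bar{x}\) by the factor \(n_{2}/n\), which is why the iterates converge to \(\bar{x}\) from any start and why \(\bar{x}\) itself is a fixed point.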
Let \(X\) be \(N(0, \theta)\), \(0<\theta<\infty\). (a) Find the Fisher information \(I(\theta)\). (b) If \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from this distribution, show that the mle of \(\theta\) is an efficient estimator of \(\theta\). (c) What is the asymptotic distribution of \(\sqrt{n}(\widehat{\theta}-\theta)\)?
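Here \(I(\theta)=1/(2\theta^{2})\), the mle is \(\widehat{\theta}=n^{-1}\sum X_{i}^{2}\), and part (c) gives \(\sqrt{n}(\widehat{\theta}-\theta) \xrightarrow{D} N(0, 2\theta^{2})\). A Monte Carlo sketch checking that \(\operatorname{Var}(\widehat{\theta})\) is near the bound \(2\theta^{2}/n = 1/(nI(\theta))\); the values of \(\theta\), \(n\), and the replication count are illustrative assumptions:

```python
import random

def mle_var(xs):
    """MLE of theta in N(0, theta): the average of the squared observations."""
    return sum(x * x for x in xs) / len(xs)

# Simulate the mle repeatedly and compare its mean and variance with
# theta and 2*theta^2/n. All numbers below are illustrative choices.
random.seed(1)
theta, n, reps = 4.0, 50, 20000
ests = [mle_var([random.gauss(0.0, theta ** 0.5) for _ in range(n)])
        for _ in range(reps)]
mean_est = sum(ests) / reps
var_est = sum((e - mean_est) ** 2 for e in ests) / reps
print(mean_est, var_est)  # near theta = 4.0 and 2*theta**2/n = 0.64
```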