Q12E


Question: Suppose that a certain large population contains k different types of individuals (k ≥ 2), and let \({\theta _i}\) denote the proportion of individuals of type i, for i = 1,...,k. Here, \(0 \le {\theta _i} \le 1\) and \({\theta _1} + ... + {\theta _k} = 1\). Suppose also that in a random sample of n individuals from this population, exactly \({n_i}\) individuals are of type i, where \({n_1} + ... + {n_k} = n\). Find the M.L.E.'s of \({\theta _1},...,{\theta _k}\).

Short Answer

Expert verified

The M.L.E. of each parameter is \({\hat \theta _i} = \frac{{{n_i}}}{n}\).

Step by step solution

01

Given information

In a population, \({\theta _i}\) denotes the proportion of individuals of type i, for i = 1,...,k. Here, \(0 \le {\theta _i} \le 1\) and \({\theta _1} + ... + {\theta _k} = 1\), as well as \({n_1} + ... + {n_k} = n\). We need to calculate the M.L.E.'s of \({\theta _1},...,{\theta _k}\).

02

Calculation of the M.L.E.'s of \({\theta _1},...,{\theta _k}\)

Let \(f\left( x \right) = \theta _1^{{n_1}}...\theta _k^{{n_k}}\) and \(L\left( {{\theta _1},...,{\theta _k}} \right) = \log f\left( x \right)\), where the constraint gives \({\theta _k} = 1 - \sum\limits_{i = 1}^{k - 1} {{\theta _i}} \).

By maximizing the log of the likelihood function we will obtain the M.L.E.

Thus,

\(\begin{array}{c}L\left( \theta  \right) = \log f\left( {x;\theta } \right)\\ = \sum\limits_{i = 1}^k {{n_i}\log {\theta _i}} \end{array}\)
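The log-likelihood above is easy to evaluate numerically. A minimal sketch in Python (the counts and proportions below are hypothetical, not taken from the problem):

```python
import math

def log_likelihood(theta, counts):
    """Multinomial log-likelihood L(theta) = sum_i n_i * log(theta_i)."""
    return sum(n * math.log(t) for n, t in zip(counts, theta))

counts = [3, 5, 2]          # hypothetical n_i, so n = 10
theta = [0.3, 0.5, 0.2]     # candidate proportions summing to 1
print(log_likelihood(theta, counts))
```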

Now, take the partial derivative with respect to each free parameter \({\theta _i}\), for i = 1,...,k−1, remembering that \({\theta _k}\) depends on the other parameters through the constraint:

\(\frac{{\partial L\left( \theta  \right)}}{{\partial {\theta _i}}} = \frac{{{n_i}}}{{{\theta _i}}} - \frac{{{n_k}}}{{{\theta _k}}}\)
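This derivative can be checked with a central finite difference, treating \({\theta _k} = 1 - \sum\nolimits_{i=1}^{k-1} {\theta _i}\) as a function of the free parameters. A small sketch under hypothetical counts:

```python
import math

def L(free_theta, counts):
    # free_theta holds theta_1..theta_{k-1}; theta_k = 1 - sum(free_theta)
    theta = list(free_theta) + [1 - sum(free_theta)]
    return sum(n * math.log(t) for n, t in zip(counts, theta))

counts = [3, 5, 2]              # hypothetical counts, k = 3
free = [0.25, 0.45]             # so theta_3 = 0.30
h = 1e-6
i = 0                           # differentiate w.r.t. theta_1

up = free[:]; up[i] += h
dn = free[:]; dn[i] -= h
numeric = (L(up, counts) - L(dn, counts)) / (2 * h)

# Analytic formula from the text: n_i / theta_i - n_k / theta_k
analytic = counts[i] / free[i] - counts[-1] / (1 - sum(free))
print(numeric, analytic)
```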

Setting the partial derivatives equal to zero, the following system of equations is obtained:

\(\frac{{{\theta _1}}}{{{n_1}}} = \frac{{{\theta _2}}}{{{n_2}}} = ... = \frac{{{\theta _k}}}{{{n_k}}} = c\)

Now, from the fact that \(\sum\limits_{i = 1}^k {{\theta _i}}  = 1\) and \({\theta _i} = c{n_i}\), we get

\(1 = \sum\limits_{i = 1}^k {{\theta _i}}  = c\sum\limits_{i = 1}^k {{n_i}}  = cn\). So the maximum is obtained when \(c = \frac{1}{n}\).

Hence the M.L.E. is \({\hat \theta _i} = \frac{{{n_i}}}{n}\) for i = 1,...,k.
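As a sanity check, the sample proportions \({n_i}/n\) should beat any other probability vector on the log-likelihood. A sketch with hypothetical counts, comparing the M.L.E. against random points on the simplex:

```python
import math
import random

def log_likelihood(theta, counts):
    """Multinomial log-likelihood: sum_i n_i * log(theta_i)."""
    return sum(n * math.log(t) for n, t in zip(counts, theta))

counts = [3, 5, 2]                    # hypothetical sample counts, n = 10
n = sum(counts)
mle = [c / n for c in counts]         # theta_hat_i = n_i / n

# Spot-check: random probability vectors never exceed the M.L.E.'s value.
random.seed(0)
best = log_likelihood(mle, counts)
beaten = False
for _ in range(1000):
    raw = [random.random() + 1e-9 for _ in counts]
    s = sum(raw)
    other = [r / s for r in raw]
    if log_likelihood(other, counts) > best + 1e-12:
        beaten = True
print(mle, beaten)  # [0.3, 0.5, 0.2] False
```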


Most popular questions from this chapter

The uniform distribution on the integers \(1,2,3,...,\theta \), as defined in Sec. 3.1, where the value of \(\theta \) is unknown \(\left( {\theta  = 1,2,...} \right)\); \(T = \max \left( {{X_1},...,{X_n}} \right)\).

Question: Suppose that each of two statisticians A and B must estimate a certain parameter \(\theta \) whose value is unknown (\(\theta \) > 0). Statistician A can observe the value of a random variable X, which has the gamma distribution with parameters \(\alpha \) and \(\beta \), where \(\alpha  = 3\) and \(\beta  = \theta \); statistician B can observe the value of a random variable Y, which has the Poisson distribution with mean \(\theta \). Suppose that the value observed by statistician A is X = 2 and the value observed by statistician B is Y = 3. Show that the likelihood functions determined by these observed values are proportional, and find the common value of the M.L.E. of \(\theta \) obtained by each statistician.

Suppose that we model the lifetimes (in months) of electronic components as independent exponential random variables with unknown parameter \(\beta \). We model \(\beta \) as having the gamma distribution with parameters a and b. We believe that the mean lifetime is four months before we see any data. If we were to observe 10 components with an average observed lifetime of six months, we would then claim that the mean lifetime is five months. Determine a and b. Hint: Use Exercise 21 in Sec. 5.7.

Question: In a clinical trial, let the probability of successful outcome have a prior distribution that is the uniform distribution on the interval \(\left[ {0,1} \right]\), which is also the beta distribution with parameters 1 and 1. Suppose that the first patient has a successful outcome. Find the Bayes estimates of the success probability that would be obtained for both the squared error and absolute error loss functions.

Consider the data in Example 7.3.10. This time, suppose that we use the improper prior "p.d.f." \(\xi \left( \theta  \right) = 1\) (for all \(\theta \)). Find the posterior distribution of \(\theta \) and the posterior probability that \(\theta  > 1\).
