
Question: Consider again the conditions in Exercise 2, but suppose also that it is known that \(\frac{1}{2} \le p \le \frac{2}{3}\). If the observations in the random sample of 70 purchases are as given in Exercise 2, what is the M.L.E. of p?

Short Answer


\(\hat p = \frac{2}{3}\)

Step by step solution

01

Given information

From Exercise 2, the sample size is n = 70 with x = 58 successes, and it is also known that \(\frac{1}{2} \le p \le \frac{2}{3}\). We need to find the M.L.E. of p.

02

Calculation of the M.L.E. of p

We are given that \(\frac{1}{2} \le p \le \frac{2}{3}\), and we want the value of p that maximizes the likelihood. Each observation follows a Bernoulli distribution with success probability p, so the likelihood of observing x successes in n trials is \(f\left( x \right) = {p^x}{\left( {1 - p} \right)^{n - x}}\). Here x = 58 and n = 70. The unconstrained maximizer is \(x/n = 58/70 \approx 0.829\), which lies above the permitted interval \(\left[ {\frac{1}{2},\frac{2}{3}} \right]\). Since the likelihood is increasing in p for \(p < x/n\), it is maximized over the interval at the upper endpoint, \(p = \frac{2}{3}\).

So, the required M.L.E. is \(p = \frac{2}{3}\)
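The endpoint argument can be checked numerically. A minimal Python sketch (the counts x = 58 and n = 70 come from Exercise 2; comparing only the two endpoints suffices because the log-likelihood is monotone increasing on the whole interval, which lies below x/n):

```python
import math

# Log-likelihood of a Bernoulli sample with x successes in n trials
# (counts from Exercise 2: x = 58, n = 70).
def log_likelihood(p, x=58, n=70):
    return x * math.log(p) + (n - x) * math.log(1 - p)

# The unconstrained MLE x/n ~ 0.829 lies above [1/2, 2/3], and the
# log-likelihood is increasing for p < x/n, so the constrained maximum
# must sit at one of the interval's endpoints -- here, the upper one.
candidates = [1 / 2, 2 / 3]
mle = max(candidates, key=log_likelihood)
```

Evaluating both endpoints confirms the likelihood is larger at 2/3 than at 1/2, in agreement with the solution above.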


Most popular questions from this chapter

Suppose that \({X_1},...,{X_n}\) form a random sample from the gamma distribution specified in Exercise 6. Show that the statistic \(T = \sum\limits_{i = 1}^n {\log {X_i}} \) is a sufficient statistic for the parameter \(\alpha \).

Suppose that a random sample \({X_1},...,{X_n}\) is drawn from the Pareto distribution with parameters \({x_0}\) and \(\alpha \).

a. If \({x_0}\) is known and \(\alpha > 0\) is unknown, find a sufficient statistic.

b. If \(\alpha \) is known and \({x_0}\) is unknown, find a sufficient statistic.

Assume that the random variables \({X_1},...,{X_n}\) form a random sample of size n from the distribution specified in that exercise, and show that the statistic T specified in the exercise is a sufficient statistic for the parameter.

1. The negative binomial distribution with parameters r and p, where r is known and p is unknown \(\left( {0 < p < 1} \right)\); \(T = \sum\limits_{i = 1}^n {{X_i}} \).

Consider a distribution for which the p.d.f. or the p.f. is \(f\left( {x|\theta } \right)\), where the parameter θ is a k-dimensional vector belonging to some parameter space \(\Omega \). It is said that the family of distributions indexed by the values of θ in \(\Omega \) is a k-parameter exponential family, or a k-parameter Koopman-Darmois family, if \(f\left( {x|\theta } \right)\) can be written as follows for \(\theta \in \Omega \) and all values of x:

\(f\left( {x|\theta } \right) = a\left( \theta \right)b\left( x \right)\exp \left( {\sum\limits_{i = 1}^k {{c_i}\left( \theta \right){d_i}\left( x \right)} } \right)\)

Here, a and \({c_1},...,{c_k}\) are arbitrary functions of θ, and b and \({d_1},...,{d_k}\) are arbitrary functions of x. Suppose now that \({X_1},...,{X_n}\) form a random sample from a distribution which belongs to a k-parameter exponential family of this type, and define the k statistics \({T_1},...,{T_k}\) as follows:

\({T_i} = \sum\limits_{j = 1}^n {{d_i}\left( {{X_j}} \right)} \)

Show that the statistics \({T_1},...,{T_k}\) are jointly sufficient statistics for θ.
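To illustrate how the statistics \({T_i} = \sum\nolimits_{j = 1}^n {{d_i}\left( {{X_j}} \right)} \) are formed, here is a minimal Python sketch. The choice \({d_1}\left( x \right) = x\), \({d_2}\left( x \right) = {x^2}\) corresponds to the normal distribution with both parameters unknown, a standard 2-parameter exponential family; the simulated sample is purely illustrative:

```python
import random

# Given the functions d_1, ..., d_k of an exponential family, the
# jointly sufficient statistics are the component-wise sums over the sample:
#   T_i = sum_j d_i(X_j)
def sufficient_statistics(sample, d_funcs):
    return [sum(d(x) for x in sample) for d in d_funcs]

# Normal family example: d_1(x) = x, d_2(x) = x^2, so
# T_1 = sum of the observations, T_2 = sum of their squares.
random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(100)]
t1, t2 = sufficient_statistics(sample, [lambda x: x, lambda x: x * x])
```

Once \(T_1, ..., T_k\) are computed, the factorization criterion shows the rest of the sample carries no further information about θ.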

Let \(\xi \left( \theta \right)\) be a p.d.f. that is defined as follows for constants \(\alpha > 0\) and \(\beta > 0\):

\(\xi \left( \theta \right) = \left\{ \begin{aligned}{l}\frac{{{\beta ^\alpha }}}{{\Gamma \left( \alpha \right)}}{\theta ^{ - \left( {\alpha + 1} \right)}}{e^{ - \beta /\theta }}\,for\,\theta > 0\\0\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,\,for\,\,\,\theta \le 0\end{aligned} \right.\)

A distribution with this p.d.f. is called an inverse gamma distribution.

a. Verify that \(\xi \left( \theta \right)\) is actually a p.d.f. by verifying that

\(\int_0^\infty {\xi \left( \theta \right)} d\theta = 1\)

b. Consider the family of probability distributions that can be represented by a p.d.f. \(\xi \left( \theta \right)\) having the given form for all possible pairs of constants \(\alpha > 0\) and \(\beta > 0\). Show that this family is a conjugate family of prior distributions for samples from a normal distribution with a known value of the mean \(\mu \) and an unknown value of the variance \(\theta \).
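Part (a) can also be checked numerically. A rough Python sketch (the values \(\alpha = 3\), \(\beta = 2\) are arbitrary illustrative choices; the midpoint rule on a truncated range is only an approximation of the integral):

```python
import math

# Inverse-gamma density with shape alpha and scale beta,
# zero for theta <= 0, as defined in the question above.
def inv_gamma_pdf(theta, alpha, beta):
    if theta <= 0:
        return 0.0
    return (beta ** alpha / math.gamma(alpha)) * theta ** (-(alpha + 1)) * math.exp(-beta / theta)

# Crude midpoint-rule quadrature on [lo, hi]; for alpha = 3 the tail
# beyond theta = 200 contributes a negligible amount of mass.
def integral(alpha, beta, lo=1e-6, hi=200.0, n=200_000):
    h = (hi - lo) / n
    return h * sum(inv_gamma_pdf(lo + (i + 0.5) * h, alpha, beta) for i in range(n))

approx = integral(3, 2)  # should be close to 1
```

The analytic verification proceeds by the substitution \(u = \beta /\theta \), which reduces the integral to \(\frac{1}{{\Gamma \left( \alpha \right)}}\int_0^\infty {{u^{\alpha - 1}}{e^{ - u}}du} = 1\); the numeric check merely corroborates this.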
