

Prove Corollary 5.9.2.

Short Answer


Evaluating the multinomial moment generating function at \({t_{{i_1}}} = \ldots = {t_{{i_l}}} = t\) and zero elsewhere yields the binomial m.g.f. with parameters \(n\) and \({p_{{i_1}}} + \ldots + {p_{{i_l}}}\), which proves the corollary.

Step by step solution

01

Given information

Corollary 5.9.2 states: Suppose that the random vector \(X = \left( {{X_1}, \ldots ,{X_k}} \right)\) has the multinomial distribution with parameters \(n\) and \(p = \left( {{p_1}, \ldots ,{p_k}} \right)\), with \(k > 2\). Let \(l < k\), and let \({i_1}, \ldots ,{i_l}\) be distinct elements of the set \(\left\{ {1, \ldots ,k} \right\}\). Then the distribution of

\(Y = {X_{{i_1}}} + \ldots + {X_{{i_l}}}\)

is the binomial distribution with parameters \(n\) and \({p_{{i_1}}} + \ldots + {p_{{i_l}}}\).
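Although not part of the textbook proof, the statement can be sanity-checked numerically. The minimal sketch below (the values \(k = 3\), \(n = 5\), \(p = (0.2, 0.3, 0.5)\) are illustrative) sums the multinomial p.m.f. over all outcomes with \({x_1} + {x_2} = y\) and compares against the binomial p.m.f. with parameter \({p_1} + {p_2}\):

```python
from math import comb, factorial

def multinomial_pmf(counts, n, probs):
    """P(X = counts) for a Multinomial(n, probs) random vector."""
    if sum(counts) != n:
        return 0.0
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)
    prob = 1.0
    for c, p in zip(counts, probs):
        prob *= p ** c
    return coef * prob

def binomial_pmf(y, n, p):
    """P(Y = y) for a Binomial(n, p) random variable."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

# Illustrative case: k = 3, n = 5, Y = X_1 + X_2.
n, probs = 5, (0.2, 0.3, 0.5)
for y in range(n + 1):
    # Sum the multinomial p.m.f. over all (x1, x2) with x1 + x2 = y.
    lhs = sum(multinomial_pmf((x1, y - x1, n - y), n, probs) for x1 in range(y + 1))
    assert abs(lhs - binomial_pmf(y, n, probs[0] + probs[1])) < 1e-12
```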

02

Proof

Since \(X = \left( {{X_1}, \ldots ,{X_k}} \right)\) has the multinomial distribution, the joint m.g.f. of \({X_1}, \ldots ,{X_k}\) is

\(\begin{aligned}{M_{{X_1} \ldots {X_k}}}\left( {{t_1}, \ldots ,{t_k}} \right) &= E\left( {{e^{{t_1}{X_1} + \ldots + {t_k}{X_k}}}} \right)\\ &= \sum\limits_x {\frac{{n!}}{{{x_1}! \ldots {x_k}!}}p_1^{{x_1}} \ldots p_k^{{x_k}}\,{e^{{t_1}{x_1} + \ldots + {t_k}{x_k}}}} \\ &= \sum\limits_x {\frac{{n!}}{{{x_1}! \ldots {x_k}!}}{{\left( {{p_1}{e^{{t_1}}}} \right)}^{{x_1}}} \ldots {{\left( {{p_k}{e^{{t_k}}}} \right)}^{{x_k}}}} \\ &= {\left( {{p_1}{e^{{t_1}}} + \ldots + {p_k}{e^{{t_k}}}} \right)^n},\end{aligned}\)

where the sum runs over all nonnegative integer vectors \(x\) with \({x_1} + \ldots + {x_k} = n\), and the last equality is the multinomial theorem.
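The closed form above can be verified numerically: summing \(p\left( x \right){e^{t \cdot x}}\) over every multinomial outcome reproduces \({\left( {{p_1}{e^{{t_1}}} + \ldots + {p_k}{e^{{t_k}}}} \right)^n}\). The sketch below assumes \(k = 3\); the parameter values are illustrative:

```python
from math import exp, factorial

def joint_mgf_direct(n, probs, ts):
    """E[exp(t . X)] computed by brute-force summation over all outcomes (k = 3)."""
    total = 0.0
    for x1 in range(n + 1):
        for x2 in range(n + 1 - x1):
            x3 = n - x1 - x2
            coef = factorial(n) // (factorial(x1) * factorial(x2) * factorial(x3))
            pmf = coef * probs[0]**x1 * probs[1]**x2 * probs[2]**x3
            total += pmf * exp(ts[0]*x1 + ts[1]*x2 + ts[2]*x3)
    return total

def joint_mgf_closed(n, probs, ts):
    """The closed form (p_1 e^{t_1} + ... + p_k e^{t_k})^n."""
    return sum(p * exp(t) for p, t in zip(probs, ts)) ** n

# Illustrative parameters: n = 4, p = (0.2, 0.3, 0.5), t = (0.1, -0.2, 0.4).
n, probs, ts = 4, (0.2, 0.3, 0.5), (0.1, -0.2, 0.4)
assert abs(joint_mgf_direct(n, probs, ts) - joint_mgf_closed(n, probs, ts)) < 1e-9
```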

Setting every argument except \({t_1}\) equal to zero gives the m.g.f. of the marginal distribution of \({X_1}\):

\(\begin{aligned}{M_{{X_1}}}\left( {{t_1}} \right) &= {M_X}\left( {{t_1},0, \ldots ,0} \right)\\ &= {\left( {{p_1}{e^{{t_1}}} + {p_2} + \ldots + {p_k}} \right)^n}\\ &= {\left[ {\left( {1 - {p_1}} \right) + {p_1}{e^{{t_1}}}} \right]^n},\end{aligned}\)

which is the m.g.f. of the binomial distribution, so \({X_1} \sim B\left( {n,{p_1}} \right)\). Similarly, \({X_i} \sim B\left( {n,{p_i}} \right)\) for each \(i = 1,2, \ldots ,k\).

Now let \(Y = {X_{{i_1}}} + \ldots + {X_{{i_l}}}\) and write \(P = {p_{{i_1}}} + \ldots + {p_{{i_l}}}\). The components of a multinomial vector are not independent, so \({M_Y}\) is not the product of the marginal m.g.f.s. Instead, evaluate the joint m.g.f. with \({t_{{i_1}}} = \ldots = {t_{{i_l}}} = t\) and every other argument equal to zero:

\(\begin{aligned}{M_Y}\left( t \right) &= E\left( {{e^{t\left( {{X_{{i_1}}} + \ldots + {X_{{i_l}}}} \right)}}} \right)\\ &= {\left( {{p_{{i_1}}}{e^t} + \ldots + {p_{{i_l}}}{e^t} + \sum\limits_{j \notin \left\{ {{i_1}, \ldots ,{i_l}} \right\}} {{p_j}} } \right)^n}\\ &= {\left[ {\left( {1 - P} \right) + P{e^t}} \right]^n}\end{aligned}\)

This is the m.g.f. of the binomial distribution with parameters \(n\) and \(P\), and since an m.g.f. determines the distribution uniquely,

\(Y \sim B\left( {n,{p_{{i_1}}} + \ldots + {p_{{i_l}}}} \right)\)
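As an optional numerical check of this final step (the indices and parameter values below are illustrative), substituting \(t\) at the chosen coordinates of the joint m.g.f. and zero elsewhere matches the binomial m.g.f. with parameter \(P\):

```python
from math import exp

def joint_mgf(n, probs, ts):
    # The multinomial m.g.f. (p_1 e^{t_1} + ... + p_k e^{t_k})^n derived above.
    return sum(p * exp(t) for p, t in zip(probs, ts)) ** n

def binomial_mgf(n, p, t):
    # The binomial m.g.f. ((1 - p) + p e^t)^n.
    return ((1 - p) + p * exp(t)) ** n

# Illustrative case: k = 4, selected (0-based) indices {0, 2}, so P = p_1 + p_3.
n, probs, idx, t = 6, (0.1, 0.2, 0.3, 0.4), {0, 2}, 0.7
ts = [t if j in idx else 0.0 for j in range(len(probs))]
P = sum(probs[j] for j in idx)
assert abs(joint_mgf(n, probs, ts) - binomial_mgf(n, P, t)) < 1e-9
```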


