Q6E


Suppose that X and Y are independent random variables, X has the gamma distribution with parameters α1 and β, and Y has the gamma distribution with parameters α2 and β. Let U = X/ (X + Y) and V = X + Y. Show that (a) U has the beta distribution with parameters α1 and α2, and (b) U and V are independent.

Short Answer

Expert verified

(a) U has the beta distribution with parameters α1 and α2, and

(b) U and V are independent

Step by step solution

01

Given information

X and Y are independent random variables, X has the gamma distribution with parameters α1 and β, and Y has the gamma distribution with parameters α2 and β.

Let U = X/(X + Y) and V = X + Y. We need to show that

(a) U has the beta distribution with parameters \(\alpha_1\) and \(\alpha_2\), and

(b) U and V are independent

02

Proof of part (a)

The marginal probability density functions of the random variables X and Y are

\(f\left( x \right) = \frac{\beta^{\alpha_1}}{\Gamma\left( \alpha_1 \right)}x^{\alpha_1 - 1}e^{-\beta x}\) and \(f\left( y \right) = \frac{\beta^{\alpha_2}}{\Gamma\left( \alpha_2 \right)}y^{\alpha_2 - 1}e^{-\beta y}\)

For x > 0 and y > 0, the joint probability density function is

\(\begin{aligned}f\left( x,y \right) &= \frac{\beta^{\alpha_1}}{\Gamma\left( \alpha_1 \right)}x^{\alpha_1 - 1}e^{-\beta x} \times \frac{\beta^{\alpha_2}}{\Gamma\left( \alpha_2 \right)}y^{\alpha_2 - 1}e^{-\beta y}\\ &= \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma\left( \alpha_1 \right)\Gamma\left( \alpha_2 \right)}x^{\alpha_1 - 1}y^{\alpha_2 - 1}e^{-\beta\left( x + y \right)}\end{aligned}\)

Let us consider,

\(\begin{aligned}V &= X + Y\\ U &= \frac{X}{X + Y} = \frac{X}{V} \Rightarrow X = UV\\ Y &= V - X = V - UV = \left( 1 - U \right)V\end{aligned}\)

Now, the Jacobian of this transformation is

\(\left| J \right| = \left| {\begin{array}{*{20}{c}} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v}\\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{array}} \right| = \left| {\begin{array}{*{20}{c}} v & u\\ -v & 1 - u \end{array}} \right| = v\left( 1 - u \right) + uv = v\)
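As a quick sanity check on this change of variables (my own addition, not part of the original solution, assuming sympy is available), the following sketch inverts the transformation and confirms that the Jacobian determinant equals v:

```python
import sympy as sp

x, y, u, v = sp.symbols('x y u v', positive=True)

# Invert u = x/(x+y), v = x+y; we expect x = u*v and y = (1-u)*v.
inverse = sp.solve([sp.Eq(u, x / (x + y)), sp.Eq(v, x + y)], [x, y], dict=True)
print(inverse)  # [{x: u*v, y: -u*v + v}], i.e. y = (1 - u)*v

# Jacobian of (x, y) = (u*v, (1-u)*v) with respect to (u, v).
X, Y = u * v, (1 - u) * v
J = sp.Matrix([[sp.diff(X, u), sp.diff(X, v)],
               [sp.diff(Y, u), sp.diff(Y, v)]])
print(sp.simplify(J.det()))  # v
```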

Therefore, the joint pdf of U and V is

\(\begin{aligned}f\left[ uv,\left( 1 - u \right)v \right]\left| J \right| &= \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma\left( \alpha_1 \right)\Gamma\left( \alpha_2 \right)}\left( uv \right)^{\alpha_1 - 1}\left( \left( 1 - u \right)v \right)^{\alpha_2 - 1}e^{-\beta\left( uv + \left( 1 - u \right)v \right)}\left( v \right)\\ &= \frac{\Gamma\left( \alpha_1 + \alpha_2 \right)}{\Gamma\left( \alpha_1 \right)\Gamma\left( \alpha_2 \right)}u^{\alpha_1 - 1}\left( 1 - u \right)^{\alpha_2 - 1} \cdot \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma\left( \alpha_1 + \alpha_2 \right)}v^{\alpha_1 + \alpha_2 - 1}e^{-\beta v}\end{aligned}\)

This joint probability density function is the product of the beta pdf with parameters \(\alpha_1\) and \(\alpha_2\) and the gamma pdf with parameters \(\alpha_1 + \alpha_2\) and \(\beta\).

Thus, U has the beta distribution with parameters \(\alpha_1\) and \(\alpha_2\).
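To illustrate part (a) numerically, here is a minimal Monte Carlo sketch of my own (the parameters \(\alpha_1 = 2\), \(\alpha_2 = 3\), \(\beta = 1.5\) are arbitrary examples, not from the problem): simulated values of U = X/(X + Y) are compared against the Beta(\(\alpha_1, \alpha_2\)) distribution.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a1, a2, beta = 2.0, 3.0, 1.5   # example parameters (not from the problem)
n = 100_000

# X ~ Gamma(a1, rate beta), Y ~ Gamma(a2, rate beta); numpy uses scale = 1/rate.
x = rng.gamma(shape=a1, scale=1.0 / beta, size=n)
y = rng.gamma(shape=a2, scale=1.0 / beta, size=n)
u = x / (x + y)

# Kolmogorov-Smirnov test against Beta(a1, a2): a large p-value is expected.
print(stats.kstest(u, "beta", args=(a1, a2)))
```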

03

Proof of part (b)

From part (a), we get

\(f\left[ uv,\left( 1 - u \right)v \right]\left| J \right| = \frac{\Gamma\left( \alpha_1 + \alpha_2 \right)}{\Gamma\left( \alpha_1 \right)\Gamma\left( \alpha_2 \right)}u^{\alpha_1 - 1}\left( 1 - u \right)^{\alpha_2 - 1} \cdot \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma\left( \alpha_1 + \alpha_2 \right)}v^{\alpha_1 + \alpha_2 - 1}e^{-\beta v}\)

Where,

\(\begin{array}{l}f\left( u \right) = \frac{\Gamma\left( \alpha_1 + \alpha_2 \right)}{\Gamma\left( \alpha_1 \right)\Gamma\left( \alpha_2 \right)}u^{\alpha_1 - 1}\left( 1 - u \right)^{\alpha_2 - 1}\quad{\rm{and}}\\ f\left( v \right) = \frac{\beta^{\alpha_1 + \alpha_2}}{\Gamma\left( \alpha_1 + \alpha_2 \right)}v^{\alpha_1 + \alpha_2 - 1}e^{-\beta v}\end{array}\)

The joint probability density function therefore factors as the product of the pdf of U and the pdf of V. Thus, U and V are independent.
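For part (b), a similar sketch (again my own illustration, with the same example parameters as above) checks independence empirically: the sample correlation between U and V should be near zero, and the distribution of U should look the same whether V is below or above its median.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
a1, a2, beta = 2.0, 3.0, 1.5   # same example parameters as before
n = 100_000

x = rng.gamma(shape=a1, scale=1.0 / beta, size=n)
y = rng.gamma(shape=a2, scale=1.0 / beta, size=n)
u, v = x / (x + y), x + y

# Sample correlation between U and V should be close to zero.
print(np.corrcoef(u, v)[0, 1])

# U restricted to small V vs. large V should have the same distribution.
median_v = np.median(v)
print(stats.ks_2samp(u[v < median_v], u[v >= median_v]))
```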
