Q8SE

Let \(f\left( {{x_1},{x_2}} \right)\) denote the p.d.f. of the bivariate normal distribution specified by Eq. (5.10.2). Show that the maximum value of \(f\left( {{x_1},{x_2}} \right)\) is attained at the point at which \({x_1} = {\mu _1}\) and \({x_2} = {\mu _2}\).

Short Answer


The maximum value of \(f\left( {{x_1},{x_2}} \right)\) is attained at the point at which \({x_1} = {\mu _1}\) and \({x_2} = {\mu _2}\).

Step by step solution

Step 1: Given information

The given joint p.d.f. of \(X_1\) and \(X_2\) is

\(f\left( {{x_1},{x_2}} \right) = \frac{1}{2\pi \left( 1 - p^2 \right)^{1/2}\sigma_1 \sigma_2}\exp \left\{ \frac{-1}{2\left( 1 - p^2 \right)}\left[ \left( \frac{x_1 - \mu_1}{\sigma_1} \right)^2 + \left( \frac{x_2 - \mu_2}{\sigma_2} \right)^2 - 2p\left( \frac{x_1 - \mu_1}{\sigma_1} \right)\left( \frac{x_2 - \mu_2}{\sigma_2} \right) \right] \right\}\)

where \(\sigma_1, \sigma_2 > 0\) and \(p\) is the correlation coefficient, with \(\left| p \right| < 1\).
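The density can be sketched directly from this formula. The following minimal Python sketch (the function name `bvn_pdf` and the parameter values are illustrative assumptions, not part of the exercise) also checks the observation the proof rests on: at \((\mu_1,\mu_2)\) the quadratic form in the exponent vanishes, so the density equals the constant factor in front of the exponential.

```python
import math

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, p):
    """Bivariate normal p.d.f. of Eq. (5.10.2); p is the correlation, |p| < 1."""
    z1 = (x1 - mu1) / s1
    z2 = (x2 - mu2) / s2
    q = z1 ** 2 + z2 ** 2 - 2 * p * z1 * z2      # the bracketed quadratic form
    coef = 1.0 / (2 * math.pi * math.sqrt(1 - p ** 2) * s1 * s2)
    return coef * math.exp(-q / (2 * (1 - p ** 2)))

# Illustrative (arbitrary) parameters:
mu1, mu2, s1, s2, p = 1.0, -2.0, 0.5, 2.0, 0.6

# At (mu1, mu2) the quadratic form vanishes, so f equals the constant factor.
peak = bvn_pdf(mu1, mu2, mu1, mu2, s1, s2, p)
const = 1.0 / (2 * math.pi * math.sqrt(1 - p ** 2) * s1 * s2)
```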

Step 2: Differentiate the log-p.d.f. with respect to \(x_1\)

Since the logarithm is strictly increasing, maximizing \(f\left( {{x_1},{x_2}} \right)\) is equivalent to maximizing \(\ln f\left( {{x_1},{x_2}} \right)\). Differentiate \(\ln f\) with respect to \(x_1\):

\(\begin{aligned}\frac{\partial }{\partial x_1}\ln f\left( x_1,x_2 \right) &= \frac{\partial }{\partial x_1}\left( \ln k - \frac{1}{2\left( 1 - p^2 \right)}\left[ \left( \frac{x_1 - \mu_1}{\sigma_1} \right)^2 + \left( \frac{x_2 - \mu_2}{\sigma_2} \right)^2 - 2p\left( \frac{x_1 - \mu_1}{\sigma_1} \right)\left( \frac{x_2 - \mu_2}{\sigma_2} \right) \right] \right)\\ &= -\frac{1}{2\left( 1 - p^2 \right)}\left[ \frac{2}{\sigma_1}\left( \frac{x_1 - \mu_1}{\sigma_1} \right) - \frac{2p}{\sigma_1}\left( \frac{x_2 - \mu_2}{\sigma_2} \right) \right]\\ &= -\frac{1}{\left( 1 - p^2 \right)\sigma_1}\left[ \frac{x_1 - \mu_1}{\sigma_1} - p\left( \frac{x_2 - \mu_2}{\sigma_2} \right) \right]\end{aligned}\)

Here \(k\) denotes the constant factor in front of the exponential; note that the cross term also depends on \(x_1\), so it contributes to the derivative.

Setting this derivative equal to 0,

\(\begin{aligned}-\frac{1}{\left( 1 - p^2 \right)\sigma_1}\left[ \frac{x_1 - \mu_1}{\sigma_1} - p\left( \frac{x_2 - \mu_2}{\sigma_2} \right) \right] &= 0\\ \frac{x_1 - \mu_1}{\sigma_1} &= p\left( \frac{x_2 - \mu_2}{\sigma_2} \right)\end{aligned}\)

This gives the first of two first-order conditions, \(\frac{x_1 - \mu_1}{\sigma_1} = p\left( \frac{x_2 - \mu_2}{\sigma_2} \right)\); because of the correlation term it does not by itself determine \(x_1\).
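The partial derivative, cross term included, can be checked numerically against a central finite difference of \(\ln f\). This is a sketch with arbitrary assumed parameter values; since \(\ln f\) is quadratic in \(x_1\), the central difference is essentially exact.

```python
import math

def log_bvn(x1, x2, mu1, mu2, s1, s2, p):
    """Log of the bivariate normal p.d.f., constant written out explicitly."""
    z1 = (x1 - mu1) / s1
    z2 = (x2 - mu2) / s2
    q = z1 ** 2 + z2 ** 2 - 2 * p * z1 * z2
    return (-math.log(2 * math.pi * math.sqrt(1 - p ** 2) * s1 * s2)
            - q / (2 * (1 - p ** 2)))

# Arbitrary parameter values and an off-center evaluation point:
mu1, mu2, s1, s2, p = 1.0, -2.0, 0.5, 2.0, 0.6
x1, x2 = 1.7, -1.1

# Analytic partial derivative in x1, including the correlation cross term:
z1 = (x1 - mu1) / s1
z2 = (x2 - mu2) / s2
analytic = -(z1 - p * z2) / ((1 - p ** 2) * s1)

# Central finite difference of ln f in x1:
h = 1e-6
numeric = (log_bvn(x1 + h, x2, mu1, mu2, s1, s2, p)
           - log_bvn(x1 - h, x2, mu1, mu2, s1, s2, p)) / (2 * h)
```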

Step 3: Differentiate the log-p.d.f. with respect to \(x_2\)

Similarly, differentiate \(\ln f\left( {{x_1},{x_2}} \right)\) with respect to \(x_2\):

\(\begin{aligned}\frac{\partial }{\partial x_2}\ln f\left( x_1,x_2 \right) &= \frac{\partial }{\partial x_2}\left( \ln k - \frac{1}{2\left( 1 - p^2 \right)}\left[ \left( \frac{x_1 - \mu_1}{\sigma_1} \right)^2 + \left( \frac{x_2 - \mu_2}{\sigma_2} \right)^2 - 2p\left( \frac{x_1 - \mu_1}{\sigma_1} \right)\left( \frac{x_2 - \mu_2}{\sigma_2} \right) \right] \right)\\ &= -\frac{1}{2\left( 1 - p^2 \right)}\left[ \frac{2}{\sigma_2}\left( \frac{x_2 - \mu_2}{\sigma_2} \right) - \frac{2p}{\sigma_2}\left( \frac{x_1 - \mu_1}{\sigma_1} \right) \right]\\ &= -\frac{1}{\left( 1 - p^2 \right)\sigma_2}\left[ \frac{x_2 - \mu_2}{\sigma_2} - p\left( \frac{x_1 - \mu_1}{\sigma_1} \right) \right]\end{aligned}\)

Setting this derivative equal to 0,

\(\begin{aligned}-\frac{1}{\left( 1 - p^2 \right)\sigma_2}\left[ \frac{x_2 - \mu_2}{\sigma_2} - p\left( \frac{x_1 - \mu_1}{\sigma_1} \right) \right] &= 0\\ \frac{x_2 - \mu_2}{\sigma_2} &= p\left( \frac{x_1 - \mu_1}{\sigma_1} \right)\end{aligned}\)

Substituting the condition from Step 2 into this one gives \(\frac{x_2 - \mu_2}{\sigma_2} = p^2\left( \frac{x_2 - \mu_2}{\sigma_2} \right)\), that is, \(\left( 1 - p^2 \right)\frac{x_2 - \mu_2}{\sigma_2} = 0\). Since \(\left| p \right| < 1\), this forces \(x_2 = \mu_2\), and the condition from Step 2 then gives \(x_1 = \mu_1\). This critical point is indeed the maximum: the exponent of \(f\) is \(-Q/\left( 2\left( 1 - p^2 \right) \right)\), where the quadratic form \(Q\) is positive definite, so \(Q \ge 0\) with equality only at \(x_1 = \mu_1,\; x_2 = \mu_2\). Hence \(f\) attains its maximum value exactly at that point.
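As a final sanity check, the density at the mean should dominate the density at any other point. The sketch below (arbitrary assumed parameters, `bvn_pdf` is an illustrative helper) samples random points around \((\mu_1,\mu_2)\) and confirms none exceeds the value at the mean.

```python
import math
import random

def bvn_pdf(x1, x2, mu1, mu2, s1, s2, p):
    """Bivariate normal p.d.f. of Eq. (5.10.2)."""
    z1 = (x1 - mu1) / s1
    z2 = (x2 - mu2) / s2
    q = z1 ** 2 + z2 ** 2 - 2 * p * z1 * z2
    return (math.exp(-q / (2 * (1 - p ** 2)))
            / (2 * math.pi * math.sqrt(1 - p ** 2) * s1 * s2))

random.seed(0)
mu1, mu2, s1, s2, p = 1.0, -2.0, 0.5, 2.0, 0.6
peak = bvn_pdf(mu1, mu2, mu1, mu2, s1, s2, p)

# No randomly sampled point around the mean should beat the value at the mean.
dominated = all(
    bvn_pdf(mu1 + random.uniform(-3, 3), mu2 + random.uniform(-3, 3),
            mu1, mu2, s1, s2, p) <= peak
    for _ in range(1000)
)
```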


Most popular questions from this chapter

If the m.g.f. of a random variable X is \(\psi \left( t \right) = {e^{{t^2}}}\) for \( - \infty < t < \infty \), what is the distribution of X?

It is said that a random variable X has an increasing failure rate if the failure rate h(x) defined in Exercise 18 is an increasing function of x for x > 0, and it is said that X has a decreasing failure rate if h(x) is a decreasing function of x for x > 0. Suppose that X has the Weibull distribution with parameters a and b, as defined in Exercise 19. Show that X has an increasing failure rate if b > 1, and X has a decreasing failure rate if b < 1.

Suppose that a fair coin is tossed until at least one head and at least one tail have been obtained. Let X denote the number of tosses that are required. Find the p.f. of X.

Suppose that events occur in accordance with a Poisson process at the rate of five events per hour.

a. Determine the distribution of the waiting time \(T_1\) until the first event occurs.

b. Determine the distribution of the total waiting time \(T_k\) until k events have occurred.

c. Determine the probability that none of the first k events will occur within 20 minutes of one another.

Suppose that a die is loaded so that each of the numbers 1, 2, 3, 4, 5, and 6 has a different probability of appearing when the die is rolled. For \(i = 1, \ldots ,6\), let \(p_i\) denote the probability that the number i will be obtained, and suppose that \(p_1 = 0.11,\; p_2 = 0.30,\; p_3 = 0.22,\; p_4 = 0.05,\; p_5 = 0.25,\) and \(p_6 = 0.07\). Suppose also that the die is to be rolled 40 times. Let \(X_1\) denote the number of rolls for which an even number appears, and let \(X_2\) denote the number of rolls for which either the number 1 or the number 3 appears. Find the value of \(\Pr \left( X_1 = 20 \text{ and } X_2 = 15 \right)\).
