Q11E

Two different professors have just submitted final exams for duplication. Let \({\rm{X}}\) denote the number of typographical errors on the first professor’s exam and \({\rm{Y}}\) denote the number of such errors on the second exam. Suppose \({\rm{X}}\) has a Poisson distribution with parameter \({{\rm{\mu }}_{\rm{1}}}\), \({\rm{Y}}\) has a Poisson distribution with parameter \({{\rm{\mu }}_{\rm{2}}}\), and \({\rm{X}}\) and \({\rm{Y}}\) are independent.

a. What is the joint pmf of \({\rm{X}}\) and \({\rm{Y}}\)?

b. What is the probability that at most one error is made on both exams combined?

c. Obtain a general expression for the probability that the total number of errors in the two exams is \({\rm{m}}\) (where \({\rm{m}}\) is a nonnegative integer). (Hint: \({\rm{A}} = \{ ({\rm{x}},{\rm{y}}) : {\rm{x}} + {\rm{y}} = {\rm{m}}\} = \{ ({\rm{m}},0),({\rm{m}} - 1,1), \ldots ,(1,{\rm{m}} - 1),(0,{\rm{m}})\}\). Now sum the joint pmf over \(({\rm{x}},{\rm{y}}) \in {\rm{A}}\) and use the binomial theorem, which says that

\(\sum\limits_{k = 0}^{m} \binom{m}{k} a^{k} b^{m - k} = (a + b)^{m}\).)

Short Answer


a. \(p(x,y) = \begin{cases} e^{-\mu_1 - \mu_2}\,\dfrac{\mu_1^{x}\,\mu_2^{y}}{x!\,y!}, & x,y \in \mathbb{N}_0, \\ 0, & \text{otherwise.} \end{cases}\)

b. The probability is \(e^{-\mu_1 - \mu_2}\left(1 + \mu_1 + \mu_2\right)\).

c. The probability is \(e^{-\mu_1 - \mu_2} \cdot \dfrac{(\mu_1 + \mu_2)^{m}}{m!}\).

Step by step solution

01

Definition of Probability

Probability is a measure of how likely an event is to occur. Most outcomes cannot be forecast with \({\rm{100\% }}\) certainty; probability only tells us how likely an event is to happen. It ranges from \({\rm{0}}\) to \({\rm{1}}\), where \({\rm{0}}\) indicates an impossible event and \({\rm{1}}\) a certain event.

02

Find the joint pmf of X and Y

(a):

Two random variables \({\rm{X}}\) and \({\rm{Y}}\) are independent if and only if

1. \({\rm{p(x,y) = }}{{\rm{p}}_{\rm{X}}}{\rm{(x) \times }}{{\rm{p}}_{\rm{Y}}}{\rm{(y)}}\)

for every \({\rm{(x,y)}}\), when \({\rm{X}}\) and \({\rm{Y}}\) are discrete rv's, or

2. \({\rm{f(x,y) = }}{{\rm{f}}_{\rm{X}}}{\rm{(x) \times }}{{\rm{f}}_{\rm{Y}}}{\rm{(y)}}\)

for every \({\rm{(x,y)}}\), when \({\rm{X}}\) and \({\rm{Y}}\) are continuous rv's;

otherwise they are dependent.

Since the random variables \({\rm{X}}\) and \({\rm{Y}}\) are given to be independent, their joint pmf is the product of the marginal probability mass functions.

Reminder: A random variable \({\rm{X}}\) with pmf

\({\rm{p(x;\mu ) = }}{{\rm{e}}^{{\rm{ - \mu }}}}\frac{{{{\rm{\mu }}^{\rm{x}}}}}{{{\rm{x!}}}}\)

for \({\rm{x = 0,1,2, \ldots }}\), is said to have a Poisson distribution with parameter \({\rm{\mu > 0}}\).

Therefore, the joint pmf is

\(\begin{aligned} p(x,y) &= p_X(x) \times p_Y(y)\\ &= e^{-\mu_1}\frac{\mu_1^{x}}{x!} \times e^{-\mu_2}\frac{\mu_2^{y}}{y!}\\ &= e^{-\mu_1 - \mu_2}\frac{\mu_1^{x}\,\mu_2^{y}}{x!\,y!}, \qquad x,y \in \mathbb{N}_0, \end{aligned}\)

and \(p(x,y) = 0\) otherwise.

We can write it as

\(p(x,y) = \begin{cases} e^{-\mu_1 - \mu_2}\,\dfrac{\mu_1^{x}\,\mu_2^{y}}{x!\,y!}, & x,y \in \mathbb{N}_0, \\ 0, & \text{otherwise.} \end{cases}\)
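To make the result concrete, here is a minimal Python sketch; the values \(\mu_1 = 1.5\) and \(\mu_2 = 2.0\) are arbitrary illustrative choices, not given in the exercise. It checks that the joint pmf factors into the two Poisson marginals and sums to (approximately) \(1\) over a large truncated grid.

```python
from math import exp, factorial

mu1, mu2 = 1.5, 2.0  # illustrative parameters, not from the exercise


def poisson_pmf(k: int, mu: float) -> float:
    """Poisson pmf: p(k; mu) = e^{-mu} * mu^k / k!."""
    return exp(-mu) * mu**k / factorial(k)


def joint_pmf(x: int, y: int) -> float:
    """Joint pmf from part (a): e^{-mu1-mu2} * mu1^x * mu2^y / (x! y!)."""
    return exp(-mu1 - mu2) * mu1**x * mu2**y / (factorial(x) * factorial(y))


# By independence, the joint pmf equals the product of the marginals.
assert abs(joint_pmf(3, 2) - poisson_pmf(3, mu1) * poisson_pmf(2, mu2)) < 1e-12

# A valid pmf sums to 1; truncating at 60 captures essentially all the mass.
total = sum(joint_pmf(x, y) for x in range(60) for y in range(60))
print(f"sum of joint pmf over grid: {total:.12f}")  # ~1.000000000000
```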

03

Find the probability that at most one error is made on both exams combined

(b):

We look at the sum of the two random variables to see whether at most one error is made on both exams combined. The following applies:

\(\begin{array}{l}{\rm{P(X + Y}} \le 1){\rm{ = p(0,0) + p(1,0) + p(0,1)}}\\{\rm{ = }}{{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}\frac{{{\rm{\mu }}_{\rm{1}}^{\rm{0}}{\rm{ \times \mu }}_{\rm{2}}^{\rm{0}}}}{{{\rm{0!0!}}}}{\rm{ + }}{{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}\frac{{{\rm{\mu }}_{\rm{1}}^{\rm{1}}{\rm{ \times \mu }}_{\rm{2}}^{\rm{0}}}}{{{\rm{1!0!}}}}{\rm{ + }}{{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}\frac{{{\rm{\mu }}_{\rm{1}}^{\rm{0}}{\rm{ \times \mu }}_{\rm{2}}^{\rm{1}}}}{{{\rm{0!1!}}}}\\{\rm{ = }}{{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}\left( {{\rm{1 + }}{{\rm{\mu }}_{\rm{1}}}{\rm{ + }}{{\rm{\mu }}_{\rm{2}}}} \right).\end{array}\)

Therefore, the probability is \({{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}\left( {{\rm{1 + }}{{\rm{\mu }}_{\rm{1}}}{\rm{ + }}{{\rm{\mu }}_{\rm{2}}}} \right)\).
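A quick numerical check of this closed form is sketched below, again with the illustrative values \(\mu_1 = 1.5\) and \(\mu_2 = 2.0\) (assumptions as in the sketch for part (a)):

```python
from math import exp, factorial

mu1, mu2 = 1.5, 2.0  # illustrative parameters, not from the exercise


def joint_pmf(x: int, y: int) -> float:
    """Joint pmf from part (a)."""
    return exp(-mu1 - mu2) * mu1**x * mu2**y / (factorial(x) * factorial(y))


# P(X + Y <= 1) enumerated from the three points (0,0), (1,0), (0,1) ...
enumerated = joint_pmf(0, 0) + joint_pmf(1, 0) + joint_pmf(0, 1)
# ... must match the closed form e^{-mu1-mu2} * (1 + mu1 + mu2).
closed_form = exp(-mu1 - mu2) * (1 + mu1 + mu2)
assert abs(enumerated - closed_form) < 1e-15
print(f"P(X + Y <= 1) = {closed_form:.6f}")
```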

04

Find the probability that the total number of errors is m

(c):

The hint outlines the whole approach. We again look at the total, so if \({\rm{m}}\) is a nonnegative integer, we have

\(\begin{array}{l}{\rm{P(X + Y = m)}}\mathop {\rm{ = }}\limits^{{\rm{(1)}}} \sum\limits_{{\rm{k = 0}}}^{\rm{m}} {\rm{P}} {\rm{(X = k,Y = m - k)}}\\{\rm{ = }}\sum\limits_{{\rm{k = 0}}}^{\rm{m}} {{{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}} \frac{{{\rm{\mu }}_{\rm{1}}^{\rm{k}}{\rm{ \times \mu }}_{\rm{2}}^{{\rm{m - k}}}}}{{{\rm{k!(m - k)!}}}}\\\mathop {\rm{ = }}\limits^{{\rm{(2)}}} {{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}{\rm{ \times }}\frac{{{\rm{m!}}}}{{{\rm{m!}}}}\sum\limits_{{\rm{k = 0}}}^{\rm{m}} {{\rm{\mu }}_{\rm{1}}^{\rm{k}}} {\rm{ \times \mu }}_{\rm{2}}^{{\rm{m - k}}}{\rm{ \times }}\frac{{\rm{1}}}{{{\rm{k!(m - k)!}}}}\\{\rm{ = }}{{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}{\rm{ \times }}\frac{{\rm{1}}}{{{\rm{m!}}}}\sum\limits_{{\rm{k = 0}}}^{\rm{m}} {{\rm{\mu }}_{\rm{1}}^{\rm{k}}} {\rm{ \times \mu }}_{\rm{2}}^{{\rm{m - k}}}{\rm{ \times }}\frac{{{\rm{m!}}}}{{{\rm{k!(m - k)!}}}}\\{\rm{ = }}{{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}{\rm{ \times }}\frac{{\rm{1}}}{{{\rm{m!}}}}\sum\limits_{{\rm{k = 0}}}^{\rm{m}} {{\rm{\mu }}_{\rm{1}}^{\rm{k}}} {\rm{ \times \mu }}_{\rm{2}}^{{\rm{m - k}}}{\rm{ \times }}\left( {\begin{array}{*{20}{c}}{\rm{m}}\\{\rm{k}}\end{array}} \right)\\\mathop {\rm{ = }}\limits^{{\rm{(3)}}} {{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}{\rm{ \times }}\frac{{\rm{1}}}{{{\rm{m!}}}}{\rm{ \times }}{\left( {{{\rm{\mu }}_{\rm{1}}}{\rm{ + }}{{\rm{\mu }}_{\rm{2}}}} \right)^{\rm{m}}}{\rm{,}}\end{array}\)

(1): look at the given set \({\rm{A}}\). We can write the event \({\rm{X + Y = m}}\) as a union of \({\rm{m + 1}}\) disjoint events, which gives the sum.

(2): we multiply by

\({\rm{1 = }}\frac{{{\rm{m!}}}}{{{\rm{m!}}}}\)

in order to obtain

\(\left( {\begin{array}{*{20}{c}}{\rm{m}}\\{\rm{k}}\end{array}} \right)\)

(3): the binomial theorem (see the hint in the exercise).

The resulting expression is the pmf of a Poisson random variable with parameter \({{\rm{\mu }}_{\rm{1}}}{\rm{ + }}{{\rm{\mu }}_{\rm{2}}}\), as the definition of the Poisson distribution shows. Hence the total number of errors in the two exams itself follows a Poisson distribution with parameter \({{\rm{\mu }}_{\rm{1}}}{\rm{ + }}{{\rm{\mu }}_{\rm{2}}}\).

Therefore, the probability is \({{\rm{e}}^{{\rm{ - }}{{\rm{\mu }}_{\rm{1}}}{\rm{ - }}{{\rm{\mu }}_{\rm{2}}}}}{\rm{ \times }}\frac{{\rm{1}}}{{{\rm{m!}}}}{\rm{ \times }}{\left( {{{\rm{\mu }}_{\rm{1}}}{\rm{ + }}{{\rm{\mu }}_{\rm{2}}}} \right)^{\rm{m}}}\).
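The same identity can be checked numerically: summing the joint pmf along the diagonal \(x + y = m\) should reproduce the Poisson(\(\mu_1 + \mu_2\)) pmf for every \(m\). A minimal sketch, with the same illustrative parameters as before:

```python
from math import exp, factorial

mu1, mu2 = 1.5, 2.0  # illustrative parameters, not from the exercise


def joint_pmf(x: int, y: int) -> float:
    """Joint pmf from part (a)."""
    return exp(-mu1 - mu2) * mu1**x * mu2**y / (factorial(x) * factorial(y))


def poisson_pmf(k: int, mu: float) -> float:
    """Poisson pmf: p(k; mu) = e^{-mu} * mu^k / k!."""
    return exp(-mu) * mu**k / factorial(k)


for m in range(20):
    # Sum the joint pmf over the m + 1 points with x + y = m ...
    diagonal = sum(joint_pmf(k, m - k) for k in range(m + 1))
    # ... and compare with the Poisson(mu1 + mu2) pmf from part (c).
    assert abs(diagonal - poisson_pmf(m, mu1 + mu2)) < 1e-12
print("P(X + Y = m) matches the Poisson(mu1 + mu2) pmf for m = 0..19")
```

If the assertions pass, the diagonal sums agree with the closed form, consistent with the fact derived above: the sum of independent Poisson variables is again Poisson.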

