Q63E


Refer to Exercise.

a. Calculate the covariance between \({X_1} = \)the number of customers in the express checkout and\({X_2} = \)the number of customers in the superexpress checkout.

b. Calculate\(V\left( {{X_1} + {X_2}} \right)\). How does this compare to\(V\left( {{X_1}} \right) + V\left( {{X_2}} \right)\)?

Short Answer


a. \({\mathop{\rm Cov}\nolimits} \left( {{X_1},{X_2}} \right) = 0.695\)

b. \(V\left( {{X_1} + {X_2}} \right) = 4.0675\)

Step by step solution

01

Definition of Covariance

In probability theory and statistics, covariance is a measure of the joint variability of two random variables. The covariance is positive when larger values of one variable tend to occur with larger values of the other (and smaller with smaller), i.e., the variables tend to move together; it is negative when larger values of one variable tend to occur with smaller values of the other.

02

Computing the covariance

(a):

Proposition: The following holds

\({\mathop{\rm Cov}\nolimits} (X,Y) = E(XY) - E(X) \cdot E(Y).\)

The joint pmf of the random variables \({X_1}\) and \({X_2}\) is given in the referenced exercise. Summing the joint pmf over \({x_2}\) yields the marginal pmf of \({X_1}\):

\({p_{{X_1}}}(0) = 0.19,\;{p_{{X_1}}}(1) = 0.30,\;{p_{{X_1}}}(2) = 0.25,\;{p_{{X_1}}}(3) = 0.14,\;{p_{{X_1}}}(4) = 0.12.\)

The expected value (mean value) of a discrete random variable X with set of possible values S and pmf p(x) is

\(E(X) = {\mu _X} = \sum\limits_{x \in S} x \cdot p(x).\)

Therefore, the expected value is

\(\begin{aligned}E\left( {{X_1}} \right) &= 0 \cdot {p_{{X_1}}}(0) + 1 \cdot {p_{{X_1}}}(1) + 2 \cdot {p_{{X_1}}}(2) + 3 \cdot {p_{{X_1}}}(3) + 4 \cdot {p_{{X_1}}}(4)\\ &= 0 \cdot 0.19 + 1 \cdot 0.3 + 2 \cdot 0.25 + 3 \cdot 0.14 + 4 \cdot 0.12\\ &= 0 + 0.3 + 0.5 + 0.42 + 0.48\\ &= 1.7\end{aligned}\)
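This expectation can be checked numerically. The pmf values below are exactly those used in the calculation above; the sketch simply evaluates the defining sum:

```python
# Marginal pmf of X1 (values from the step above)
pmf_x1 = {0: 0.19, 1: 0.30, 2: 0.25, 3: 0.14, 4: 0.12}

# E(X1) = sum over the support of x * p(x)
e_x1 = sum(x * p for x, p in pmf_x1.items())
print(round(e_x1, 4))  # 1.7
```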

03

Computing the covariance (continued)

Similarly, summing the joint pmf over \({x_1}\) yields the marginal pmf of \({X_2}\):

\({p_{{X_2}}}(0) = 0.19,\;{p_{{X_2}}}(1) = 0.30,\;{p_{{X_2}}}(2) = 0.28,\;{p_{{X_2}}}(3) = 0.23.\)

Its expected value is

\(\begin{array}{c}E\left( {{X_2}} \right) = 0 \cdot 0.19 + 1 \cdot 0.3 + 2 \cdot 0.28 + 3 \cdot 0.23\\ = 1.55\end{array}\)

The expected value of the product \({X_1}{X_2}\) is

\(\begin{aligned}E\left( {{X_1}{X_2}} \right) &= \sum\limits_{{x_1}} {\sum\limits_{{x_2}} {{x_1}} } {x_2}p\left( {{x_1},{x_2}} \right)\\ &= 0 \cdot 0 \cdot 0.08 + 0 \cdot 1 \cdot 0.07 + \ldots + 4 \cdot 2 \cdot 0.05 + 4 \cdot 3 \cdot 0.06\\ &= 3.33\end{aligned}\)

The covariance can be computed as

\(\begin{aligned}{\mathop{\rm Cov}\nolimits} \left( {{X_1},{X_2}} \right) &= E\left( {{X_1}{X_2}} \right) - E\left( {{X_1}} \right)E\left( {{X_2}} \right)\\ &= 3.33 - 2.635\\ &= 0.695.\end{aligned}\)
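The shortcut formula can be evaluated directly; the value \(E\left( {{X_1}{X_2}} \right) = 3.33\) is taken from the double sum above:

```python
e_x1 = 1.7     # from the marginal pmf of X1
e_x2 = 1.55    # from the marginal pmf of X2
e_x1x2 = 3.33  # double sum over the joint pmf (from the step above)

# Cov(X1, X2) = E(X1*X2) - E(X1)*E(X2)
cov = e_x1x2 - e_x1 * e_x2
print(round(cov, 4))  # 0.695
```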

04

Checking independence of \({X_1}\) and \({X_2}\)

(b):

Two random variables X and Y are independent if and only if

1. \(p(x,y) = {p_X}(x) \cdot {p_Y}(y)\) for every (x, y), when X and Y are discrete rv's, or

2. \(f(x,y) = {f_X}(x) \cdot {f_Y}(y)\) for every (x, y), when X and Y are continuous rv's.

Otherwise, they are dependent.

From the marginal pmfs,

\(\begin{array}{l}P\left( {{X_1} = 4} \right) = {p_{{X_1}}}(4) = 0.12,\\P\left( {{X_2} = 0} \right) = {p_{{X_2}}}(0) = 0.19,\end{array}\)

while the joint pmf gives

\(P\left( {{X_1} = 4,{X_2} = 0} \right) = p(4,0) = 0.\)

05

Checking independence (continued)

Notice that

\(p(4,0) = 0 \ne 0.0228 = 0.12 \cdot 0.19 = {p_{{X_1}}}(4) \cdot {p_{{X_2}}}(0),\)

from which we conclude that the random variables \({X_1}\) and \({X_2}\) are dependent.
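The independence check above amounts to comparing a single joint probability with the product of the corresponding marginals; one counterexample is enough to rule out independence:

```python
p_x1_4 = 0.12  # P(X1 = 4), from the marginal pmf
p_x2_0 = 0.19  # P(X2 = 0), from the marginal pmf
p_joint = 0.0  # P(X1 = 4, X2 = 0), from the joint pmf

# If X1 and X2 were independent, the joint probability would
# equal the product of the marginals at every point.
independent_here = (p_joint == p_x1_4 * p_x2_0)
print(independent_here)  # False, so X1 and X2 are dependent
```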

Because the random variables are dependent, the equality

\(V\left( {{X_1} + {X_2}} \right) = V\left( {{X_1}} \right) + V\left( {{X_2}} \right)\)

does not hold. Instead, use the following equality to compute the variance:

\(V\left( {{X_1} + {X_2}} \right) = V\left( {{X_1}} \right) + V\left( {{X_2}} \right) + 2{\mathop{\rm Cov}\nolimits} \left( {{X_1},{X_2}} \right).\)

06

Computing the variance

First compute variances as follows

\(\begin{aligned}V\left( {{X_1}} \right) &= E\left( {X_1^2} \right) - {\left( {E\left( {{X_1}} \right)} \right)^2}\\ &= {0^2} \cdot 0.19 + {1^2} \cdot 0.3 + {2^2} \cdot 0.25 + {3^2} \cdot 0.14 + {4^2} \cdot 0.12 - {1.7^2}\\ &= 1.59\end{aligned}\)

and for random variable \({X_2}\)

\(\begin{aligned}V\left( {{X_2}} \right) &= E\left( {X_2^2} \right) - {\left( {E\left( {{X_2}} \right)} \right)^2}\\ &= {0^2} \cdot 0.19 + {1^2} \cdot 0.3 + {2^2} \cdot 0.28 + {3^2} \cdot 0.23 - {1.55^2}\\ &= 1.0875\end{aligned}\)

Finally, the following holds

\(\begin{aligned}V\left( {{X_1} + {X_2}} \right) &= V\left( {{X_1}} \right) + V\left( {{X_2}} \right) + 2{\mathop{\rm Cov}\nolimits} \left( {{X_1},{X_2}} \right)\\ &= 1.59 + 1.0875 + 2 \cdot 0.695\\ &= 4.0675\end{aligned}\)
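The whole part (b) computation can be sketched from the marginal pmfs and the covariance found in part (a):

```python
pmf_x1 = {0: 0.19, 1: 0.30, 2: 0.25, 3: 0.14, 4: 0.12}
pmf_x2 = {0: 0.19, 1: 0.30, 2: 0.28, 3: 0.23}
cov = 0.695  # Cov(X1, X2), computed in part (a)

def variance(pmf):
    """Shortcut formula: V(X) = E(X^2) - (E(X))^2."""
    mean = sum(x * p for x, p in pmf.items())
    return sum(x**2 * p for x, p in pmf.items()) - mean**2

# For dependent rv's: V(X1 + X2) = V(X1) + V(X2) + 2*Cov(X1, X2)
v_sum = variance(pmf_x1) + variance(pmf_x2) + 2 * cov
print(round(v_sum, 4))  # 4.0675
```

Note that the result exceeds \(V\left( {{X_1}} \right) + V\left( {{X_2}} \right) = 2.6775\) by twice the (positive) covariance.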

