Q64E Show that E(X) = np when X is a binomial random variable | 91影视


Show that E(X) = np when X is a binomial random variable. (Hint: First express E(X) as a sum with lower limit \(x = 1\). Then factor out \(np\), let \(y = x - 1\) so that the sum runs from \(y = 0\) to \(y = n - 1\), and show that the sum equals \(1\).)

Short Answer

Expert verified

It is proven that \(E(X) = np\).

Step by step solution

01

Define Discrete random variables

A discrete random variable is one whose set of possible values is finite or countably infinite, so the values can be listed in a sequence.

02

Show that E(X) = np

There are several ways to prove E(X) = np; possibly the simplest uses an indicator random variable for each trial together with linearity of expectation. We will not use that approach here, but it is worth looking up and examining.
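For reference, the indicator argument mentioned above can be sketched in a line: write X as a sum of Bernoulli indicators \(I_1, \ldots, I_n\), one per trial, and apply linearity of expectation.

```latex
X = \sum_{i=1}^{n} I_i,
\qquad
I_i =
\begin{cases}
1 & \text{if trial } i \text{ is a success},\\
0 & \text{otherwise},
\end{cases}
\qquad\Longrightarrow\qquad
E(X) = \sum_{i=1}^{n} E(I_i) = \sum_{i=1}^{n} p = np.
```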

We may deduce the following from the exercise's hint:

\(\begin{array}{rl}E(X) &\mathop = \limits^{(1)} \sum\limits_{x = 0}^{n} x \cdot \binom{n}{x} p^{x} (1 - p)^{n - x}\\ &\mathop = \limits^{(2)} \sum\limits_{x = 1}^{n} x \cdot \binom{n}{x} p^{x} (1 - p)^{n - x}\\ &\mathop = \limits^{(3)} \sum\limits_{x = 1}^{n} x \cdot \frac{n!}{x!(n - x)!} p^{x} (1 - p)^{n - x}\\ & = \sum\limits_{x = 1}^{n} \frac{n!}{(x - 1)!(n - x)!} p^{x} (1 - p)^{n - x}\\ &\mathop = \limits^{(4)} np \sum\limits_{x = 1}^{n} \frac{(n - 1)!}{(x - 1)!(n - x)!} p^{x - 1} (1 - p)^{n - x}\\ &\mathop = \limits^{(5)} np \sum\limits_{y = 0}^{n - 1} \frac{(n - 1)!}{y!(n - 1 - y)!} p^{y} (1 - p)^{n - 1 - y}\\ &\mathop = \limits^{(6)} np \left( \sum\limits_{y = 0}^{n - 1} \binom{n - 1}{y} p^{y} (1 - p)^{n - 1 - y} \right)\\ &\mathop = \limits^{(7)} np \times 1\\ & = np.\end{array}\)

(1): this is the definition of expected value (restated below) applied to the binomial pmf;

(2): the x = 0 term equals zero, so it may be dropped;

(3): we use the fact that

\(\binom{n}{x} = \frac{n!}{x!(n - x)!}\)

in the next line, the factor x cancels with the x in x!, leaving (x − 1)!;

(4): we factor n out of \(n! = n \cdot (n - 1)!\) and one factor of p out of \(p^{x} = p \cdot p^{x - 1}\), pulling np in front of the sum;

(5): substituting \(y = x - 1\) (so \(x = y + 1\)) reindexes the sum;

(6): we use the fact that

\(\binom{n - 1}{y} = \frac{(n - 1)!}{y!(n - 1 - y)!}\)

(7): the summand in (6) is exactly the pmf of a random variable

\({\rm{Y}} \sim {\rm{Bin(n - 1,p)}}\)

evaluated at y, so the sum runs over all of its possible values \(y = 0, 1, \ldots, n - 1\). For any discrete distribution, the pmf values over all possible values must sum to \(1\), so the sum equals \(1\).

Recall that the expected value (mean value) of a discrete random variable X with set of possible values S and pmf p(x) is

\(E(X) = \mu_{X} = \sum\limits_{x \in S} x \cdot p(x)\)

Therefore, it is proved that \(E(X) = np\).
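As a sanity check (not part of the proof), the identity can be verified numerically straight from the definition of expected value; the values n = 12 and p = 0.3 below are arbitrary illustrative choices.

```python
from math import comb

# Verify E(X) = np for X ~ Bin(n, p) using the definition
# E(X) = sum over x of x * p(x). n and p are arbitrary choices.
n, p = 12, 0.3

def binom_pmf(x, n, p):
    """Binomial pmf: C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# The pmf values over x = 0, ..., n sum to 1, as used in step (7).
total = sum(binom_pmf(x, n, p) for x in range(n + 1))

# The mean computed from the definition matches np.
expected = sum(x * binom_pmf(x, n, p) for x in range(n + 1))
print(expected, n * p)  # the two agree up to float rounding
```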


Most popular questions from this chapter

Automobiles arrive at a vehicle equipment inspection station according to a Poisson process with rate \(\alpha = 10\) per hour. Suppose that with probability \(.5\) an arriving vehicle will have no equipment violations. a. What is the probability that exactly ten arrive during the hour and all ten have no violations? b. For any fixed \(y \ge 10\), what is the probability that y arrive during the hour, of which ten have no violations? c. What is the probability that ten "no-violation" cars arrive during the next hour?

Let X have a Poisson distribution with parameter \(\mu\). Show that E(X) = \(\mu\) directly from the definition of expected value. (Hint: The first term in the sum equals \(0\), and then x can be cancelled. Now factor out \(\mu\) and show that what is left sums to \(1\).)

The purchaser of a power-generating unit requires \({\rm{c}}\) consecutive successful start-ups before the unit will be accepted. Assume that the outcomes of individual start-ups are independent of one another. Let \({\rm{p}}\) denote the probability that any particular start-up is successful. The random variable of interest is \({\rm{X = }}\)the number of start-ups that must be made prior to acceptance. Give the \({\rm{pmf}}\) of \({\rm{X}}\) for the case \({\rm{c = 2}}\). If \({\rm{p = }}{\rm{.9}}\), what is \({\rm{P(X}} \le {\rm{8)}}\)? (Hint: For \({\rm{x}} \ge {\rm{5}}\), express \({\rm{p(x)}}\) "recursively" in terms of the \({\rm{pmf}}\) evaluated at the smaller values \({\rm{x - 3,x - 4,}}...{\rm{,2}}{\rm{.}}\)) (This problem was suggested by the article "Evaluation of a Start-Up Demonstration Test," J. Quality Technology, \({\rm{1983: 103 - 106}}\).)

A computer disk storage device has ten concentric tracks, numbered \({\rm{1,2,}}...{\rm{,10}}\) from outermost to innermost, and a single access arm. Let \({{\rm{p}}_{\rm{i}}}{\rm{ = }}\)the probability that any particular request for data will take the arm to track \({\rm{i(i = 1,}}....{\rm{,10)}}\). Assume that the tracks accessed in successive seeks are independent. Let \({\rm{X = }}\)the number of tracks over which the access arm passes during two successive requests (excluding the track that the arm has just left, so possible \({\rm{X}}\) values are \({\rm{x = 0,1,}}...{\rm{,9}}\)). Compute the \({\rm{pmf}}\) of \({\rm{X}}\). (Hint: \(P(\text{the arm is now on track } i \text{ and } X = j) = P(X = j \mid \text{arm now on } i) \cdot p_i\). After the conditional probability is written in terms of \({{\rm{p}}_{\rm{1}}}{\rm{,}}...{\rm{,}}{{\rm{p}}_{{\rm{10}}}}\), by the law of total probability, the desired probability is obtained by summing over \({\rm{i}}\).)

Consider a deck consisting of seven cards, marked \({\rm{1,2, \ldots }}\),\({\rm{7}}\). Three of these cards are selected at random. Define an rv \({\rm{W}}\) by \({\rm{W = }}\) the sum of the resulting numbers, and compute the pmf of \({\rm{W}}\). Then compute \({\rm{\mu }}\) and \({{\rm{\sigma }}^{\rm{2}}}\). (Hint: Consider outcomes as unordered, so that \({\rm{(1,3,7)}}\) and \({\rm{(3,1,7)}}\) are not different outcomes. Then there are \({\rm{35}}\) outcomes, and they can be listed.) (This type of rv actually arises in connection with a statistical procedure called Wilcoxon's rank-sum test, in which there is an \({\rm{x}}\) sample and a \({\rm{y}}\) sample and \({\rm{W}}\) is the sum of the ranks of the \({\rm{x}}\)'s in the combined sample.)
