Chapter 3: Problem 118
If \(X\) is a hypergeometric rv, show directly from the definition that \(E(X)=nM/N\) (consider only the case \(n \le M\)).
Short Answer
Expert verified
The expected value of \(X\) is \(E(X) = \frac{nM}{N}\).
Step by step solution
01
Understanding the Hypergeometric Random Variable
A hypergeometric random variable (rv) \(X\) is used to describe the number of successes in a sequence of \(n\) draws from a finite population of \(N\) elements containing \(M\) successes. Each draw is without replacement. Hence, the probability mass function (pmf) of \(X\) is given by \(P(X = x) = \frac{{\binom{M}{x}\binom{N-M}{n-x}}}{\binom{N}{n}}\).
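This pmf is simple to compute directly. A minimal sketch using only the standard library (the example parameters \(n=5\), \(M=8\), \(N=20\) are illustrative, not from the problem):

```python
from math import comb

def hypergeom_pmf(x, n, M, N):
    """P(X = x): probability of x successes in n draws without
    replacement from a population of N containing M successes."""
    return comb(M, x) * comb(N - M, n - x) / comb(N, n)

# Example: probability of exactly 2 successes in 5 draws
# from a population of 20 that contains 8 successes.
print(hypergeom_pmf(2, n=5, M=8, N=20))
```

Note that `math.comb(a, b)` returns 0 when `b > a`, which correctly zeroes out impossible values of `x`.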
02
Setting Up the Expected Value Formula
The expected value of a random variable \(X\) is the sum of each value of \(X\) weighted by its probability:\[E(X) = \sum_{x=0}^{n} x \cdot P(X = x)\]Since we're investigating a hypergeometric distribution, substitute its pmf:\[E(X) = \sum_{x=0}^{n} x \cdot \frac{{\binom{M}{x}\binom{N-M}{n-x}}}{\binom{N}{n}}\]
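Before manipulating the sum algebraically, it can be evaluated numerically to see that it really does equal \(nM/N\). A small sketch (parameter values are assumed for illustration):

```python
from math import comb

def expected_value(n, M, N):
    # E(X) = sum over x of x * P(X = x), computed directly from the pmf
    return sum(
        x * comb(M, x) * comb(N - M, n - x) / comb(N, n)
        for x in range(0, n + 1)
    )

print(expected_value(5, 8, 20))  # matches n*M/N = 5*8/20 = 2.0
```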
03
Factor Out \(\frac{nM}{N}\)
To show that \(E(X)=\frac{nM}{N}\), we want to factor \(\frac{nM}{N}\) out of the expression. First drop the \(x=0\) term (it contributes nothing), then apply the identities \(x\binom{M}{x} = M\binom{M-1}{x-1}\) and \(\binom{N}{n} = \frac{N}{n}\binom{N-1}{n-1}\):\[E(X) = \sum_{x=1}^{n} x \cdot \frac{\binom{M}{x}\binom{N-M}{n-x}}{\binom{N}{n}} = \frac{nM}{N}\sum_{x=1}^{n} \frac{\binom{M-1}{x-1}\binom{N-M}{n-x}}{\binom{N-1}{n-1}}\]Now shift the index with \(y = x - 1\), so \(x = y + 1\):\[E(X) = \frac{nM}{N}\sum_{y=0}^{n-1} \frac{\binom{M-1}{y}\binom{(N-1)-(M-1)}{(n-1)-y}}{\binom{N-1}{n-1}}\]
04
Observe the form \(h(y; n-1, M-1, N-1)\)
The summand is of the form:\[\frac{\binom{M-1}{y}\binom{(N-1)-(M-1)}{(n-1)-y}}{\binom{N-1}{n-1}} = h(y;\, n-1,\, M-1,\, N-1)\]This is exactly the pmf of another hypergeometric random variable, with parameters \(n-1\), \(M-1\), and \(N-1\). Since any pmf sums to 1 over its support:\[\sum_{y=0}^{n-1} h(y;\, n-1,\, M-1,\, N-1) = 1\]
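The sum-to-one property invoked in this step can be checked numerically. A quick sketch with assumed parameters \(n=5\), \(M=8\), \(N=20\):

```python
from math import comb

n, M, N = 5, 8, 20  # assumed example values

# Sum of the hypergeometric pmf h(y; n-1, M-1, N-1) over its support:
total = sum(
    comb(M - 1, y) * comb(N - M, n - 1 - y) / comb(N - 1, n - 1)
    for y in range(0, n)
)
print(total)  # a complete pmf sums to 1
```

This is an instance of Vandermonde's identity: \(\sum_y \binom{M-1}{y}\binom{N-M}{n-1-y} = \binom{N-1}{n-1}\).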
05
Conclude E(X) Expression
Substituting back:\[E(X) = \frac{nM}{N} \sum_{y=0}^{n-1} h(y;\, n-1,\, M-1,\, N-1) = \frac{nM}{N} \cdot 1 = \frac{nM}{N}\]Thus, the expected value of the hypergeometric random variable is \(\frac{nM}{N}\).
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
Probability Mass Function
The probability mass function (pmf) of a hypergeometric distribution gives us the probability of having exactly a certain number of successes, say \( x \), in \( n \) draws from a finite population \( N \) that contains \( M \) successes. Here's the key formula for the pmf:
- \( P(X = x) = \frac{{\binom{M}{x} \binom{N-M}{n-x}}}{\binom{N}{n}} \)
- \( \binom{M}{x} \) calculates the ways to choose \( x \) successes out of \( M \) total successes.
- \( \binom{N-M}{n-x} \) calculates the ways to choose the other \( n-x \) from the remaining \( N-M \).
- The denominator \( \binom{N}{n} \) calculates the total ways to choose \( n \) elements from \( N \), ensuring all configurations are considered.
Expected Value
The expected value (or mean) of a hypergeometric random variable gives an average number of successes one would expect after making \( n \) draws. For the hypergeometric distribution, the expected value is:\[E(X) = \frac{nM}{N}\]This formula reflects the relationship between the size of the sample \( n \), the number of successes \( M \) in the population, and the population size \( N \). Here's how this works intuitively:
- \( n \): the number of draws or trials from the population.
- \( M/N \): the proportion of the population that are successes.
- Thus, \( n \times (M/N) \): tells you how many successes you'd expect based on the sample size and success ratio.
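The intuition above is just arithmetic. A hypothetical example (card-deck parameters chosen for illustration):

```python
# Drawing n = 10 cards from a deck of N = 52 containing M = 13 hearts:
# the expected number of hearts is n * M / N.
n, M, N = 10, 13, 52
print(n * M / N)  # 2.5
```

On average, a 10-card hand contains 2.5 hearts, matching the proportion \(M/N = 1/4\) scaled by the sample size.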
Hypergeometric Random Variable
A hypergeometric random variable \( X \) models specific situations where we're interested in the exact count of successful outcomes in a limited number of draws. What makes this distribution unique is its assumption about the sampling process.
- "Without replacement": meaning the member chosen is not replaced back into the population before the next draw.
- This contrasts with other distributions like the binomial distribution, which involves "with replacement" and thus independent trials.
- Hypergeometric models scenarios with a fixed population and known quantities of interest like successes and failures in advance.
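The sampling-without-replacement process described above can be simulated directly, giving an empirical check of \(E(X) = nM/N\). A Monte Carlo sketch with assumed parameters:

```python
import random

random.seed(0)
n, M, N = 5, 8, 20  # assumed example values

# Population of N items: 1 = success, 0 = failure.
population = [1] * M + [0] * (N - M)

# Each trial draws n items without replacement and counts successes.
trials = 100_000
mean = sum(sum(random.sample(population, n)) for _ in range(trials)) / trials
print(mean)  # close to n*M/N = 2.0
```

`random.sample` draws without replacement, which is precisely the hypergeometric sampling assumption; replacing it with independent draws would instead simulate the binomial model mentioned above.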