Chapter 2: Problem 17
Let \(\xi, \xi_{1}, \xi_{2}, \ldots\) be nonnegative integrable random variables such that \(\mathrm{E} \xi_{n} \rightarrow \mathrm{E} \xi\) and \(P\left(\xi-\xi_{n}>\varepsilon\right) \rightarrow 0\) for every \(\varepsilon>0\). Show that then \(\mathrm{E}\left|\xi_{n}-\xi\right| \rightarrow 0\) as \(n \rightarrow \infty\).
Short Answer
Write \( |\xi_n - \xi| = (\xi_n - \xi) + 2(\xi - \xi_n)^{+} \). Since \( 0 \le (\xi - \xi_n)^{+} \le \xi \) and \( (\xi - \xi_n)^{+} \rightarrow 0 \) in probability, dominated convergence gives \( \mathrm{E}(\xi - \xi_n)^{+} \rightarrow 0 \); together with \( \mathrm{E}\xi_n \rightarrow \mathrm{E}\xi \), this yields \( \mathrm{E}|\xi_n - \xi| \rightarrow 0 \).
Step by step solution
Understand Given Conditions
We are given nonnegative integrable random variables with \( \mathrm{E}\xi_n \rightarrow \mathrm{E}\xi \) and the one-sided condition \( P(\xi - \xi_n > \varepsilon) \rightarrow 0 \) for every \( \varepsilon > 0 \); note that only the deficit \( \xi - \xi_n \), not the full difference \( |\xi_n - \xi| \), is controlled in probability.
Use Convergence in Probability
Since \( P\left((\xi - \xi_n)^{+} > \varepsilon\right) = P(\xi - \xi_n > \varepsilon) \rightarrow 0 \), the positive part \( (\xi - \xi_n)^{+} \rightarrow 0 \) in probability.
Apply Continuous Mapping Theorem
The map \( x \mapsto x^{+} = \max(x, 0) \) is continuous, and because \( \xi_n \ge 0 \) we have \( 0 \le (\xi - \xi_n)^{+} \le \xi \), an integrable dominating bound; the dominated convergence theorem (in its version for convergence in probability) therefore gives \( \mathrm{E}(\xi - \xi_n)^{+} \rightarrow 0 \).
Convergence of Expected Values
From the identity \( |\xi_n - \xi| = (\xi_n - \xi) + 2(\xi - \xi_n)^{+} \) we obtain \( \mathrm{E}|\xi_n - \xi| = (\mathrm{E}\xi_n - \mathrm{E}\xi) + 2\,\mathrm{E}(\xi - \xi_n)^{+} \rightarrow 0 \).
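The steps above combine into a one-line derivation, using the identity \( |a| = a + 2a^{-} \) with \( a = \xi_n - \xi \), so that \( a^{-} = (\xi - \xi_n)^{+} \):

```latex
\[
\mathrm{E}\,|\xi_n - \xi|
  = \mathrm{E}(\xi_n - \xi) + 2\,\mathrm{E}(\xi - \xi_n)^{+}
  = \underbrace{\bigl(\mathrm{E}\xi_n - \mathrm{E}\xi\bigr)}_{\to\,0\ \text{(given)}}
  + 2\,\underbrace{\mathrm{E}(\xi - \xi_n)^{+}}_{\to\,0\ \text{(dominated convergence)}}
  \;\longrightarrow\; 0 .
\]
```

The second term tends to \( 0 \) because \( 0 \le (\xi - \xi_n)^{+} \le \xi \) with \( \xi \) integrable, and \( (\xi - \xi_n)^{+} \rightarrow 0 \) in probability by hypothesis.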
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
Convergence in probability
- \( P(|\xi_n - \xi| > \varepsilon) \rightarrow 0 \) as \( n \rightarrow \infty \), for every \( \varepsilon > 0 \).
Dominated Convergence Theorem
- The sequence of random variables must converge almost everywhere or in probability to some limit random variable.
- The sequence must be dominated by an integrable random variable \( \eta \) (a dominating function), meaning \( |\xi_n| \le \eta \) for every \( n \).
- \( \lim_{n \rightarrow \infty} \mathrm{E}[\xi_n] = \mathrm{E}[\lim_{n \rightarrow \infty} \xi_n] \).
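As a numerical sanity check (not part of the proof), the sketch below assumes \( \xi \sim \mathrm{Exp}(1) \) and \( \xi_n = \min(\xi, n) \), a concrete choice satisfying all hypotheses of the exercise; the Monte Carlo estimates of \( \mathrm{E}|\xi_n - \xi| \) shrink toward \( 0 \), as the theorem predicts.

```python
import numpy as np

# Illustration only: xi ~ Exp(1) is nonnegative and integrable, and the
# truncation xi_n = min(xi, n) satisfies the hypotheses of the exercise:
#   E xi_n -> E xi   and   P(xi - xi_n > eps) = P(xi > n + eps) -> 0.
rng = np.random.default_rng(0)
xi = rng.exponential(scale=1.0, size=1_000_000)

errs = []
for n in [1, 2, 5, 10]:
    xi_n = np.minimum(xi, n)                 # truncation at level n
    errs.append(np.mean(np.abs(xi_n - xi)))  # Monte Carlo estimate of E|xi_n - xi|

# The exact value here is E(xi - n)^+ = e^{-n}, so the estimates decrease toward 0.
print([f"{e:.5f}" for e in errs])
```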
Expected value
- Discrete case: \( \mathrm{E}[X] = \sum_{i} x_i p_i \), where \( p_i = P(X = x_i) \).
- Continuous case: \( \mathrm{E}[X] = \int_{-\infty}^{\infty} x f(x)\, dx \), where \( f \) is the density of \( X \).
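Both formulas can be checked numerically; this sketch assumes a fair six-sided die for the discrete case and the \( \mathrm{Exp}(1) \) density \( f(x) = e^{-x} \) (with true mean \( 1 \)) for the continuous case.

```python
import numpy as np

# Discrete case: E[X] = sum_i x_i p_i for a fair six-sided die.
values = np.arange(1, 7)
probs = np.full(6, 1 / 6)
mean_discrete = float(values @ probs)  # close to 3.5

# Continuous case: E[X] = integral of x f(x) dx for Exp(1), f(x) = e^{-x} on [0, inf).
# A fine trapezoid rule on [0, 50] approximates the integral; the tail beyond 50
# is negligible, and the true value is 1.
xs = np.linspace(0.0, 50.0, 200_001)
ys = xs * np.exp(-xs)
dx = xs[1] - xs[0]
mean_continuous = float(dx * (ys[:-1] + ys[1:]).sum() / 2)

print(mean_discrete, mean_continuous)
```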
Probability theory
- Random variables: Quantities that take on various values, each associated with certain probabilities.
- Probability distributions: Functions describing the likelihood of different outcomes of random variables.
- Convergence: As in our exercise, convergence helps us study the behavior of sequences of random variables over time, predicting outcomes as data grow large.