Problem 35


(a) Prove that $$ E[X]=E[X \mid X<a] P\{X<a\}+E[X \mid X \geq a] P\{X \geq a\} $$ (b) Use the result of part (a) to prove Markov's inequality: if \(P\{X \geq 0\}=1\), then for \(a>0\), $$ P\{X \geq a\} \leq \frac{E[X]}{a} $$

Short Answer

To prove part (a), decompose \(X\) according to whether it falls below \(a\): \[X = X \cdot I\{X<a\} + X \cdot I\{X \geq a\},\] where \(I\{\cdot\}\) denotes the indicator function. Taking expectations and applying the definition of conditional expectation expresses \(E[X]\) as \(E[X \mid X<a] P\{X<a\} + E[X \mid X \geq a] P\{X \geq a\}\). To prove part (b), use this identity to prove Markov's inequality, which states that if \(P\{X \geq 0\}=1\), then for \(a > 0\): \[P\{X \geq a\} \leq \frac{E[X]}{a}\] Dropping the nonnegative first term and bounding \(E[X \mid X \geq a]\) below by \(a\) yields the inequality.

Step by step solution

01

Decompose the random variable

First, let's split \(X\) according to whether it is below or at least \(a\): \[X = X \cdot I\{X<a\} + X \cdot I\{X \geq a\},\] where \(I\{...\}\) denotes the indicator function. Taking expectations of both sides gives \[E[X] = E[X \cdot I\{X<a\}] + E[X \cdot I\{X \geq a\}].\]
02

Calculate \(E[X \mid X<a]\) and \(E[X \mid X \geq a]\)

To calculate \(E[X \mid X<a]\) and \(E[X \mid X \geq a]\), we can use the definition of conditional expectation (assuming \(P\{X<a\}\) and \(P\{X \geq a\}\) are both positive): \[E[X \mid X<a] = \frac{E[X \cdot I\{X<a\}]}{P\{X<a\}}\] and \[E[X \mid X \geq a] = \frac{E[X \cdot I\{X \geq a\}]}{P\{X \geq a\}}\]
03

Prove Part (a)

Now, combining the decomposition from Step 1 with the definitions from Step 2, we can express \(E[X]\) as the sum of the two conditional expectations multiplied by their respective probabilities: \[E[X] = E[X \cdot I\{X<a\}] + E[X \cdot I\{X \geq a\}]\] \[E[X] = E[X \mid X<a] \cdot P\{X<a\} + E[X \mid X \geq a] \cdot P\{X \geq a\}\] This proves part (a).
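The identity of part (a) can be checked exactly on a small discrete distribution. This is an illustrative sketch: the values, probabilities, and threshold \(a\) below are arbitrary choices, and the conditional expectations are computed via \(E[X \mid A] = E[X \cdot I_A]/P(A)\) as in Step 2.

```python
from fractions import Fraction

# Verify E[X] = E[X | X<a] P{X<a} + E[X | X>=a] P{X>=a} exactly
# on an arbitrary small discrete distribution.
values = [0, 1, 2, 5, 10]
probs = [Fraction(1, 5)] * 5  # uniform over the five values
a = 3

p_below = sum(p for x, p in zip(values, probs) if x < a)
p_above = sum(p for x, p in zip(values, probs) if x >= a)

# Conditional expectations via E[X | A] = E[X * 1_A] / P(A)
e_below = sum(x * p for x, p in zip(values, probs) if x < a) / p_below
e_above = sum(x * p for x, p in zip(values, probs) if x >= a) / p_above

e_x = sum(x * p for x, p in zip(values, probs))
assert e_x == e_below * p_below + e_above * p_above
print(e_x)  # prints 18/5
```

Using exact `Fraction` arithmetic makes the check an equality rather than a floating-point approximation.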
04

Prove Markov's Inequality

Now, let's use the result from part (a) to prove Markov's inequality, which states that if \(P\{X \geq 0\} = 1\), then for \(a > 0\): \[P\{X \geq a\} \leq \frac{E[X]}{a}\] From part (a), we have: \[E[X]=E[X \mid X<a] P\{X<a\}+E[X \mid X \geq a] P\{X \geq a\}\] Since \(X \geq 0\), the first term satisfies \(E[X \mid X<a] P\{X<a\} \geq 0\). Also, conditional on \(X \geq a\), the expectation satisfies \(E[X \mid X \geq a] \geq a\). Hence, \[E[X] \geq a \cdot P\{X \geq a\}\] Dividing both sides by \(a > 0\) gives \[P\{X \geq a\} \leq \frac{E[X]}{a}\] This proves Markov's inequality as stated in part (b).
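Markov's inequality can also be observed empirically. The sketch below samples a nonnegative random variable (an exponential with mean 2, an arbitrary illustrative choice) and checks the bound at several thresholds; note that the inequality actually holds exactly for the empirical distribution of any nonnegative sample, not just in the limit.

```python
import random

# Empirical check of Markov's inequality P{X >= a} <= E[X]/a
# for a nonnegative random variable (exponential with mean 2).
random.seed(0)
samples = [random.expovariate(0.5) for _ in range(100_000)]  # mean 2
mean = sum(samples) / len(samples)

for a in [0.5, 1, 2, 4, 8]:
    tail = sum(x >= a for x in samples) / len(samples)
    # Markov applied to the empirical measure guarantees this holds.
    assert tail <= mean / a + 1e-9, (a, tail, mean / a)
    print(a, tail, mean / a)
```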


Most popular questions from this chapter

Urn 1 contains 5 white and 6 black balls, while urn 2 contains 8 white and 10 black balls. Two balls are randomly selected from urn 1 and are then put in urn 2. If 3 balls are then randomly selected from urn 2, compute the expected number of white balls in the trio. HINT: Let \(X_{i}=1\) if the \(i\)th white ball initially in urn 1 is one of the three selected, and let \(X_{i}=0\) otherwise. Similarly, let \(Y_{i}=1\) if the \(i\)th white ball from urn 2 is one of the three selected, and let \(Y_{i}=0\) otherwise. The number of white balls in the trio can now be written as \(\sum_{1}^{5} X_{i}+\sum_{1}^{8} Y_{i}\).
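The hinted linearity-of-expectation argument can be sketched numerically. A white ball originally in urn 1 must first be one of the 2 balls transferred (probability 2/11) and then be among the 3 drawn from urn 2's resulting 20 balls (probability 3/20); a white ball already in urn 2 only needs to be drawn. This is a sketch of that computation, not the textbook's own worked answer.

```python
from fractions import Fraction

# Linearity of expectation: E[sum X_i + sum Y_i] = 5 E[X_1] + 8 E[Y_1].
p_transfer = Fraction(2, 11)  # P(a given urn-1 ball is among the 2 moved)
p_drawn = Fraction(3, 20)     # P(a given urn-2 ball is among the 3 drawn)

expected_white = 5 * p_transfer * p_drawn + 8 * p_drawn
print(expected_white)  # prints 147/110, about 1.336
```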

A certain region is inhabited by \(r\) distinct types of a certain kind of insect species, and each insect caught will, independently of the types of the previous catches, be of type \(i\) with probability $$ P_{i}, i=1, \ldots, r \quad \sum_{1}^{r} P_{i}=1 $$ (a) Compute the mean number of insects that are caught before the first type 1 catch. (b) Compute the mean number of types of insects that are caught before the first type 1 catch.
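A Monte Carlo sanity check of the standard answers to both parts is sketched below, assuming the well-known forms: for (a), the number of catches before the first type-1 catch is geometric, with mean \((1-P_1)/P_1\); for (b), type \(i\) appears before the first type-1 catch with probability \(P_i/(P_1+P_i)\), so the mean number of types is \(\sum_{i \geq 2} P_i/(P_1+P_i)\). The distribution \(P = [0.5, 0.3, 0.2]\) is an arbitrary example.

```python
import random

# Simulate repeated catches until the first type-1 (index 0) catch,
# recording how many insects and how many distinct types came first.
random.seed(1)
P = [0.5, 0.3, 0.2]
trials = 200_000
total_catches = total_types = 0
for _ in range(trials):
    seen = set()
    count = 0
    while True:
        t = random.choices(range(len(P)), weights=P)[0]
        if t == 0:
            break
        seen.add(t)
        count += 1
    total_catches += count
    total_types += len(seen)

print(total_catches / trials)  # close to (1 - 0.5)/0.5 = 1.0
print(total_types / trials)    # close to 0.3/0.8 + 0.2/0.7, about 0.661
```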

The random variables \(X\) and \(Y\) have a joint density function given by $$ f(x, y)= \begin{cases}\frac{2 e^{-2 x}}{x} & 0 \leq x<\infty, 0 \leq y \leq x \\ 0 & \text { otherwise }\end{cases} $$ Compute \(\operatorname{Cov}(X, Y)\).
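Reading the density as \(f(x,y) = 2e^{-2x}/x\) on \(0 \leq y \leq x\), the marginal of \(X\) is exponential with rate 2 and, given \(X = x\), \(Y\) is uniform on \((0, x)\); a quick calculation along those lines gives \(\operatorname{Cov}(X,Y) = 1/8\). The simulation sketch below checks that value (the sampling scheme is this assumed factorization, not the textbook's solution method).

```python
import random

# Sample X ~ Exp(rate 2), then Y | X = x ~ Uniform(0, x),
# and estimate Cov(X, Y) from the sample.
random.seed(2)
n = 500_000
xs = [random.expovariate(2.0) for _ in range(n)]
ys = [random.uniform(0.0, x) for x in xs]

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(cov)  # close to 1/8 = 0.125
```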

Consider a gambler who at each gamble either wins or loses her bet with probabilities \(p\) and \(1-p\). When \(p>\frac{1}{2}\), a popular gambling system, known as the Kelly strategy, is to always bet the fraction \(2 p-1\) of your current fortune. Compute the expected fortune after \(n\) gambles of a gambler who starts with \(x\) units and employs the Kelly strategy.
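With the betting fraction \(f = 2p-1\), each win multiplies the fortune by \(1+f = 2p\) and each loss by \(1-f = 2(1-p)\), so by independence the expected fortune after \(n\) gambles is \(x\,[p(1+f) + (1-p)(1-f)]^n = x\,[2p^2 + 2(1-p)^2]^n\). The sketch below checks this closed form against the full binomial expansion over win counts; the values of \(x\), \(p\), and \(n\) are arbitrary examples.

```python
from math import comb

# Closed form vs. explicit binomial sum over the number of wins k.
x, p, n = 100.0, 0.6, 10
f = 2 * p - 1

closed_form = x * (p * (1 + f) + (1 - p) * (1 - f)) ** n
binomial_sum = x * sum(
    comb(n, k) * p**k * (1 - p)**(n - k) * (1 + f)**k * (1 - f)**(n - k)
    for k in range(n + 1)
)
assert abs(closed_form - binomial_sum) < 1e-6
print(closed_form)
```

Note that the expected fortune grows even though the Kelly fraction is usually motivated by maximizing the expected *logarithm* of fortune.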

The positive random variable \(X\) is said to be a lognormal random variable with parameters \(\mu\) and \(\sigma^{2}\) if \(\log (X)\) is a normal random variable with mean \(\mu\) and variance \(\sigma^{2}\). Use the normal moment generating function to find the mean and variance of a lognormal random variable.
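The MGF route gives the standard formulas \(E[X] = e^{\mu + \sigma^2/2}\) and \(\operatorname{Var}(X) = (e^{\sigma^2}-1)e^{2\mu+\sigma^2}\) (evaluating the normal MGF of \(\log X\) at 1 and 2). A simulation sketch checking those formulas is below; \(\mu = 0.2\) and \(\sigma = 0.5\) are arbitrary small values chosen to keep the sampling variance tame.

```python
import math
import random

# Sample a lognormal (log X ~ N(mu, sigma^2)) and compare the
# sample mean and variance with the MGF-derived formulas.
random.seed(3)
mu, sigma = 0.2, 0.5
n = 400_000
xs = [random.lognormvariate(mu, sigma) for _ in range(n)]

sample_mean = sum(xs) / n
sample_var = sum((x - sample_mean) ** 2 for x in xs) / n

print(sample_mean)  # close to exp(0.2 + 0.125), about 1.384
print(sample_var)   # close to (exp(0.25) - 1) * exp(0.65), about 0.544
```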
