Chapter 23: Problem 14
Let \(Z\) be defined on \((\Omega, \mathcal{F}, P)\) with \(Z \geq 0\) and \(E\{Z\}=1\). Define a new probability \(Q\) by \(Q(A)=E\left\{1_{A} Z\right\}\). Let \(\mathcal{G}\) be a sub-\(\sigma\)-algebra of \(\mathcal{F}\), and let \(U=E\{Z \mid \mathcal{G}\}\). Show that \(E_{Q}\{X \mid \mathcal{G}\}=\frac{E\{X Z \mid \mathcal{G}\}}{U}\) for any bounded \(\mathcal{F}\)-measurable random variable \(X\). (Here \(E_{Q}\{X \mid \mathcal{G}\}\) denotes the conditional expectation of \(X\) relative to the probability measure \(Q\).)
Short Answer
The ratio \(E\{XZ \mid \mathcal{G}\}/U\) is \(\mathcal{G}\)-measurable and satisfies \(E_Q\{1_A \cdot E\{XZ \mid \mathcal{G}\}/U\} = E_Q\{1_A X\}\) for every \(A \in \mathcal{G}\), so it is a version of \(E_Q\{X \mid \mathcal{G}\}\). (Since \(Q(U=0)=0\), the ratio is well defined \(Q\)-a.s.)
Step by step solution
Understand the Given Information
\(Z \geq 0\) with \(E\{Z\}=1\) acts as a density (a Radon–Nikodym derivative \(dQ/dP\)): the formula \(Q(A)=E\{1_A Z\}\) defines a probability measure, and more generally \(E_Q\{W\}=E\{WZ\}\) for every bounded random variable \(W\). The random variable \(U=E\{Z \mid \mathcal{G}\}\) plays the same role for the restriction of \(Q\) to \(\mathcal{G}\).
Recall Definition of Conditional Expectation under Measure Q
\(E_Q\{X \mid \mathcal{G}\}\) is the (\(Q\)-a.s. unique) \(\mathcal{G}\)-measurable random variable \(Y\) satisfying \(E_Q\{1_A Y\}=E_Q\{1_A X\}\) for every \(A \in \mathcal{G}\). So it suffices to show that \(Y=E\{XZ \mid \mathcal{G}\}/U\) (set equal to \(0\) on \(\{U=0\}\)) is \(\mathcal{G}\)-measurable and satisfies this identity. Note first that \(Q(U=0)=E\{1_{\{U=0\}}Z\}=E\{1_{\{U=0\}}U\}=0\), so the ratio is well defined \(Q\)-a.s.
Express Integral under Q Measure in Terms of P Measure
Fix \(A \in \mathcal{G}\). Since \(1_A X\) is bounded, the change-of-measure formula gives \(E_Q\{1_A X\}=E\{1_A X Z\}\).
Use Conditional Expectation Properties
By the tower property, and since \(1_A\) is \(\mathcal{G}\)-measurable, \(E\{1_A X Z\}=E\{E\{1_A X Z \mid \mathcal{G}\}\}=E\{1_A\, E\{XZ \mid \mathcal{G}\}\}\).
Manipulate Expectation Expressions
For bounded \(X\) we have \(|E\{XZ \mid \mathcal{G}\}| \leq \|X\|_\infty\, U\) a.s., so \(Y\) is bounded and \(E\{XZ \mid \mathcal{G}\}=0\) a.s. on \(\{U=0\}\). Since \(1_A Y\) is \(\mathcal{G}\)-measurable and bounded, \(E_Q\{1_A Y\}=E\{1_A Y Z\}=E\{1_A Y\, E\{Z \mid \mathcal{G}\}\}=E\{1_A Y U\}=E\{1_A\, E\{XZ \mid \mathcal{G}\}\, 1_{\{U>0\}}\}=E\{1_A\, E\{XZ \mid \mathcal{G}\}\}\).
Conclusion
Both computations give \(E_Q\{1_A X\}=E\{1_A\, E\{XZ \mid \mathcal{G}\}\}=E_Q\{1_A Y\}\) for every \(A \in \mathcal{G}\), and \(Y\) is \(\mathcal{G}\)-measurable; hence \(E_Q\{X \mid \mathcal{G}\}=E\{XZ \mid \mathcal{G}\}/U\), \(Q\)-a.s.
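As a numerical sanity check (not part of the textbook solution), the identity can be verified exactly on a small finite probability space; the space, the density \(Z\), the partition generating \(\mathcal{G}\), and \(X\) below are all illustrative choices.

```python
from fractions import Fraction as F

# Finite probability space: Omega = {0,...,5}, P uniform.
omega = range(6)
P = {w: F(1, 6) for w in omega}

# A nonnegative density Z with E{Z} = 1 (weights chosen arbitrarily).
Z = {0: F(3, 2), 1: F(1, 2), 2: F(1), 3: F(2), 4: F(0), 5: F(1)}
assert sum(Z[w] * P[w] for w in omega) == 1

# Q({w}) = Z(w) P({w}), i.e. Q(A) = E{1_A Z}.
Q = {w: Z[w] * P[w] for w in omega}

# G is generated by the partition {0,1,2}, {3,4,5}.
blocks = [(0, 1, 2), (3, 4, 5)]

# A bounded random variable X.
X = {0: F(2), 1: F(-1), 2: F(4), 3: F(0), 4: F(7), 5: F(1)}

for B in blocks:
    # Left side: E_Q{X | G} on the block B, computed directly under Q.
    qB = sum(Q[w] for w in B)
    lhs = sum(X[w] * Q[w] for w in B) / qB
    # Right side: E{XZ | G} / U on the block B, both computed under P.
    pB = sum(P[w] for w in B)
    exz = sum(X[w] * Z[w] * P[w] for w in B) / pB   # E{XZ | G} on B
    u = sum(Z[w] * P[w] for w in B) / pB            # U = E{Z | G} on B
    rhs = exz / u
    assert lhs == rhs
print("identity verified on both blocks")
```

Using exact rational arithmetic (`fractions.Fraction`) makes the two sides agree exactly rather than up to floating-point error.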
Key Concepts
These are the key concepts you need to understand to accurately answer the question.
Probability Measure
A probability space consists of three ingredients:
- Sample Space (\(\Omega\))
- Event Space (\(\mathcal{F}\))
- Probability Measure (\(P\))
The measure \(P\) satisfies the probability axioms:
- The probability of the entire sample space is 1 (\(P(\Omega) = 1\)).
- The probability of any event is non-negative (\(P(A) \geq 0\)).
- The probability of the union of mutually exclusive events is the sum of their individual probabilities (\(P(A \cup B) = P(A) + P(B)\) if \(A\) and \(B\) are disjoint).
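A measure tilted by a density, as in the problem's \(Q(A)=E\{1_A Z\}\), satisfies the same axioms. A minimal sketch on a fair die (the density \(Z\) here is an arbitrary illustrative choice):

```python
from fractions import Fraction as F

# P: fair six-sided die; Z: nonnegative density with E{Z} = 1.
P = {w: F(1, 6) for w in range(1, 7)}
Z = {1: F(3), 2: F(1), 3: F(1), 4: F(1), 5: F(0), 6: F(0)}
assert sum(Z[w] * P[w] for w in P) == 1

# The tilted measure Q(A) = E{1_A Z}.
def Q(A):
    return sum(Z[w] * P[w] for w in A)

# Q satisfies the probability axioms listed above.
assert Q(P.keys()) == 1                      # Q(Omega) = 1
assert all(Q({w}) >= 0 for w in P)           # non-negativity
A, B = {1, 2}, {3, 4}                        # disjoint events
assert Q(A | B) == Q(A) + Q(B)               # finite additivity
```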
Sub Sigma-Algebra
A sub \(\sigma\)-algebra \(\mathcal{G} \subseteq \mathcal{F}\) is a collection of events that is itself a \(\sigma\)-algebra:
- It contains the empty set (\(\emptyset\)) and the entire space (\(\Omega\)).
- If an event is in \(\mathcal{G}\), its complement is also in \(\mathcal{G}\).
- If a sequence of events is in \(\mathcal{G}\), the countable union of these events is in \(\mathcal{G}\).
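These closure properties can be checked mechanically for a small \(\sigma\)-algebra, e.g. the one generated by a two-block partition (the same kind of \(\mathcal{G}\) used in the problem); the sets below are an illustrative example:

```python
# The sigma-algebra on Omega = {0,...,5} generated by the partition
# {0,1,2}, {3,4,5}: the empty set, the two blocks, and their union.
omega = frozenset(range(6))
blocks = [frozenset({0, 1, 2}), frozenset({3, 4, 5})]
G = {frozenset(), blocks[0], blocks[1], omega}

# Closure properties of a sigma-algebra:
assert frozenset() in G and omega in G           # contains empty set and Omega
assert all(omega - A in G for A in G)            # closed under complements
assert all(A | B in G for A in G for B in G)     # closed under (finite) unions
```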
Random Variables
- Discrete Random Variables: These take on a countable number of distinct values. Think of rolling a die.
- Continuous Random Variables: These can assume any value over a continuous range. Consider the exact height of students in a class.
- The mean (\(\mu\)): the expected value of the distribution.
- The variance (\(\sigma^2\)): a measure of how spread out the values are.
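For the die example above, the mean and variance can be computed exactly:

```python
from fractions import Fraction as F

# A discrete random variable: the value of a fair six-sided die.
values = range(1, 7)
p = F(1, 6)

mu = sum(v * p for v in values)               # mean E[X]
var = sum((v - mu) ** 2 * p for v in values)  # variance E[(X - mu)^2]

print(mu)   # 7/2
print(var)  # 35/12
```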
Expectation Properties
- Linearity: \(E[aX + bY] = aE[X] + bE[Y]\), where \(a\) and \(b\) are constants and \(X\), \(Y\) are random variables.
- Monotonicity: if \(X \leq Y\), then \(E[X] \leq E[Y]\).
- Non-negativity: if \(X\) is a non-negative random variable, then \(E[X] \geq 0\).
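A quick Monte Carlo illustration of these three properties (the distributions and constants are arbitrary illustrative choices; linearity holds exactly for sample averages, up to floating-point rounding):

```python
import random

random.seed(0)
n = 100_000
# Sample pairs (X, Y) with X <= Y pointwise: X uniform on [0,1],
# Y = X + an independent uniform on [0,1].
xs = [random.random() for _ in range(n)]
ys = [x + random.random() for x in xs]

a, b = 2.0, -3.0
ex = sum(xs) / n
ey = sum(ys) / n
e_lin = sum(a * x + b * y for x, y in zip(xs, ys)) / n

assert abs(e_lin - (a * ex + b * ey)) < 1e-9   # linearity
assert ex <= ey                                # monotonicity (X <= Y pointwise)
assert ex >= 0                                 # non-negativity (X >= 0)
```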