Problem 35: A coin is tossed until the first time a head turns up

A coin is tossed until the first time a head turns up. If this occurs on the \(n\)th toss and \(n\) is odd, you win \(2^{n}/n\); but if \(n\) is even, then you lose \(2^{n}/n\). Then if your expected winnings exist they are given by the convergent series $$1-\frac{1}{2}+\frac{1}{3}-\frac{1}{4}+\cdots$$ called the alternating harmonic series. It is tempting to say that this should be the expected value of the experiment. Show that if we were to do this, the expected value of an experiment would depend upon the order in which the outcomes are listed.

Short Answer

Expert verified
The "expected value" computed from the series depends on the order in which the outcomes are listed, because the alternating harmonic series converges only conditionally. Since a genuine expected value cannot depend on such an ordering, this game has no well-defined expectation.

Step by step solution

01

Understand Expected Value in a Coin Toss

You win or lose depending on whether the first head appears on an odd-numbered toss (win) or an even-numbered toss (lose). To compute an expected value, list all possible outcomes with their probabilities. The first head appears on the \(n\)-th toss exactly when the first \(n-1\) tosses are tails and the \(n\)-th is heads, which for a fair coin has probability \(\left(\frac{1}{2}\right)^{n-1}\cdot\frac{1}{2}=\frac{1}{2^n}\).
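The geometric distribution of the first head can be checked numerically. Below is a minimal Python sketch (the function names are illustrative, not from the text) that simulates the experiment and compares the empirical frequencies with \(1/2^n\):

```python
import random

def first_head_toss(rng):
    """Toss a fair coin until the first head; return the toss number n."""
    n = 1
    while rng.random() >= 0.5:  # tails with probability 1/2
        n += 1
    return n

def estimate_probs(trials=100_000, seed=0):
    """Estimate P(first head on toss n) by simulation."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(trials):
        n = first_head_toss(rng)
        counts[n] = counts.get(n, 0) + 1
    return {n: c / trials for n, c in sorted(counts.items())}

probs = estimate_probs()
# Each estimate probs[n] should be close to the theoretical value 1 / 2**n.
```

With 100,000 trials the estimates for small \(n\) agree with \(\frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \ldots\) to within ordinary sampling error.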
02

Analyze Possible Outcomes

Consider the infinite list of outcomes: winning when \(n\) is odd pays \(\frac{2^n}{n}\), and losing when \(n\) is even costs \(\frac{2^n}{n}\). Weighting each payoff by its probability \(\frac{1}{2^n}\), the \(n\)-th outcome contributes \(\frac{1}{2^n}\cdot\left(\pm\frac{2^n}{n}\right)=\pm\frac{1}{n}\), so the expected value is the alternating series \(1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots\).
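The key cancellation — probability \(\frac{1}{2^n}\) times payoff \(\frac{2^n}{n}\) leaving exactly \(\frac{1}{n}\) — can be verified with exact rational arithmetic. A small illustrative sketch using Python's `fractions` module (the helper name is an assumption for this example):

```python
from fractions import Fraction

def expected_term(n):
    """Probability of first head on toss n times the signed payoff 2**n / n."""
    prob = Fraction(1, 2**n)
    payoff = Fraction(2**n, n)
    sign = 1 if n % 2 == 1 else -1  # win on odd n, lose on even n
    return sign * prob * payoff     # reduces exactly to (-1)**(n+1) / n

# The 2**n factors cancel, leaving the alternating harmonic series.
terms = [expected_term(n) for n in range(1, 6)]
# terms == [1, -1/2, 1/3, -1/4, 1/5] as exact Fractions
```

Exact arithmetic makes the cancellation visible: the enormous payoffs and tiny probabilities combine into the modest terms \(\pm\frac{1}{n}\).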
03

Series Convergence

The series \(1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots\) is the alternating harmonic series, and it converges to \(\ln(2)\). However, it converges only conditionally — the corresponding series of absolute values, the harmonic series, diverges — and rearranging the terms of a conditionally convergent series can change the sum.
04

Re-arranging the Series

Rearranging a conditionally convergent series such as the alternating harmonic series can change its sum. In fact, by Riemann's rearrangement theorem the terms can be reordered so that the partial sums converge to any prescribed real number, or diverge. For instance, taking several positive terms for each negative one drives the accumulated sum above \(\ln(2)\).
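A classical concrete rearrangement takes two positive (odd-denominator) terms for every one negative (even-denominator) term; the rearranged series sums to \(\frac{3}{2}\ln 2\) rather than \(\ln 2\). A short numerical sketch (helper names are illustrative):

```python
import math

def rearranged_sum(blocks):
    """Sum the alternating harmonic series taking two positive terms
    (odd denominators) for every one negative term (even denominator)."""
    total = 0.0
    pos = 1   # next odd denominator to use
    neg = 2   # next even denominator to use
    for _ in range(blocks):
        total += 1 / pos + 1 / (pos + 2) - 1 / neg
        pos += 4
        neg += 2
    return total

# Usual order: partial sums approach ln(2) ~ 0.6931.
usual = sum((-1) ** (n + 1) / n for n in range(1, 300_001))
# Rearranged order: same terms, but the sums approach (3/2) ln(2) ~ 1.0397.
rearranged = rearranged_sum(100_000)
```

Both sums use exactly the same terms, yet they settle on different values — the numerical face of conditional convergence.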
05

Implications for Expected Value

In effect, the "expected value" computed this way can be steered to different numbers by reordering the gains and losses. But an expected value, if it exists, must depend only on the outcomes and their probabilities, not on the order in which they are listed. This is precisely why expectation is defined only when the defining series converges absolutely; here \(\sum_n \frac{1}{2^n}\cdot\frac{2^n}{n} = \sum_n \frac{1}{n}\) diverges, so the expected winnings of this game do not exist.


Key Concepts

These are the key concepts you need to understand to accurately answer the question.

Alternating Harmonic Series
The alternating harmonic series is an infinite series whose terms alternate between positive and negative. The series given in the exercise
  • \[1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots\]
is exactly this series. Each term is the reciprocal of an integer, and the signs alternate with each subsequent term. This alternation of signs is what distinguishes it from the ordinary harmonic series, in which all terms are positive.
Understanding alternating series is important because they behave differently from series of positive terms. Although each term shrinks toward zero, the direction of accumulation flips back and forth, and this affects both whether the series converges and what it converges to. In the coin toss example, this series arises naturally as the candidate value for the expected winnings.
Probability of Outcomes
When dealing with a sequence of probability outcomes, each potential result has a specific probability attached. In the coin toss problem, the probability that the first head appears on the \(n\)-th toss is
  • \[\frac{1}{2^n}\]
This follows from independence: each toss is heads with probability \(\frac{1}{2}\), so tails on the first \(n-1\) tosses followed by heads on the \(n\)-th has probability \(\left(\frac{1}{2}\right)^{n-1}\cdot\frac{1}{2}=\frac{1}{2^n}\).
These probabilities feed directly into the expected value:
  • Odd \(n\) results in a gain of \(\frac{2^n}{n}\)
  • Even \(n\) results in a loss of \(\frac{2^n}{n}\)
Each payoff, weighted by its probability, contributes \(\pm\frac{1}{n}\) to an infinitely extending, alternating series — which is exactly what makes determining the expected value so delicate here.
Infinite Series Convergence
Convergence is the central question for any infinite series. A series converges when its partial sums approach a definite limit; for that to happen, the terms must at least approach zero. The alternating harmonic series
  • \[1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots\]
is a famous convergent example, with sum \(\ln(2)\).
Its convergence follows from the alternating series test: the terms \(\frac{1}{n}\) decrease monotonically to zero, so the partial sums oscillate above and below the limit in ever-narrower swings. The test also gives an error bound: truncating after \(N\) terms leaves an error of at most the first omitted term, \(\frac{1}{N+1}\).
In practice, understanding convergence matters because probabilistic quantities defined by infinite sums — like the expected value here — are only meaningful when those sums are treated with care.
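The alternating series error bound can be checked numerically. A brief illustrative sketch (function name is an assumption, not from the text) verifying that the truncation error never exceeds the first omitted term:

```python
import math

def partial_sum(N):
    """Partial sum of the alternating harmonic series through the term 1/N."""
    return sum((-1) ** (n + 1) / n for n in range(1, N + 1))

# For an alternating series with decreasing terms, the truncation error
# is bounded by the first omitted term: |ln(2) - S_N| <= 1/(N + 1).
for N in (10, 100, 1000):
    err = abs(math.log(2) - partial_sum(N))
    assert err <= 1 / (N + 1)
```

Note how slowly the sum converges: even a thousand terms leave an error on the order of \(10^{-3}\), a hallmark of conditionally convergent series.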
Conditional Convergence
Conditional convergence is the key concept behind this exercise. A conditionally convergent series is one that converges but does not converge absolutely: the series of its absolute values diverges. That is the case here — the alternating harmonic series converges, but the harmonic series \(1 + \frac{1}{2} + \frac{1}{3} + \cdots\) of its absolute values diverges.
What makes conditionally convergent series treacherous is that the order of the terms affects the sum: by Riemann's rearrangement theorem, the terms can be reordered so that the partial sums converge to any real number whatsoever, or diverge. In the coin toss problem, this means that listing the winnings and losses in a different order changes the "expected value" one computes.
This is why expectation is defined only when the series \(\sum_x x\,P(X = x)\) converges absolutely: absolute convergence guarantees a sum that is independent of the ordering, so the expected value is a genuine property of the random variable rather than of any particular listing of its outcomes.
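Riemann's rearrangement idea can be demonstrated with a greedy reordering: add positive terms while the running sum is below a chosen target, negative terms while it is above. A minimal sketch (the function name is an illustrative assumption):

```python
def rearrange_to_target(target, n_terms=200_000):
    """Greedy Riemann rearrangement of the alternating harmonic series:
    draw from the positive terms 1/1, 1/3, 1/5, ... while below target,
    otherwise from the negative terms -1/2, -1/4, ...; the partial sums
    then converge to the target."""
    total = 0.0
    next_pos, next_neg = 1, 2
    for _ in range(n_terms):
        if total < target:
            total += 1 / next_pos
            next_pos += 2
        else:
            total -= 1 / next_neg
            next_neg += 2
    return total

# The same terms, merely reordered, can be steered toward any chosen value.
```

Because the positive terms alone diverge (as do the negative terms alone), the greedy strategy can always cross the target again, and the step sizes shrink to zero — which is exactly why the rearranged sums converge to whatever value was requested.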


