Problem 30


A person tosses a fair coin until a tail appears for the first time. If the tail appears on the \(n\) th flip, the person wins \(2^{n}\) dollars. Let \(X\) denote the player's winnings. Show that \(E[X]=+\infty .\) This problem is known as the St. Petersburg paradox. (a) Would you be willing to pay \(\$ 1\) million to play this game once? (b) Would you be willing to pay \(\$ 1\) million for each game if you could play for as long as you liked and only had to settle up when you stopped playing?

Short Answer

The expected value of the player's winnings in the St. Petersburg paradox is \(\infty\). However, despite the infinite expected value, many people would not be willing to pay $1 million to play the game once or repeatedly, because the most probable outcomes pay far less than the cost, and the small probability of extremely large winnings does not compensate for that risk. If allowed to play for as long as desired, some players may be more willing to pay the cost, although the potential for a significant loss remains a deterrent.

Step by step solution

01

Compute the probability of each outcome

Let's first find the probability that the first tail appears on the \(n\)th flip. Since the coin is fair, each flip shows tails with probability \(0.5\). The first tail lands on flip \(n\) exactly when the first \(n-1\) flips are all heads and the \(n\)th flip is tails, so the probability of this outcome is \((0.5)^{n-1} \times 0.5 = (0.5)^n\). For example, if \(n=3\), the probability is \(0.5^3 = 0.125\), corresponding to the single sequence heads, heads, tails.
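To make this concrete, here is a short Python sketch (the function name `prob_first_tail` is our own, chosen for illustration) that computes the probability just derived and checks that the probabilities over all \(n\) sum to 1, since a tail eventually appears with certainty:

```python
def prob_first_tail(n: int) -> float:
    """Probability that the first tail appears on the nth flip of a fair coin."""
    # n-1 heads in a row (each prob 0.5), then one tail (prob 0.5):
    return 0.5 ** (n - 1) * 0.5  # equals 0.5 ** n

# n = 3 corresponds to the single sequence H, H, T:
print(prob_first_tail(3))  # 0.125

# The outcome probabilities form a geometric series summing to 1:
total = sum(prob_first_tail(n) for n in range(1, 60))
print(round(total, 10))  # 1.0
```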
02

Calculate the winnings corresponding to each outcome

For each outcome, we are given that the person wins \(2^n\) dollars. Given a certain number of flips, \(n\), we can calculate the winnings as \(2^n\) dollars.
03

Compute the expected value of the winnings

The expected value of the player's winnings is the sum of the expected winnings over all outcomes. For each outcome, we multiply the probability of the outcome by the corresponding winnings. Mathematically: \[E[X] = \sum_{n=1}^{\infty} P(\text{tail on }n\text{th flip}) \times (\text{winnings on }n\text{th flip})\] \[E[X] = \sum_{n=1}^{\infty} (0.5)^{n} \times 2^n\] Next, let's simplify the expression above. Each term \(\left(\frac{1}{2}\right)^{n} \times 2^n\) equals \(1\), so \[E[X] = \sum_{n=1}^{\infty} 1\] Since this summation diverges, the expected value of the player's winnings is \(\infty\). Every individual payout is finite, yet the game's expected payout is infinite; the tension between this infinite expectation and what anyone would actually pay to play is the St. Petersburg paradox.
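The divergence is easy to see numerically. The following sketch (the helper `truncated_expectation` is our own name) computes the expected value truncated at \(N\) flips; since each term contributes exactly 1, the partial sum equals \(N\) and grows without bound:

```python
def truncated_expectation(N: int) -> float:
    """Partial sum of E[X] over the first N outcomes of the game."""
    # Each term P(first tail on flip n) * payout = (0.5**n) * (2**n) = 1,
    # so the partial sums grow linearly in N and the series diverges.
    return sum(0.5 ** n * 2 ** n for n in range(1, N + 1))

print(truncated_expectation(10))   # 10.0
print(truncated_expectation(100))  # 100.0
```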
04

Answer questions (a) and (b)

(a) Considering that the expected value of the winnings is infinite, it might seem that paying \(\$1\) million to play the game once is a good idea. However, most people would refuse this offer because the potential loss of \(\$1\) million is significant. Moreover, the most probable outcomes of the game yield winnings far below the cost (half the time the game pays only \(\$2\)), and the small probability of extremely large winnings does not compensate for that risk. (b) If a player had the option to play the game for as long as they liked and only settle up when they stopped playing, they might be more willing to pay \(\$1\) million per game. In this scenario, the chance of eventually hitting a very large payout grows with the number of games played, which could offset the accumulated costs. Even so, people may still be unwilling to take the risk, since the loss could be enormous if they had to stop playing earlier than planned.
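The intuition behind (a) can be illustrated with a small Monte Carlo sketch (the setup below, including the function name `play_once` and the sample size, is our own illustration, not part of the original problem): despite the infinite expectation, the typical single-game payout is tiny.

```python
import random

def play_once(rng: random.Random) -> int:
    """Simulate one round: flip until the first tail, paying 2**n dollars."""
    n = 1
    while rng.random() < 0.5:  # treat < 0.5 as heads: keep flipping
        n += 1
    return 2 ** n              # first tail on flip n pays 2^n dollars

rng = random.Random(0)
winnings = sorted(play_once(rng) for _ in range(100_000))
median = winnings[len(winnings) // 2]
# The median payout is $2 or $4: half of all rounds end on the very
# first or second flip, which is why $1 million feels like a bad price.
print(median)
```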


Key Concepts

These are the key concepts you need to understand to accurately answer the question.

Expected Value
When we talk about the expected value in probability, we're essentially discussing a type of average or mean that is weighted based on all possible outcomes and their probabilities. In the context of the St. Petersburg paradox, the expected value helps us estimate what a player might win on average over a long period.

For the described game, the expected value involves considering each outcome (the first tail appearing on flip \(n\), which happens with probability \((0.5)^n\)) multiplied by its reward \(2^n\). We then sum these products over all possible outcomes from \(n=1\) onward, as given by the infinite series: \[E[X] = \sum_{n=1}^{\infty} (0.5)^{n} \times 2^n\]

If we simplify that series, we find that the expected value for this game turns out to be infinite. This surprises many at first, as it suggests that, theoretically, playing this game endlessly would result in arbitrarily large winnings. However, this expected value doesn't accurately convey a real person's decision-making process.
Infinite Series
An infinite series in mathematics is a sum of an infinite sequence of numbers. In many cases, we can evaluate these series to find a finite sum, or instead, they may diverge (grow infinitely large).

In the St. Petersburg paradox, you're dealing with an infinite series to calculate the expected value, and every term of that series equals 1: \[\sum_{n=1}^{\infty} 1\]

The issue here is that this series doesn't converge to a specific number but rather grows without bound, which is why the expected value comes out to be infinite. In practical cases, such summations usually yield a finite mean or expectation, but in paradoxes like this one, peculiar results arise.

This is a great example of how infinite series can sometimes lead to surprising conclusions, especially within the contexts of gambling problems or other scenarios where infinite possibilities are considered.
Probability Theory
Probability theory is the branch of mathematics that deals with the analysis of random events and is foundational when discussing topics like the expected value and infinite series. It provides the tools necessary to assess and predict the likelihood of different outcomes.

In probability theory, we define the probability of specific outcomes, such as flipping tails with a coin, as well as combinations of these outcomes occurring in sequence. In the game described in the St. Petersburg paradox, the probability that a tail appears for the first time on the \(n\)th flip is the probability of \(n-1\) heads followed by one tail, namely \((0.5)^{n-1} \times 0.5 = (0.5)^n\).
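For small \(n\), this probability can be verified by brute force. The sketch below (an illustrative check; `prob_first_tail_exact` is our own name) enumerates all \(2^n\) equally likely flip sequences and counts those whose first tail lands on flip \(n\):

```python
from itertools import product

def prob_first_tail_exact(n: int) -> float:
    """Exact probability, by enumeration, that the first tail is on flip n."""
    # Count sequences of n flips that are n-1 heads followed by one tail;
    # exactly one such sequence exists, out of 2**n equally likely ones.
    count = sum(
        1 for seq in product("HT", repeat=n)
        if all(c == "H" for c in seq[:-1]) and seq[-1] == "T"
    )
    return count / 2 ** n

print(prob_first_tail_exact(3))  # 0.125, i.e. (1/2)^3
```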

Understanding these probabilities is crucial as they form the basis upon which the expected value and its associated formulas are built. It helps explain why, despite the theoretically infinite expected winnings, human intuition and decision-making might diverge for practical and psychological reasons.
Gambling Paradoxes
Gambling paradoxes, like the St. Petersburg paradox, present intriguing situations where the mathematical solution appears counterintuitive or impractical for real-life decision-making.

The St. Petersburg paradox particularly fascinates economists and philosophers because it challenges traditional theories about rational decision-making. How can it make sense to pay large amounts of money for the opportunity of winning infinite rewards, when the likelihood of actually receiving large payouts in a few tries is slim?

These paradoxes make us explore beyond mere numerical expectations and consider psychological, economic, and practical factors that affect human behavior. They show that reality doesn't always align neatly with mathematical theory, prompting a reassessment of how we perceive risk and reward.


