Chapter 4: Problem 66
For a branching process, calculate \(\pi_{0}\) when (a) \(P_{0}=\frac{1}{4}, P_{2}=\frac{3}{4}\). (b) \(P_{0}=\frac{1}{4}, P_{1}=\frac{1}{2}, P_{2}=\frac{1}{4}\). (c) \(P_{0}=\frac{1}{6}, P_{1}=\frac{1}{2}, P_{3}=\frac{1}{3}\).
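Here \(\pi_{0}\) is the probability that the population eventually dies out. By the standard branching-process result, it is the smallest nonnegative solution of \(\pi_{0}=\sum_{j} P_{j}\,\pi_{0}^{j}\), and iterating the offspring generating function from 0 converges to that root. A numerical sketch (the function name is my own; the offspring distributions are those given in parts (a)–(c)):

```python
def extinction_prob(p, iters=50000):
    """Smallest nonnegative fixed point of the offspring pgf
    phi(x) = sum_j p[j] * x**j, found by iterating x <- phi(x)
    from x = 0 (standard branching-process result)."""
    x = 0.0
    for _ in range(iters):
        x = sum(pj * x**j for j, pj in p.items())
    return x

# (a) x = 1/4 + (3/4)x^2, i.e. 3x^2 - 4x + 1 = 0; smallest root is 1/3.
print(extinction_prob({0: 0.25, 2: 0.75}))           # ~0.3333
# (b) mean offspring = 1 (critical case), so pi_0 = 1; convergence is slow.
print(extinction_prob({0: 0.25, 1: 0.5, 2: 0.25}))   # ~1.0
# (c) x = 1/6 + x/2 + x^3/3, i.e. 2x^3 - 3x + 1 = 0; smallest
#     nonnegative root is (sqrt(3) - 1)/2.
print(extinction_prob({0: 1/6, 1: 0.5, 3: 1/3}))     # ~0.3660
```

The iteration mirrors the generation-by-generation interpretation: after \(k\) steps, \(x\) equals the probability of extinction within \(k\) generations.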
Related problems:
A particle moves among \(n+1\) vertices that are situated on a circle in the following manner. At each step it moves one step either in the clockwise direction with probability \(p\) or the counterclockwise direction with probability \(q=1-p\). Starting at a specified state, call it state 0, let \(T\) be the time of the first return to state \(0\). Find the probability that all states have been visited by time \(T\).
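For the symmetric case \(p=q=1/2\), a gambler's-ruin argument gives the known answer \(1/n\): after the first step, all states are visited before the return iff the walk reaches the far neighbor of 0 before 0 itself. A Monte Carlo sketch checking this for \(n+1=6\) vertices (function name and parameter values are illustrative):

```python
import random

def all_visited_before_return(n_plus_1, p, rng):
    """Run one walk on the vertices 0..n of a cycle; return True if
    every vertex is visited before the first return to vertex 0."""
    pos, visited = 0, {0}
    while True:
        step = 1 if rng.random() < p else -1
        pos = (pos + step) % n_plus_1
        visited.add(pos)
        if pos == 0:
            return len(visited) == n_plus_1

rng = random.Random(1)
trials = 20000
est = sum(all_visited_before_return(6, 0.5, rng) for _ in range(trials)) / trials
print(est)   # close to 1/n = 1/5 for the symmetric walk
```

For general \(p\) the same simulation applies; only the comparison value changes.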
Trials are performed in sequence. If the last two trials were successes, then the next trial is a success with probability \(0.8\); otherwise the next trial is a success with probability \(0.5\). In the long run, what proportion of trials are successes?
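One standard setup is a four-state Markov chain on the outcomes of the last two trials. A sketch (the state labels 'SS', 'SF', etc. are my own; first letter = older trial), which gives a long-run success proportion of \(7/11\approx 0.636\):

```python
# States encode the last two trial outcomes; the next trial succeeds
# with probability 0.8 from 'SS' and 0.5 otherwise.
states = ['SS', 'SF', 'FS', 'FF']

def step_probs(state):
    p = 0.8 if state == 'SS' else 0.5
    last = state[1]                       # most recent outcome shifts left
    return {last + 'S': p, last + 'F': 1 - p}

# Stationary distribution by power iteration.
pi = {s: 0.25 for s in states}
for _ in range(2000):
    new = {s: 0.0 for s in states}
    for s, w in pi.items():
        for t, q in step_probs(s).items():
            new[t] += w * q
    pi = new

# A trial is a success iff the chain is in a state whose last letter is S.
long_run_success = pi['SS'] + pi['FS']
print(long_run_success)   # 7/11 ~ 0.6364
```

Solving the balance equations by hand gives \(\pi_{SS}=5/11\) and \(\pi_{SF}=\pi_{FS}=\pi_{FF}=2/11\), consistent with the iteration.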
Suppose in the gambler's ruin problem that the probability of winning a bet depends on the gambler's present fortune. Specifically, suppose that \(\alpha_{i}\) is the probability that the gambler wins a bet when his or her fortune is \(i\). Given that the gambler's initial fortune is \(i\), let \(P(i)\) denote the probability that the gambler's fortune reaches \(N\) before \(0\). (a) Derive a formula that relates \(P(i)\) to \(P(i-1)\) and \(P(i+1)\). (b) Using the same approach as in the gambler's ruin problem, solve the equation of part (a) for \(P(i)\). (c) Suppose that \(i\) balls are initially in urn 1 and \(N-i\) are in urn 2, and suppose that at each stage one of the \(N\) balls is randomly chosen, taken from whichever urn it is in, and placed in the other urn. Find the probability that the first urn becomes empty before the second.
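Conditioning on the first bet gives the part (a) relation \(P(i)=\alpha_i P(i+1)+(1-\alpha_i)P(i-1)\), and telescoping as in the classical problem expresses \(P(i)\) as a ratio of partial sums. A numerical sketch of that solution, applied to the urn model of part (c), where the "fortune" is the number of balls in urn 1 and \(\alpha_k=(N-k)/N\):

```python
def reach_N_first(N, alpha, i):
    """P(i) = probability the fortune hits N before 0 when the win
    probability at fortune k is alpha(k).  Standard solution of
    P(k) = alpha(k) P(k+1) + (1 - alpha(k)) P(k-1):
        P(i) = sum_{j=0}^{i-1} C_j / sum_{j=0}^{N-1} C_j,
    where C_0 = 1 and C_j = prod_{k=1}^{j} (1 - alpha(k)) / alpha(k)."""
    C, prod = [1.0], 1.0
    for k in range(1, N):
        prod *= (1 - alpha(k)) / alpha(k)
        C.append(prod)
    return sum(C[:i]) / sum(C)

# Part (c): urn 1 empties first iff the fortune hits 0 before N.
N, i = 10, 4
p_urn1_fills = reach_N_first(N, lambda k: (N - k) / N, i)
print(1 - p_urn1_fills)   # probability urn 1 becomes empty before urn 2
```

As a check, for \(N=3\), \(i=1\) the balance equations give \(P(1)=2/5\), which the function reproduces.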
A group of \(n\) processors is arranged in an ordered list. When a job arrives, the first processor in line attempts it; if it is unsuccessful, then the next in line tries it; if it too is unsuccessful, then the next in line tries it, and so on. When the job is successfully processed or after all processors have been unsuccessful, the job leaves the system. At this point we are allowed to reorder the processors, and a new job appears. Suppose that we use the one-closer reordering rule, which moves the processor that was successful one closer to the front of the line by interchanging its position with the one in front of it. If all processors were unsuccessful (or if the processor in the first position was successful), then the ordering remains the same. Suppose that each time processor \(i\) attempts a job then, independently of anything else, it is successful with probability \(p_{i}\). (a) Define an appropriate Markov chain to analyze this model. (b) Show that this Markov chain is time reversible. (c) Find the long-run probabilities.
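Part (b) can be sanity-checked numerically on a small instance: build the chain on orderings for \(n=3\), compute its stationary distribution, and verify detailed balance \(\pi_i P_{ij}=\pi_j P_{ji}\), which is equivalent to time reversibility. A sketch (the \(p_i\) values are illustrative):

```python
from itertools import permutations

# n = 3 processors with illustrative success probabilities p[i].
p = {1: 0.9, 2: 0.5, 3: 0.2}
states = list(permutations(p))           # all orderings of the processors

def transition(order):
    """Distribution over next orderings under the one-closer rule."""
    out = {}
    reach = 1.0                          # prob. all earlier processors failed
    for pos, proc in enumerate(order):
        if pos == 0:
            nxt = order                  # success in front: order unchanged
        else:                            # success at pos: move one closer
            lst = list(order)
            lst[pos - 1], lst[pos] = lst[pos], lst[pos - 1]
            nxt = tuple(lst)
        out[nxt] = out.get(nxt, 0.0) + reach * p[proc]
        reach *= 1 - p[proc]
    out[order] = out.get(order, 0.0) + reach   # all processors failed
    return out

T = {s: transition(s) for s in states}

# Stationary distribution by power iteration.
pi = {s: 1 / len(states) for s in states}
for _ in range(5000):
    new = {s: 0.0 for s in states}
    for s, w in pi.items():
        for t, q in T[s].items():
            new[t] += w * q
    pi = new

# Detailed balance: pi_i P_ij = pi_j P_ji for every pair of orderings.
worst = max(abs(pi[s] * T[s].get(t, 0.0) - pi[t] * T[t].get(s, 0.0))
            for s in states for t in states)
print(worst)   # ~0: the chain satisfies detailed balance
```

The check also suggests the form of the part (c) answer: the computed \(\pi\) is proportional to \(\prod_j \big((1-p_{i_j})/p_{i_j}\big)^{j}\) over positions \(j\).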
It follows from Theorem \(4.2\) that for a time reversible Markov chain $$ P_{i j} P_{j k} P_{k i}=P_{i k} P_{k j} P_{j i}, \quad \text { for all } i, j, k $$ It turns out that if the state space is finite and \(P_{i j}>0\) for all \(i, j\), then the preceding is also a sufficient condition for time reversibility. [That is, in this case, we need only check Equation (4.26) for paths from \(i\) to \(i\) that have only two intermediate states.] Prove this.
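As a sanity check (not the proof itself), any chain whose transition probabilities come from a symmetric weight matrix, \(P_{ij}=w_{ij}/\sum_k w_{ik}\) with \(w_{ij}=w_{ji}\), is time reversible, so the triple-product condition must hold for it. A sketch with an arbitrary 4-state example (the weights are made up):

```python
# Reversible chain from symmetric weights: P_ij = w_ij / sum_k w_ik.
w = [[1, 2, 3, 4],
     [2, 1, 5, 1],
     [3, 5, 2, 2],
     [4, 1, 2, 3]]
n = len(w)
P = [[w[i][j] / sum(w[i]) for j in range(n)] for i in range(n)]

# Check P_ij P_jk P_ki = P_ik P_kj P_ji for all i, j, k.
worst = max(abs(P[i][j] * P[j][k] * P[k][i] - P[i][k] * P[k][j] * P[j][i])
            for i in range(n) for j in range(n) for k in range(n))
print(worst)   # 0 up to floating-point rounding
```

Both triple products reduce to \(w_{ij}w_{jk}w_{ki}/(s_i s_j s_k)\) with \(s_i=\sum_k w_{ik}\), which is symmetric in the direction of traversal.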