# Probability by Conditioning

The theory in this section isn’t new. It’s the old familiar multiplication rule. We are just going to use it in the context of processes indexed by time, with a method that we will call conditioning on early moves.

### Winning a Game of Dice

Suppose Jo and Bo play the following game. Jo rolls a die, then Bo rolls it, then Jo rolls again, and so on, until the first time one of them gets the face with six spots. That person is the winner.

Question. What is the chance that Jo wins?

Answer. Before you do any calculations, notice that the game isn’t symmetric in the two players. Jo has the advantage of going first, and could win on the first roll. So the probability that Jo wins should be greater than half.

To see exactly what it is, notice that there’s a natural recursion or “renewal” in the setup. For Jo to win, we can condition on the first two moves as follows:

• either Jo wins on Roll 1;
• or Jo gets a non-six on Roll 1, then Bo gets a non-six on Roll 2, and then the game starts over and Jo wins.

So at Time 0 (that is, before the dice are rolled), let $x$ be the chance that Jo is the winner. By conditioning on the first two rolls as above, $x$ satisfies the equation

$$
x ~ = ~ \frac{1}{6} ~ + ~ \frac{5}{6} \cdot \frac{5}{6} x
$$

This is easy to solve:

$$
x ~ = ~ \frac{\frac{1}{6}}{1 - \frac{25}{36}} ~ = ~ \frac{6}{11}
$$

which is greater than half, as we had guessed.
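We can check the answer empirically. The sketch below (not from the text; the function name is our own) plays the game many times and estimates Jo's chance of winning, which should come out close to $6/11 \approx 0.545$.

```python
import random

def jo_wins(rng):
    """Play one game; return True if Jo, who rolls first, gets the first six."""
    jos_turn = True
    while True:
        if rng.randrange(6) == 5:   # the current player rolled a six
            return jos_turn
        jos_turn = not jos_turn     # pass the die to the other player

rng = random.Random(20240901)       # seed chosen arbitrarily for reproducibility
trials = 100_000
estimate = sum(jo_wins(rng) for _ in range(trials)) / trials
print(estimate)                     # should be close to 6/11 ≈ 0.545
```

The toggling boolean mirrors the renewal argument: after two non-sixes, the game is back where it started with Jo about to roll.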

### Gambler’s Ruin: Fair Coin

Let $a$ and $b$ be two positive integers. Suppose a gambler starts with $a$ dollars and bets on the tosses of a coin. Every time the coin lands heads, the gambler wins a dollar. Every time it lands tails, the gambler loses a dollar.

Now suppose the gambler has a stopping rule: he will stop once his net gain is $b$ dollars or he has no money left, whichever happens first. If the gambler ends up with no money, he is ruined. Our goal in this example is to find the probability that the gambler is ruined.

At each toss we will keep track of the gambler’s net gain. So he will start out at 0 and stop when he gets to $b$ or $-a$, whichever happens first.
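Before doing any calculation, we can estimate the chance of ruin by simulating this random walk. The sketch below (not from the text; names are our own) starts the net gain at 0 and runs until it hits $b$ or $-a$, here with the illustrative values $a = 2$ and $b = 3$.

```python
import random

def is_ruined(a, b, rng):
    """Run one random walk of the net gain, starting at 0.
    Stop at b (goal reached) or -a (ruin); return True if ruined."""
    net = 0
    while -a < net < b:
        net += 1 if rng.random() < 0.5 else -1   # fair coin: +1 or -1
    return net == -a

rng = random.Random(20240901)   # seed chosen arbitrarily for reproducibility
a, b = 2, 3
trials = 100_000
estimate = sum(is_ruined(a, b, rng) for _ in range(trials)) / trials
print(estimate)
```

Running this gives an empirical estimate of the ruin probability for these values of $a$ and $b$; the exact value is what the rest of the section works out.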