## Expected Waiting Times

Let's find some expectations by conditioning. All of the calculations below involve conditioning on early moves of a random process.

### Waiting till H

A coin lands heads with chance $p$. Let's call it a $p$-coin for short. Let $X$ be the number of tosses of a $p$-coin till the first head appears.

As you have seen in exercises, $X$ has the geometric $(p)$ distribution on $1, 2, 3, \ldots$. By using the tail sum formula for expectation, you have shown that $E(X) = \frac{1}{p}$. Here is a quicker way to arrive at the same result.

The method is based on representing $X$ as a mixture of two other random variables:

• With probability $p$, $X=1$.
• With the remaining probability $q = 1-p$, the first toss is a tail, and then the process starts over independently of what has happened before. That is, with probability $q$, $X = 1 + X^*$ where $X^*$ is an independent copy of $X$.

Therefore, by averaging conditional expectations,

$$E(X) = pE(1) ~ + ~ qE(1+X^*) = p + q(1+E(X^*)) = p + q(1+E(X))$$

Solve for $E(X)$: $$E(X) = \frac{1}{p}$$
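As a quick sanity check on $E(X) = \frac{1}{p}$, we can simulate the waiting time. The function name and the choice $p = 0.3$ below are just for illustration.

```python
import random

random.seed(20)

def time_to_first_head(p):
    """Number of tosses of a p-coin till the first head appears."""
    tosses = 1
    while random.random() >= p:   # each toss is a tail with chance 1 - p
        tosses += 1
    return tosses

p = 0.3
n = 100_000
avg = sum(time_to_first_head(p) for _ in range(n)) / n
print(avg)   # should be close to 1/p = 3.33...
```

The empirical average over many repetitions settles near $1/p$, in agreement with the calculation above.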

### Waiting Till Both Faces Have Appeared

Suppose you toss the $p$-coin until both faces have appeared. Let $N$ be the number of tosses.

Question. What is $E(N)$?

Answer. By conditioning on the first toss:

• With probability $p$ the first toss is a head, so $N = 1 + W_T$ where $W_T$ has the geometric $(q)$ distribution.
• With probability $q$ the first toss is a tail, so $N = 1 + W_H$ where $W_H$ has the geometric $(p)$ distribution.

So

$$E(N) = p\big(1 + \frac{1}{q}\big) + q\big(1 + \frac{1}{p}\big) = 1 + \frac{p^2 + q^2}{pq} = \frac{1 - pq}{pq}$$
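We can confirm $E(N) = \frac{1 - pq}{pq}$ by simulation as well; the function name and the value $p = 0.3$ are illustrative.

```python
import random

random.seed(20)

def time_to_both_faces(p):
    """Number of tosses of a p-coin till both faces have appeared."""
    first_is_head = random.random() < p   # face shown on the first toss
    tosses = 1
    while True:
        tosses += 1
        if (random.random() < p) != first_is_head:
            return tosses   # this toss showed the other face

p, q = 0.3, 0.7
n = 100_000
avg = sum(time_to_both_faces(p) for _ in range(n)) / n
print(avg)   # should be close to (1 - p*q)/(p*q) = 3.76...
```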

### Waiting till HH

In tosses of a $p$-coin, let $W_{HH}$ be the number of tosses till you see two heads in a row.

Question. What is $E(W_{HH})$?

Answer 1. We can find this in several ways. One way is by conditioning on the first two tosses.

• With probability $q$, the first toss is a tail, so $W_{HH} = 1 + W^*$ where $W^*$ is an independent copy of $W_{HH}$.
• With probability $pq$ the first two tosses are HT, and $W_{HH} = 2 + W^{**}$ where $W^{**}$ is an independent copy of $W_{HH}$.
• With probability $p^2$, the first two tosses are heads, and $W_{HH} = 2$.

So if $x = E(W_{HH})$ then $$x = q(1+x) + pq(2+x) + 2p^2$$

So $$x = \frac{q + 2pq + 2p^2}{1 - q - pq} = \frac{1+p}{p^2}$$ by repeatedly using $p + q = 1$.

Answer 2. Another way is by conditioning on $X$, the number of tosses till the first head. This is the same random variable $X$ as in the previous example. Notice that $W_{HH} = X + Y$ where:

• With probability $p$, the toss after $X$ is a head, so $Y = 1$.
• With probability $q$, the toss after $X$ is a tail, so $Y = 1 + W^*$ where $W^*$ is an independent copy of $W_{HH}$.

So if $x = E(W_{HH})$ then $$x = E(X) + E(Y) = \frac{1}{p} + p + q(1 + x)$$ So $$px = \frac{1}{p} + 1 ~~~~ \text{and hence} ~~~~ x = \frac{1+p}{p^2}$$ as before.
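Both answers give $E(W_{HH}) = \frac{1+p}{p^2}$, which we can check by simulation. For a fair coin ($p = 1/2$) the formula gives $6$; the function name below is illustrative.

```python
import random

random.seed(20)

def time_to_hh(p):
    """Number of tosses of a p-coin till two heads appear in a row."""
    tosses, run = 0, 0
    while run < 2:
        tosses += 1
        if random.random() < p:
            run += 1    # extend the current run of heads
        else:
            run = 0     # a tail resets the run
    return tosses

p = 0.5
n = 100_000
avg = sum(time_to_hh(p) for _ in range(n)) / n
print(avg)   # should be close to (1 + p)/p**2 = 6
```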

### Gambler's Ruin: Duration of the Game

Let's return to the setting of the gambler's ruin problem with a fair coin. The gambler starts with $\$a$ and bets on a fair coin till he reaches either $\$b$ or $\$0$. Let $T$ be the duration of the game.

Question. What is $E_a(T)$, the expected duration of the game given that the gambler starts with $\$a$?

Answer. Let $E_k(T)$ denote the expected duration of the game given that the gambler starts with $\$k$. Then for $1 \le k \le b-1$,

$$E_k(T) = 1 + \frac{1}{2}E_{k-1}(T) + \frac{1}{2}E_{k+1}(T)$$

where the edge cases are

$$E_0(T) = 0 = E_b(T)$$

You can check that the function $f(k) = k(b-k)$ satisfies this recursion, since $\frac{1}{2}\big((k-1)(b-k+1) + (k+1)(b-k-1)\big) = k(b-k) - 1$, and it clearly satisfies the edge cases. Hence $E_a(T) = a(b-a)$.
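The answer $E_a(T) = a(b-a)$ can also be checked by simulating the game; the function name and the choices $a = 3$, $b = 10$ are illustrative.

```python
import random

random.seed(20)

def duration(a, b):
    """Duration of a fair-coin gambler's ruin game starting at a,
    ending when the fortune reaches 0 or b."""
    fortune, steps = a, 0
    while 0 < fortune < b:
        fortune += 1 if random.random() < 0.5 else -1
        steps += 1
    return steps

a, b = 3, 10
n = 50_000
avg = sum(duration(a, b) for _ in range(n)) / n
print(avg)   # should be close to a * (b - a) = 21
```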