Returns and First Passage Times

Throughout this section we will assume that $X_0, X_1, X_2, \ldots $ is an irreducible aperiodic Markov Chain with a finite state space. We will start by listing some facts about the number of times this chain visits each state. We won't prove the facts but will indicate why they are true. You should know the facts even if you don't study the sketched justifications.

  • For large $n$, all elements of the $n$-step transition matrix are positive. That is, for sufficiently large $n$, $P_n(i, j) > 0$ for all $i$ and $j$.

To see why this is true, remember that for each pair of states $i$ and $j$, the sequence $P_n(i, j)$ converges to $\pi(j) > 0$ as $n$ gets large. Therefore the terms of the sequence are positive for all sufficiently large $n$. Since the state space is finite, there are only finitely many such sequences, and so beyond some large enough $n$, all the $n$-step transition probabilities are positive. A numerical check of this fact appears after this list.

  • Each state is visited infinitely many times with probability 1. We say that each state is recurrent. As all the states are recurrent, we say for short that the chain is recurrent.

Observe that because the state space is finite, at least one state has to be visited infinitely often. Since all states communicate with each other, it follows that every state must be visited infinitely often.

  • Fix any states $i$ and $j$ (which could be the same). Given that the chain starts at $i$, the expected time till it reaches $j$ is finite. In particular, given that the chain starts at $i$, the expected return time to $i$ is finite. We say that all states are positive recurrent, and hence that the chain is positive recurrent.

To see why this is true, note from above that for any state $i$, there is some time $n^*$ for which $P_{n^*}(i, i)$ is positive. Start the chain at $i$ and watch the chain only at times that are multiples of $n^*$. At these times, the number of moves till the chain returns to $i$ is a geometric random variable with a positive "success" probability $P_{n^*}(i, i)$, and hence it has finite expectation. If you watch the chain at all times, not just at multiples of $n^*$, it will return to $i$ even faster.
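
The first of these facts is easy to check numerically. The sketch below uses a small hypothetical transition matrix (not a chain from this section) and raises it to successive powers until every entry is positive.

import numpy as np

# A hypothetical 3-state irreducible aperiodic chain, for illustration only.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.6, 0.4]])

# Raise P to successive powers until every n-step probability is positive.
P_n = P.copy()
n = 1
while not (P_n > 0).all():
    P_n = P_n @ P
    n += 1

n, P_n    # the first n at which P_n(i, j) > 0 for all i and j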

Expected Long Run Occupation Proportions

We know that the chain has a stationary distribution $\pi$ that is unique and strictly positive. We also know that for every state $j$, the expected long run proportion of time the chain spends at $j$ is $\pi(j)$; that is, $\pi(j)$ is the expected long run proportion of times at which the chain occupies the state $j$.
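
As a quick numerical illustration (a sketch, not part of the text's development), the cell below simulates the small hypothetical 3-state chain from the sketch above and compares the empirical occupation proportions with $\pi$, computed here as the normalized left eigenvector of the transition matrix for eigenvalue 1.

import numpy as np

# The same hypothetical 3-state chain as in the earlier sketch.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.6, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

# Simulate the chain and record the proportion of time spent in each state.
rng = np.random.default_rng(0)
x, visits = 0, np.zeros(3)
steps = 100000
for _ in range(steps):
    x = rng.choice(3, p=P[x])
    visits[x] += 1

pi, visits / steps    # the empirical proportions should be close to pi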

First Passage Times

We will start with hitting times defined as follows. For any state $j$, the first hitting time or the first passage time of $j$ is

$$ W_j = \inf \{ n \ge 1: X_n = j \} $$

That is, $W_j$ is the first time at which the chain reaches state $j$ once it has started running. We will be lazy and call $W_j$ a hitting time instead of a first hitting time, but we will make sure to use "first" in contexts where we are studying repeated hits on $j$.

We have seen examples of these $W$'s before. In the last section we worked with the hitting time till a pattern appeared. You will recall that the pattern was one of the states of a Markov chain that we set up.

The expected hitting time of $j$ given that the chain started at $i$ is $E(W_j \mid X_0 = i)$. This is also called the mean first passage time of $j$ given that the chain started at $i$. We can calculate these using first step analysis, but when $i = j$ there is a simple expression for the expectation in terms of the stationary distribution $\pi$.
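
Before turning to the case $i = j$, note that any of these expectations can be estimated by simulation. Here is a minimal sketch, again with the hypothetical 3-state chain used earlier; the function passage_time is our own, not from any library.

import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.6, 0.4]])
rng = np.random.default_rng(0)

def passage_time(i, j):
    # Run the chain from i and count the steps until it first hits j.
    x, n = i, 0
    while True:
        x = rng.choice(3, p=P[x])
        n += 1
        if x == j:
            return n

# Monte Carlo estimate of E(W_2 | X_0 = 0)
np.mean([passage_time(0, 2) for _ in range(10000)])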

Expected Return Time

In the case where $i = j$, we say that $E(W_i \mid X_0 = i)$ is the expected return time to $i$ given that the chain started at $i$. That is because the definition of $W_i$ only involves times that are at least 1.

It turns out that there is a simple relation between $\pi(i)$ and the expected return time to $i$. We won't prove the relation but it is not hard to believe:

$$ E(W_i \mid X_0 = i) = \frac{1}{\pi(i)} , ~~~~ i \in S $$

Since $\pi(i) > 0$ for all states $i$, the expected return time to each state is finite.

To understand what this result is saying, suppose $\pi(i) = 1/10$. Then you know that in the long run the chain spends $1/10$ of its time at $i$. Since it's at $i$ at one of every 10 steps on average, it makes sense that once the chain is at $i$ you expect it to return to $i$ in 10 steps. This is not a proof by any means, but it confirms that the result is consistent with intuition.

This result is analogous to, but not the same as, the fact that in i.i.d. Bernoulli $(p)$ trials, the expected long run proportion of successes is $p$, and the expected number of trials till the first success is $1/p$.
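
A simulation makes the identity concrete. The sketch below, for the same hypothetical 3-state chain as before, averages many simulated return times to state 0 and compares the average with $1/\pi(0)$.

import numpy as np

P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.6, 0.4]])
rng = np.random.default_rng(0)

def return_time(i):
    # W_i >= 1: take at least one step, then count until the chain is back at i.
    x, n = rng.choice(3, p=P[i]), 1
    while x != i:
        x = rng.choice(3, p=P[x])
        n += 1
    return n

# Stationary distribution via the left eigenvector of P.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

1/pi[0], np.mean([return_time(0) for _ in range(20000)])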

Example: Ehrenfest Chain

Recall that in the Ehrenfest model described in an earlier chapter, there are two containers containing a total of $N$ particles. At each instant, a container is selected at random and a particle is selected at random independently of the container. Then the selected particle is placed in the selected container; if it was already in that container, it stays there.

Let $X_n$ be the number of particles in Container 1 at time $n$. Then $X_0, X_1, \ldots$ is a Markov chain with transition probabilities given by:

$$ P(i, j) = \begin{cases} \frac{N-i}{2N} & \text{if } j = i+1 \\ \frac{1}{2} & \text{if } j = i \\ \frac{i}{2N} & \text{if } j = i-1 \\ 0 & \text{otherwise} \end{cases} $$

We showed that the stationary distribution of this chain is binomial $(N, 1/2)$.

Now our result about expected return times implies that given that Container 1 is empty, the expected amount of time till it again becomes empty is

$$ \frac{1}{\binom{N}{0}(1/2)^N} = 2^N $$

In general for any $0 \le k \le N$, given that Container 1 has $k$ particles in it, the expected amount of time till it once again has $k$ particles is

$$ \frac{1}{\binom{N}{k}(1/2)^N} = \frac{2^N}{\binom{N}{k}} $$

If $N$ is even, then $\binom{N}{k}$ is increasing in $k$ for $0 \le k \le N/2$ and decreasing for larger $k$, so the expected return time is the smallest for $k = N/2$. If $N$ is odd the expected return times are smallest for the two values of $k$ on either side of $N/2$.
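
The arithmetic is easy to carry out directly. Here is a short sketch that tabulates $2^N/\binom{N}{k}$ for $N = 6$, the case used below; math.comb is the standard library binomial coefficient.

from math import comb

N = 6
for k in range(N + 1):
    # Expected return time to the state with k particles in Container 1.
    print(k, 2**N / comb(N, k))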

Expected First Passage Times

To get all the expected first passage times $E(W_j \mid X_0 = i)$, you can set up equations by first step analysis just as we did in the previous section.

For states $i \ne j$,

$$ E(W_j \mid X_0 = i) = 1 + \sum_{k \ne j} P(i, k)E(W_j \mid X_0 = k) $$
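
In matrix form, these equations say $(I - Q)m = \mathbf{1}$, where $Q$ is the transition matrix with the row and column for $j$ deleted and $m$ is the vector of expected first passage times to $j$. Here is a sketch that solves the system directly with numpy; the function mean_first_passage_to is our own name, not a library method.

import numpy as np

def mean_first_passage_to(P, j):
    # Solve (I - Q)m = 1, where Q is P restricted to the states other than j.
    others = [k for k in range(len(P)) if k != j]
    Q = P[np.ix_(others, others)]
    m = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    return dict(zip(others, m))    # E(W_j | X_0 = i) for each i != j

# The Ehrenfest chain with N = 6, as in the example below.
N = 6
P = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    P[i, i] = 1/2
    if i < N:
        P[i, i + 1] = (N - i) / (2*N)
    if i > 0:
        P[i, i - 1] = i / (2*N)

mean_first_passage_to(P, 4)[2]    # about 12.8, matching the table below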

Solving such systems of equations by hand can be tedious. If the transition matrix is numerical, you can use the mean_first_passage_times method to get all of the mean first passage times at once.

Here is an example using the Ehrenfest chain with $N=6$ so that we can see the entire matrix of mean first passage times.

# Standard setup for the prob140 library
from datascience import *
from prob140 import *
import numpy as np

N = 6

s = np.arange(N + 1)    # possible numbers of particles in Container 1

def trans_probs(i, j):
    # Ehrenfest transition probabilities from the formula above
    if j == i:
        return 1/2
    elif j == i + 1:
        return (N - i) / (2*N)
    elif j == i - 1:
        return i / (2*N)
    else:
        return 0

tbl = Table().states(s).transition_function(trans_probs)
ehrenfest = tbl.toMarkovChain()
ehrenfest.mean_first_passage_times()
       0          1          2    3          4          5      6
0   64.0   2.000000   4.800000  9.2  17.600000  40.400000  166.4
1  126.0  10.666667   2.800000  7.2  15.600000  38.400000  164.4
2  148.8  22.800000   4.266667  4.4  12.800000  35.600000  161.6
3  157.2  31.200000   8.400000  3.2   8.400000  31.200000  157.2
4  161.6  35.600000  12.800000  4.4   4.266667  22.800000  148.8
5  164.4  38.400000  15.600000  7.2   2.800000  10.666667  126.0
6  166.4  40.400000  17.600000  9.2   4.800000   2.000000   64.0

The diagonal elements are the expected return times $E(W_i \mid X_0 = i)$ and are the inverses of the stationary binomial (6, 1/2) probabilities.

1/(ehrenfest.steady_state().column('Probability'))
array([ 64.        ,  10.66666667,   4.26666667,   3.2       ,
         4.26666667,  10.66666667,  64.        ])

The $(i, j)$th off-diagonal element is $E(W_j \mid X_0 = i)$. For example, if the chain starts at 2, you expect it to take $12.8$ steps to get to 4. The numerical value comes from solving the equations set up by first step analysis.

Notice the symmetry in the row corresponding to state 3, which is half-way between the empty and full states of Container 1. Starting at this half-way point, you expect the same amount of time (157.2 moves) for Container 1 to become empty as for it to become full.

 
