(i) The chain is irreducible, aperiodic and positive recurrent (since the state space is finite). If π is a stationary distribution,
\begin{array}{l}
π_1=\frac13π_3\\
π_2=\frac12π_2+\frac13π_3\\
π_3=π_1+\frac12π_2+\frac13π_3\\
π_1+π_2+π_3=1
\end{array}
The unique solution is $(π_1, π_2, π_3)=\pmatrix{\frac16&\frac13&\frac12}$
By Convergence Theorem, for any initial distribution, the distribution at time $n$ converges to π as $n→∞$.
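As a sanity check, the balance equations above determine the transition matrix $P$ uniquely (each row must sum to 1): rows $(0,0,1)$, $(0,\frac12,\frac12)$, $(\frac13,\frac13,\frac13)$. A short numeric sketch, not part of the solution itself:

```python
import numpy as np

# Transition matrix read off from the balance equations above
# (row i = distribution of the next state from state i):
P = np.array([[0, 0, 1],
              [0, 1/2, 1/2],
              [1/3, 1/3, 1/3]])
pi = np.array([1/6, 1/3, 1/2])

assert np.allclose(pi @ P, pi)          # pi is stationary

# Convergence Theorem: every row of P^n tends to pi
Pn = np.linalg.matrix_power(P, 200)
assert np.allclose(Pn, np.tile(pi, (3, 1)), atol=1e-9)
print(Pn[0])
```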
(ii) The chain is irreducible, has period 2, and is positive recurrent (since the state space is finite). If π is a stationary distribution,
\begin{array}{l}
π_1=π_2+π_3+π_4\\
π_2=\frac12π_1\\
π_3=\frac13π_1\\
π_4=\frac16π_1\\
π_1+π_2+π_3+π_4=1
\end{array}
The unique solution is $(π_1, π_2, π_3, π_4)=\pmatrix{\frac12&\frac14&\frac16&\frac1{12}}$
If the chain starts in state 1, the distribution at time $2n$ is always $(1\ 0\ 0\ 0)$.
The distribution at time $2n+1$ is then $(1\ 0\ 0\ 0)P$; since every other state moves to state 1 in one step, this equals $\pmatrix{0&\frac12&\frac13&\frac16}$ exactly, for every $n$. So the distribution oscillates with period 2 and has no limit; it converges only in the Cesàro sense, to $π$.
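The balance equations again pin down the transition matrix: row 1 is $(0,\frac12,\frac13,\frac16)$ and rows 2–4 are $(1,0,0,0)$. A numeric sketch of the period-2 behaviour, with the chain started in state 1 as in the text:

```python
import numpy as np

# Transition matrix reconstructed from the balance equations above:
P = np.array([[0, 1/2, 1/3, 1/6],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]])
pi = np.array([1/2, 1/4, 1/6, 1/12])
assert np.allclose(pi @ P, pi)                  # pi is stationary

e1 = np.array([1.0, 0, 0, 0])                   # start in state 1
# even times: back at state 1; odd times: (0, 1/2, 1/3, 1/6)
assert np.allclose(e1 @ np.linalg.matrix_power(P, 2), e1)
odd = e1 @ np.linalg.matrix_power(P, 201)
assert np.allclose(odd, [0, 1/2, 1/3, 1/6])
print(odd)
```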
(iii) The chain is reducible, with communicating classes $\{1,4\},\{2,3\},\{5\}$, of which $\{2,3\}$ and $\{5\}$ are closed. The limiting distribution depends on the initial position: it is a convex combination of $\pmatrix{0&\frac12&\frac12&0&0}$ and $\pmatrix{0&0&0&0&1}$
To find $\lim_{n→∞}p_{12}^{(n)}$, let $h_i$ be the probability of absorption in the class $\{2,3\}$ starting from $i$. Then $h_5=0,h_2=h_3=1,h_1=\frac12h_2+\frac12h_4,h_4=\frac12h_1+\frac12h_5=\frac12h_1⇒h_1=\frac23⇒p_{12}^{(n)}→h_1×\frac12=\frac23×\frac12=\frac13$
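The hitting-probability system is a small linear solve; as a sketch (using only the equations stated above):

```python
import numpy as np

# h_1 = 1/2*h_2 + 1/2*h_4 with h_2 = 1, and h_4 = 1/2*h_1 + 1/2*h_5 with h_5 = 0
A = np.array([[1, -1/2],
              [-1/2, 1]])
b = np.array([1/2, 0])
h1, h4 = np.linalg.solve(A, b)
assert abs(h1 - 2/3) < 1e-12 and abs(h4 - 1/3) < 1e-12
print(h1 * 1/2)   # limiting value of p_12^(n), i.e. 1/3
```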
A fair die is thrown repeatedly. Let $X_n$ denote the sum of the first $n$ throws. Find $\lim _{n→∞} ℙ(X_n$ is a multiple of 11).
State $i$ represents the current sum $\bmod\ 11$. At each throw the value thrown is added to the sum, and the new state depends only on the current state, so the sum mod 11 is a Markov chain with transition matrix
\begin{bmatrix}
0&\frac16&\frac16&\frac16&\frac16&\frac16&\frac16&0&0&0&0\\
0&0&\frac16&\frac16&\frac16&\frac16&\frac16&\frac16&0&0&0\\
0&0&0&\frac16&\frac16&\frac16&\frac16&\frac16&\frac16&0&0\\
0&0&0&0&\frac16&\frac16&\frac16&\frac16&\frac16&\frac16&0\\
0&0&0&0&0&\frac16&\frac16&\frac16&\frac16&\frac16&\frac16\\
\frac16&0&0&0&0&0&\frac16&\frac16&\frac16&\frac16&\frac16\\
\frac16&\frac16&0&0&0&0&0&\frac16&\frac16&\frac16&\frac16\\
\frac16&\frac16&\frac16&0&0&0&0&0&\frac16&\frac16&\frac16\\
\frac16&\frac16&\frac16&\frac16&0&0&0&0&0&\frac16&\frac16\\
\frac16&\frac16&\frac16&\frac16&\frac16&0&0&0&0&0&\frac16\\
\frac16&\frac16&\frac16&\frac16&\frac16&\frac16&0&0&0&0&0
\end{bmatrix}
The chain is irreducible and aperiodic. The uniform distribution is stationary, and by irreducibility it is the unique stationary distribution, so $\lim _{n→∞} ℙ(X_n\text{ is a multiple of 11})=\frac1{11}$.
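The matrix and the limit can be checked by building the chain programmatically (a sketch, not needed for the argument):

```python
import numpy as np

# Transition matrix for the running sum mod 11: from state i,
# a throw of d in {1,...,6} moves to (i + d) mod 11.
P = np.zeros((11, 11))
for i in range(11):
    for d in range(1, 7):
        P[i, (i + d) % 11] += 1/6
assert np.allclose(P.sum(axis=1), 1)

# The uniform distribution is stationary, and P^n converges to it
u = np.full(11, 1/11)
assert np.allclose(u @ P, u)
Pn = np.linalg.matrix_power(P, 100)
assert np.allclose(Pn, np.tile(u, (11, 1)))
print(Pn[0, 0])   # ≈ 1/11
```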
A knight performs a random walk on a chessboard, making each possible move with equal probability at each step. If the knight starts in the bottom left corner, how many moves on average will it take to return there? (The knight's possible moves from two different positions are shown in the picture.)
Let $p_n$ be the probability that the knight is back in the same corner after $n$ steps. Describe the behaviour of $p_n$ as $n→∞$.
Consider the “random walk on a graph” from lectures, in which the equilibrium probability of a vertex is proportional to its degree. Every knight move connects the two opposite corners of a 2×3 or 3×2 block of cells, and each such block contains exactly 2 knight-move edges. The board contains 6×7=42 blocks of each orientation, so the graph has 42×2×2=168 edges and
$$\sum_i d_i=2×168=336$$
At a corner the knight can make only 2 legal moves, so $d_1=2$. Therefore $π_1=\frac{d_1}{\sum_i d_i}=\frac2{336}=\frac1{168}$.
From a black cell the knight always jumps to a white cell, and from a white cell always to a black cell, so the chain has period 2 and $p_{2k+1}=0$ for all $k$.
By the Ergodic Theorem the mean return time to the corner is $\frac1{π_1}=168$ moves. Watched only at even times, the chain on the cells of the corner's colour is an irreducible aperiodic Markov chain, and its stationary probability at the corner is $2π_1$ (each colour class carries total stationary mass $\frac12$, since each carries half the degree sum). One step of this chain is two steps of the original, so $p_{2k}→2π_1=\frac1{84}$ as $k→∞$.
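The degree count can be verified directly (a quick sketch; board coordinates and the knight's move list are the standard ones):

```python
# Degrees of the knight's-move graph on an 8x8 board: check the degree sum,
# the corner degree, and hence the mean return time 1/pi_1.
moves = [(1, 2), (2, 1), (-1, 2), (-2, 1), (1, -2), (2, -1), (-1, -2), (-2, -1)]
deg = {}
for x in range(8):
    for y in range(8):
        deg[(x, y)] = sum(0 <= x+dx < 8 and 0 <= y+dy < 8 for dx, dy in moves)

total = sum(deg.values())
assert total == 336                     # sum of degrees = 2 * (number of edges)
assert deg[(0, 0)] == 2                 # the corner has 2 legal moves
print(total // deg[(0, 0)])             # mean return time to the corner: 168
```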
A frog jumps on an infinite ladder. At each jump, with probability $1-p$ he jumps up
one step, while with probability $p$ he slips off and falls all the way to the bottom.
Represent the frog’s height on the ladder as a Markov chain; show that the stationary
distribution is geometric, and find its parameter. Use the ergodic theorem to obtain a
precise statement about the long-run proportion of time for which the frog is on the
first step above the bottom of the ladder.
If the frog has just fallen to the bottom, on average how many jumps will it take before
he next reaches step $k$?
[Hint: One approach is to consider the mean return time from $k$ to itself.]
The frog’s height is a Markov chain with transition probabilities $p_{i,i+1}=1-p$, $p_{i,0}=p$ for all $i≥0$.
The stationary distribution $(π_0,π_1,…)$ satisfies $π_i=(1-p)π_{i-1}$ for $i>0$, so $π_i=(1-p)^iπ_0$; normalising gives $π_i=p(1-p)^i$, i.e. $π∼\operatorname{Geom}(p)$ supported on $\{0,1,2,…\}$. The chain is irreducible and positive recurrent. Let $V_1(n)$ be the number of visits to step 1 (the first step above the bottom) before time $n$; by the Ergodic Theorem, $\frac{V_1(n)}n$ converges almost surely to $π_1=p(1-p)$: in the long run the frog spends a proportion $p(1-p)$ of its time on the first step. Denote by $k_j^i$ the mean time to go from step $j$ to step $i$. By the law of total expectation (first-step analysis), for $0≤j<i$,\[k_j^i=(1-p)\left(1+k_{j+1}^i\right)+p\left(1+k_0^i\right)⇔(1-p)(k_{j+1}^i-k_0^i)=(k_j^i-k_0^i)-1\]Solving this linear recurrence with the boundary value $k_0^i-k_0^i=0$ at $j=0$, we get, for $0≤j≤i$,\[k_j^i=k_0^i+\frac1p\left(1-\frac1{(1-p)^j}\right)\]
Plugging in $j = i$ and $k_i^i = 0$ yields that $$k_0^i = \frac{1}{(1-p)^ip}-\frac{1}{p}$$
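The closed form can be checked against a direct linear solve of the first-step equations (illustrative values $p=0.3$, $i=6$):

```python
import math
import numpy as np

# Mean time k_0^i to reach step i from the bottom: solve the first-step
# equations m_j = 1 + (1-p)*m_{j+1} + p*m_0 for j = 0,...,i-1 (with m_i = 0)
# and compare with the closed form 1/((1-p)^i p) - 1/p.
p, i = 0.3, 6
A = np.zeros((i, i))
b = np.ones(i)
for j in range(i):
    A[j, j] += 1
    A[j, 0] -= p                    # the p*m_0 term
    if j + 1 < i:
        A[j, j + 1] -= 1 - p        # the (1-p)*m_{j+1} term (m_i = 0)
m = np.linalg.solve(A, b)
closed = 1 / ((1 - p)**i * p) - 1 / p
assert math.isclose(m[0], closed)
print(m[0])
```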
Approach 2: Let $m_i$ be the mean return time from $i$ to itself. The frog has to move first, so $m_i≠0$: it must fall to the bottom (mean $k_i^0$ jumps) and then climb back up to $i$ (mean $k_0^i$ jumps), so $m_i=k_i^0+k_0^i$. The fall time from any step is Geometric, so $k_i^0=\frac1p$, and $m_i=\frac1{π_i}=\frac1{(1-p)^ip}$, whence $k_0^i=\frac1{(1-p)^ip}-\frac1p$ as before.
A professor owns $N$ umbrellas. She walks to work each morning and back from work each evening. On each walk, if she has an umbrella available, she carries it if and only if it is raining. If it is raining and there is no umbrella available, she walks anyway and gets wet. Suppose it is raining independently on each walk with probability $p$. What is the long-run proportion of walks on which she gets wet?
Define a Markov chain to represent the number of umbrellas at home at the end of each day.
Transition probabilities: $p_{0,0}=1-p$, $p_{0,1}=p$, $p_{N,N-1}=p(1-p)$, $p_{N,N}=1-p+p^2$; for $1≤i≤N-1$: $p_{i,i}=p^2+(1-p)^2$, $p_{i,i+1}=p_{i,i-1}=p(1-p)$.
If $π$ is the stationary distribution, then the relations $π_i=\sum_j p_{ji} π_j$ give
\begin{array}{l}
π_0=(1-p)π_1\\
π_i=\frac{π_{i-1}+π_{i+1}}2\quad\text{for }2≤i≤N-1\\
π_N=π_{N-1}
\end{array}
(the balance equation at $i=1$ is $2(1-p)π_1=π_0+(1-p)π_2$ and turns out to hold automatically).
∴ $π_2-π_1=π_3-π_2=⋯=π_N-π_{N-1}$
∴ since $π_N-π_{N-1}=0$, all these differences are $0$, i.e. $π_1=π_2=⋯=π_N$
∴ $π_1=⋯=π_N=\frac{π_0}{1-p}$
∴ $1=π_0+π_1+⋯+π_N=π_0+\frac N{1-p}π_0$
∴ $π_0=\frac{1-p}{N+1-p}$
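As a numeric sanity check (illustrative values $N=5$, $p=0.4$; not part of the solution), one can build the transition matrix above and confirm the stationary distribution:

```python
import numpy as np

# Umbrella chain: number of umbrellas at home at the end of each day.
N, p = 5, 0.4
P = np.zeros((N + 1, N + 1))
P[0, 0], P[0, 1] = 1 - p, p
P[N, N - 1], P[N, N] = p * (1 - p), 1 - p + p**2
for i in range(1, N):
    P[i, i - 1] = P[i, i + 1] = p * (1 - p)
    P[i, i] = p**2 + (1 - p)**2
assert np.allclose(P.sum(axis=1), 1)

pi = np.full(N + 1, 1 / (N + 1))
for _ in range(5000):                   # power iteration (chain is aperiodic)
    pi = pi @ P
assert np.isclose(pi[0], (1 - p) / (N + 1 - p))
assert np.allclose(pi[1:], pi[1])       # pi_1 = ... = pi_N
print(pi[0])
```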
She gets wet on the morning walk with probability $π_0p$ (no umbrella at home, raining) and on the evening walk with probability $π_N(1-p)p$ (all umbrellas at home, dry morning, so none at work, and a wet evening). Since each day consists of two walks, the long-run proportion of walks on which she gets wet is $$\frac{π_0p+π_N(1-p)p}2=pπ_0=\frac{p(1-p)}{N+1-p}$$
Starting from some fixed time, requests at a web server arrive at an average rate of 2 per second, according to a Poisson process. Find:
(a) the probability that the first request arrives within 2 seconds.
(b) the distribution of the number of requests arriving within the first 5 seconds.
(c) the distribution of the arrival time of the $n$th request; give its mean and its variance (these should not require much calculation!).
(d) the approximate probability that more than 7250 requests arrive within the first hour.
(a) $N_2∼\operatorname{Poisson}(4),ℙ(N_2≥1)=1-e^{-4}$
(b) $N_5∼\operatorname{Poisson}(10)$
(c) $T_n∼\operatorname{Gamma}(n,2)$
$T_n=∑_{i=1}^nY_i$ where $Y_i\overset{\text{i.i.d}}∼\operatorname{Exp}(2)$, therefore
$𝔼[T_n]=n𝔼[Y_i]=\frac n2$
$\operatorname{Var}[T_n]=n\operatorname{Var}[Y_i]=\frac n4$
(d) More than 7250 requests arrive within the first hour iff $T_{7251}≤3600$, which differs negligibly from $\{T_{7250}≤3600\}$ at this scale. By (c), $𝔼[T_{7250}]=3625,\operatorname{Var}[T_{7250}]=1812.5$; approximating $T_{7250}$ by a normal distribution,
$$ℙ[T_{7250}≤3600]≈Φ\left(3600-3625\over\sqrt{1812.5}\right)=Φ(-0.58722)=0.2785$$
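The numerical evaluation can be reproduced with the standard-normal CDF written via the error function (a sketch):

```python
import math

# Normal approximation for P(T_7250 <= 3600): Phi(z) = (1 + erf(z/sqrt 2)) / 2
mean, var = 3625, 1812.5
z = (3600 - mean) / math.sqrt(var)
phi = (1 + math.erf(z / math.sqrt(2))) / 2
print(round(z, 5), round(phi, 4))   # z ≈ -0.58722, probability ≈ 0.2785
```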
Arrivals of the Number 2 bus form a Poisson process of rate 2 per hour, and arrivals of the Number 7 bus form a Poisson process of rate 7 buses per hour, independently.
(a) What is the probability that exactly three buses pass by in an hour?
(b) What is the probability that the first Number 2 bus arrives before the first Number 7 bus?
(c) When the maintenance depot goes on strike, each bus breaks down independently with probability half before reaching my stop. In that case, what is the probability that I wait for 30 minutes without seeing a single bus?
(a) Denote the number of Number 2 buses in an hour by $X$ and Number 7 buses by $Y$.
$X∼$Poisson(2), $Y∼$Poisson(7) are independent, then $X+Y∼$Poisson(9)
$$ℙ(X+Y=3)=\frac{9^3}{3!}e^{-9}=0.015$$
(b) $X_1∼$Exp(2), $Y_1∼$Exp(7),
\[ℙ(X_1< Y_1)=\int_0^∞7e^{-7 y}dy\int_0^y2e^{-2 x}dx=\int_0^∞7e^{-7 y}(1-e^{-2y})dy=\frac29\]
(c) Denote the number of Number 2 buses that reach the stop (i.e. do not break down) in 30 minutes (= 1/2 hour) by $X'$ and Number 7 buses by $Y'$.
By thinning $X'∼$Poisson(2/4), $Y'∼$Poisson(7/4)
By superposition $X'+Y'∼$Poisson(9/4)
$$ℙ(X'+Y'=0)=e^{-9/4}$$
Alternatively, by direct calculation, with $λ$ the mean number of buses (of either line) in 30 minutes before thinning:
\begin{align*}
ℙ(\text{see no bus})&=\sum_{n=0}^∞ℙ(n\text{ buses})ℙ(\text{all of them break down})\\
&=\sum_{n=0}^∞\frac{λ^n}{n!}e^{-λ}\left(1\over2\right)^n\\
&=e^{-λ/2}
\end{align*}
With $λ=\frac92$ the probability is $e^{-9/4}$, as before.
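A quick check that the partial sums of the series above converge to $e^{-λ/2}$, with $λ=9/2$:

```python
import math

# Partial sums of sum_n (lam^n / n!) e^{-lam} (1/2)^n, which should equal
# e^{-lam/2}; 60 terms are far more than enough for lam = 4.5.
lam = 9 / 2
s = sum(lam**n / math.factorial(n) * math.exp(-lam) * 0.5**n for n in range(60))
assert math.isclose(s, math.exp(-lam / 2))
print(s)   # e^{-9/4} ≈ 0.1054
```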
Let $N_t$ be a Poisson process of rate $λ$. What is $ℙ\left(N_t=1\right)$ ? What is $ℙ\left(N_s=1∣N_t=1\right)$ for $0< s< t$ ? (Note carefully which properties of the Poisson process you are using.) Hence find the distribution of the time of the first point of the process, conditional on the event that exactly one point occurs in the interval $[0, t]$.
$N_t∼\text{Poisson}(λt),$
$$ℙ\left(N_t=1\right)=λte^{-λt}$$
Because the process has stationary increments (the count of events in an interval depends only on the interval's length),
$$ℙ(N_{(0,s]}=1)=λse^{-λs},\qquad ℙ(N_{(s,t]}=0)=e^{-λ(t-s)}$$
Also the counts of Poisson point-events in disjoint intervals are independent.
$$ℙ(N_{(0,s]}=1∣N_{(0,t]}=1)=\frac{ℙ(N_{(0,s]}=1,\,N_{(s,t]}=0)}{ℙ(N_{(0,t]}=1)}=\frac{λse^{-λs}\,e^{-λ(t-s)}}{λte^{-λt}}=\frac st$$
∴ Conditional on exactly one point occurring in $[0,t]$, the time of the first point has distribution function $\frac st$ on $[0,t]$, i.e. it is Uniform$[0,t]$.
Let $N_t$ be a Poisson process of rate $λ$. Define $X_n=N_n-n$ for $n=0,1,2,…$
Explain why $X_n$ is a Markov chain and give its transition probabilities.
Use the strong law of large numbers to show that the chain is transient if $λ≠1$.
If $λ=1$, is the chain transient? null recurrent? positive recurrent? (Stirling's formula and the criterion for recurrence in terms of the sequence $p_{00}^{(n)}$ may help.)
$$X_{n+1}-X_n=N_{(n,n+1]}-1$$Since the increments $N_{(n,n+1]}∼\operatorname{Poisson}(λ)$ are independent, $X_n$ is a Markov chain with transition probabilities\[ℙ(X_{n+1}=k+r-1∣X_n=k)=e^{-λ}\frac{λ^r}{r!}\quad\text{for }r≥0;\]all other transition probabilities are $0$.
Applying the strong law of large numbers to $X_n=∑_{i=1}^n(N_{(i-1,i]}-1)$,
$$\frac{X_n}n\overset{a.s.}⟶𝔼(N_{(0,1]}-1)=λ-1$$
∴ $X_n\overset{a.s.}⟶∞$ if $λ>1$, and $X_n\overset{a.s.}⟶-∞$ if $λ< 1$, ∴ the chain is transient.
Use the criterion: a state $i$ is recurrent if and only if $\sum_{n=1}^∞p_{ii}^{(n)}=∞$.
\[p_{ii}^{(n)}=ℙ(N_{(i,i+n]}=n)=e^{-λn}\frac{(λn)^n}{n!}\]
If $λ=1$,
\[p_{ii}^{(n)}=\frac{\left(\frac{n}{e}\right)^n}{n!}\]
By Stirling's formula, $n!∼\sqrt{2πn}\left(n\over e\right)^n⇒p_{ii}^{(n)}∼\frac1{\sqrt{2πn}}⇒\sum_{n=1}^∞p_{ii}^{(n)}=∞$, so the chain is recurrent.
For an irreducible, aperiodic, recurrent chain, $p_{00}^{(n)}→\frac1{m_0}$, where $m_0$ is the mean return time to $0$; positive recurrence means $m_0<∞$. Here $p_{00}^{(n)}∼\frac1{\sqrt{2πn}}→0$, so $m_0=∞$ and the chain is null recurrent.
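Both the formula for $p_{00}^{(n)}$ and the Stirling asymptotics can be checked numerically, working in logs to avoid overflow (a sketch):

```python
import math

# p_00^(n) = e^{-n} n^n / n! for lambda = 1, via logs; Stirling gives
# p_00^(n) ~ 1/sqrt(2 pi n), so the ratio below should tend to 1.
def p00(n):
    return math.exp(-n + n * math.log(n) - math.lgamma(n + 1))

for n in (10, 100, 10000):
    ratio = p00(n) * math.sqrt(2 * math.pi * n)
    print(n, ratio)          # ratio -> 1 as n grows
assert abs(p00(10000) * math.sqrt(2 * math.pi * 10000) - 1) < 1e-4
```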
Let $T>1$. We observe a Poisson process of rate 1 on the time interval $(0, T)$. Each time a point occurs, we may decide to stop. Our goal is to stop at the last point which occurs before time $T$; if so, we win, and otherwise - i.e. if we never stop, or if we stop at some time $t$ but another point occurs in $(t, T)$ - we lose.
Find the best strategy you can for playing this game. What is its probability of winning?
Can you show that it's optimal? Feel free to argue informally (but convincingly!).
Strategy: fix a threshold $t<T$ and stop at the first point (if any) occurring after time $t$. This wins iff exactly one point occurs in $(t,T)$:
\[ℙ(\text{strategy wins})=ℙ(\operatorname{Poisson}(T-t)=1)=(T-t)e^{-(T-t)}\]
\[\frac{d}{dt}(T-t)e^{-(T-t)}=0⇒t=T-1\quad(\text{valid since }T>1)\]
Winning probability $=e^{-1}$.
Optimality, informally: if a point occurs at time $s$, stopping there wins with probability $e^{-(T-s)}$ (no further points), which increases with $s$, so an optimal rule is a threshold rule; and stopping at $s$ beats waiting for the next point precisely when $e^{-(T-s)}≥(T-s)e^{-(T-s)}$, i.e. $s≥T-1$.
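A quick numeric check of the optimisation over the threshold $t$ (grid search, with an illustrative $T=3$):

```python
import math

# Maximise f(t) = (T - t) e^{-(T - t)} over a grid of thresholds t in [0, T).
T = 3.0
best_t = max((k * 0.001 for k in range(int(T * 1000))),
             key=lambda t: (T - t) * math.exp(-(T - t)))
best_val = (T - best_t) * math.exp(-(T - best_t))
print(best_t, best_val)   # optimum at t = T - 1 = 2, value 1/e ≈ 0.3679
```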