Probability paper 2020

 
Question 1

  (a)
    (i) Let $V$ be a random variable. Define its moment generating function $M_V$.
    (ii) Derive the moment generating functions of random variables $Y$ and $Z$, where
      • $Y$ is Poisson distributed with parameter $\lambda \in (0, \infty)$,
      • $Z$ is normally distributed with zero mean and variance $\sigma^2 \in (0, \infty)$.
  (b) Consider a random variable $Y$ taking values in the non-negative integers that is independent of a sequence $Z_n$, $n \ge 1$, of independent identically distributed random variables. Let $$ R=\sum_{n=1}^Y Z_n, $$ with the convention that $R=0$ if $Y=0$.
    (i) Show that $M_R(t)=G_Y\left(M_{Z_1}(t)\right)$, where $G_Y(s)=M_Y(\log (s))$.
    (ii) Suppose further that $Z_n$ is normally distributed with zero mean and variance $\sigma^2 \in (0, \infty)$ and that $Y$ is Poisson distributed with parameter $\lambda \in (0, \infty)$. Determine the moment generating function of $R$.
  (c)
    (i) Define the notion of convergence in distribution for real-valued random variables.
    (ii) State the convergence theorem for moment generating functions.
    (iii) Consider a sequence $R_n$, $n \ge 1$, of independent random variables with the same distribution as $R$ in (b)(ii), and let $S_n=R_1+\cdots+R_n$, $n \ge 1$. Show that $S_n / \sqrt{n}$ converges in distribution, and determine the limiting distribution. If you use the Central Limit Theorem, you are expected to prove it.
  (d) Let $Y$ be Poisson distributed with parameter $\lambda \in (0, \infty)$ and let $q \in (0,1)$. Show that there is $c \in (0, \infty)$ such that $\mathbb{P}(Y \ge m) \le c e^{-q m \log (m)}$ for all $m \ge 1$.
Question 2

  (a)
    (i) State, without proof, the Strong Law of Large Numbers.
    (ii) State carefully, without proof, the transformation formula for bivariate probability density functions.
  (b)
    (i) Let $X$ and $Y$ be independent exponentially distributed random variables with parameter $\lambda \in (0, \infty)$. Show that $W=X+Y$ and $U=X/(X+Y)$ are independent, and determine their marginal distributions.
    (ii) Let $Z$ be Gamma distributed with probability density function $z e^{-z}$ for $z>0$. Calculate $\mathbb{P}(Z>z)$ and $\mathbb{E}(1/Z)$.
    (iii) Let $Z$ be as in (ii) and let $X_0$ be independent of $Z$ and exponentially distributed with parameter 1. By evaluating $\mathbb{P}\left(X_0/Z>v\right)$, or otherwise, find the probability density function of $V=X_0/Z$.
  (c) Consider a sequence $\left(X_n, n \ge 0\right)$ of independent exponentially distributed random variables with parameter 1, and let $$ R_n=X_{n-1}/\left(X_n+X_{n+1}\right), \qquad n \ge 1. $$
    (i) For $n \ge 1$, calculate the covariance of $R_n$ and $R_{n+2}$.
    (ii) Show that $$ \frac{1}{n} \sum_{k=1}^n R_k \to 1 \qquad \text{almost surely, as } n \to \infty. $$

Solution

Question 1

(a)(i) $M_V\colon \mathbb{R} \to [0, \infty]$ is given by $M_V(t)=\mathbb{E}\left(e^{t V}\right)$.
(a)(ii) $M_Y(t)=\sum_{n=0}^{\infty} e^{t n} \frac{\lambda^n}{n!} e^{-\lambda}=\exp \left(\lambda\left(e^t-1\right)\right)$ by the exponential series. Also, completing the square in the exponent, $$ \begin{aligned} M_Z(t) &=\int_{-\infty}^{\infty} e^{t z} \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-z^2/2\sigma^2}\,dz \\ &=e^{\sigma^2 t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-\left(z-t\sigma^2\right)^2/2\sigma^2}\,dz=e^{\sigma^2 t^2/2}, \end{aligned} $$ as the pdf of the normal distribution with parameters $t\sigma^2$ and $\sigma^2$ integrates to 1.
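Both closed forms can be sanity-checked numerically. The following is a minimal Monte Carlo sketch (not part of the paper), assuming NumPy and illustrative values $\lambda = 2$, $\sigma = 1.5$, $t = 0.7$:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, sigma, t, n = 2.0, 1.5, 0.7, 10**6   # illustrative parameters

Y = rng.poisson(lam, n)        # Poisson(lam) samples
Z = rng.normal(0.0, sigma, n)  # N(0, sigma^2) samples

# Empirical MGFs vs the closed forms derived above
print(np.exp(t * Y).mean(), np.exp(lam * (np.exp(t) - 1)))  # M_Y(t)
print(np.exp(t * Z).mean(), np.exp(sigma**2 * t**2 / 2))    # M_Z(t)
```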
(b)(i) Conditioning on $Y$, we get $$ \begin{aligned} \mathbb{E}\left(e^{t R}\right) &=\sum_{n=0}^{\infty} \mathbb{E}\left(\exp \left(t \sum_{k=1}^n Z_k\right)\right) \mathbb{P}(Y=n) \\ &=\sum_{n=0}^{\infty}\left(M_{Z_1}(t)\right)^n \mathbb{P}(Y=n)=G_Y\left(M_{Z_1}(t)\right) \end{aligned} $$ by independence of $Z_1, \dots, Z_n$ for each $n \ge 1$, and where $$ G_Y(s)=\mathbb{E}\left(s^Y\right)=\mathbb{E}\left(e^{Y \log (s)}\right)=M_Y(\log (s)). $$
(b)(ii) By (i) and (a)(ii), $M_R(t)=\exp \left(\lambda\left(e^{\sigma^2 t^2/2}-1\right)\right)$.
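This compound-Poisson formula is also easy to confirm by simulation. A sketch assuming NumPy (parameter values illustrative), using that conditionally on $Y=n$ the sum of $n$ independent $N(0,\sigma^2)$ variables is $N(0, n\sigma^2)$:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, sigma, t, n = 2.0, 1.0, 0.5, 10**6   # illustrative parameters

Y = rng.poisson(lam, n)
# Given Y, R = Z_1 + ... + Z_Y is N(0, Y * sigma^2); scale 0 gives R = 0 on {Y = 0}
R = rng.normal(0.0, sigma * np.sqrt(Y))

print(np.exp(t * R).mean())                             # empirical M_R(t)
print(np.exp(lam * (np.exp(sigma**2 * t**2 / 2) - 1)))  # closed form
```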
(c)(i) $W_n \to W$ in distribution if $\mathbb{P}\left(W_n \le w\right) \to \mathbb{P}(W \le w)$ for all $w \in \mathbb{R}$ with $\mathbb{P}(W=w)=0$.
(c)(ii) If there is $t_0>0$ such that $M_{W_n}(t)$, $n \ge 1$, and $M_W(t)$ are finite for all $t \in\left(-t_0, t_0\right)$, and if $M_{W_n}(t) \to M_W(t)$ for all $t \in\left(-t_0, t_0\right)$, then $W_n \to W$ in distribution.
(c)(iii) By (b)(ii) and by independence, we have for all $t \in \mathbb{R}$ $$ \log \left(M_{S_n/\sqrt{n}}(t)\right)=\log \left(\left(M_R(t/\sqrt{n})\right)^n\right)=n\lambda\left(e^{\sigma^2 t^2/2n}-1\right), $$ and a Taylor expansion yields $$ \log \left(M_{S_n/\sqrt{n}}(t)\right)=\lambda\sigma^2 t^2/2+o(1) \to \lambda\sigma^2 t^2/2, $$ which is the logarithm of the moment generating function of the normal distribution with zero mean and variance $\lambda\sigma^2$, by (a)(ii). By the convergence theorem in (ii), we conclude that $S_n/\sqrt{n}$ converges in distribution to this normal distribution.
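The limit can be observed empirically as well. The sketch below (assuming NumPy, with illustrative $\lambda = 2$, $\sigma = 1$) draws many copies of $S_n/\sqrt{n}$ and compares the sample variance with $\lambda\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, sigma, n, reps = 2.0, 1.0, 500, 20000   # illustrative parameters

# reps independent copies of S_n / sqrt(n); given Y, each R is N(0, Y * sigma^2)
Y = rng.poisson(lam, (reps, n))
S = rng.normal(0.0, sigma * np.sqrt(Y)).sum(axis=1) / np.sqrt(n)

print(S.mean(), S.var(), lam * sigma**2)  # mean ~ 0, variance ~ lam * sigma^2
```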
(d) This is new; students have seen bounds for simple random walks based on the same technique. We apply Markov's inequality and the moment generating function of $Y$ from (a)(ii): $$ \mathbb{P}(Y \ge m)=\mathbb{P}\left(e^{t Y} \ge e^{t m}\right) \le e^{-t m+\lambda\left(e^t-1\right)} \qquad \text{for all } t \ge 0. $$ Optimising the exponent over $t \ge 0$, we find the minimal bound when $$ m=\lambda e^t \iff e^t=m/\lambda \iff t=\log (m/\lambda)=\log (m)-\log (\lambda), $$ an admissible choice ($t \ge 0$) once $m \ge \lambda$. Hence $\mathbb{P}(Y \ge m) \le e^{-m \log (m)+m \log (\lambda)+m-\lambda} \le e^{-q m \log (m)}$ for $m$ sufficiently large, since $(1-q) m \log (m)$ eventually dominates $m(\log (\lambda)+1)$, and a suitable factor $c \in[1, \infty)$ covers the finitely many smaller $m$.
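A numerical illustration of the two inequalities, as a sketch using only the standard library ($\lambda = 2$ and $q = 1/2$ are illustrative choices). The optimised Chernoff bound applies for $m \ge \lambda$, while the comparison with $e^{-qm\log m}$ only kicks in once $m$ is large enough:

```python
import math

lam, q = 2.0, 0.5  # illustrative parameters

def poisson_tail(m, lam, terms=400):
    # P(Y >= m), summing the pmf in log space to avoid overflow
    return sum(math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))
               for k in range(m, m + terms))

for m in [2, 5, 10, 20, 40, 60]:
    chernoff = math.exp(-m * math.log(m) + m * math.log(lam) + m - lam)
    target = math.exp(-q * m * math.log(m))
    # First flag holds for all m >= lam; second becomes True for large m
    print(m, poisson_tail(m, lam) <= chernoff, chernoff <= target)
```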
Question 2

(a)(i) Let $X_n$, $n \ge 1$, be independent identically distributed random variables with finite mean $\mu=\mathbb{E}\left(X_1\right)$. Then $\mathbb{P}\left(n^{-1} \sum_{1 \le k \le n} X_k \to \mu\right)=1$.
(a)(ii) Let $D, R \subseteq \mathbb{R}^2$ and let $T\colon D \to R$ be a bijection whose inverse $(u, v) \mapsto (x(u, v), y(u, v))$ is continuously differentiable, with Jacobian $$ J(u, v)=\frac{\partial x}{\partial u} \frac{\partial y}{\partial v}-\frac{\partial x}{\partial v} \frac{\partial y}{\partial u}. $$ If $(X, Y)$ has joint pdf $f_{X, Y}\colon D \to[0, \infty)$, then $(U, V)=T(X, Y)$ is jointly continuous with pdf $f_{U, V}(u, v)=f_{X, Y}(x(u, v), y(u, v))\,|J(u, v)|$, $(u, v) \in R$.
(b)(i) The transformation formula of (a)(ii) applies with $D=(0, \infty)^2$, $R=(0, \infty) \times(0,1)$, $f_{X, Y}(x, y)=\lambda^2 e^{-\lambda(x+y)}$. The inverse of $T(x, y)=(x+y, x/(x+y))$ is $T^{-1}(w, u)=(w u, w(1-u))$ with Jacobian $J(w, u)=-u w-w(1-u)=-w$, so $$ f_{W, U}(w, u)=\lambda^2 w e^{-\lambda w}, \qquad w \in(0, \infty),\ u \in(0,1). $$ This factorises into $f_W(w)=\lambda^2 w e^{-\lambda w}$ and $f_U(u)=1$, i.e. $W$ and $U$ are independent with $W \sim \operatorname{Gamma}(2, \lambda)$ and $U \sim \operatorname{Uniform}(0,1)$.
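A simulation sketch of this, assuming NumPy ($\lambda = 1.5$ is illustrative): the correlation of $W$ and $U$ should vanish (consistent with, though not proving, independence), and the moments should match $\operatorname{Gamma}(2,\lambda)$ and $\operatorname{Uniform}(0,1)$.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n = 1.5, 10**6  # illustrative parameters

X = rng.exponential(1 / lam, n)   # NumPy parametrises by scale = 1/lambda
Y = rng.exponential(1 / lam, n)
W, U = X + Y, X / (X + Y)

print(np.corrcoef(W, U)[0, 1])       # ~ 0, consistent with independence
print(W.mean(), 2 / lam)             # Gamma(2, lambda) mean
print(U.mean(), U.var(), 1/2, 1/12)  # Uniform(0, 1) mean and variance
```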
(b)(ii) $\mathbb{P}(Z>z)=\int_z^{\infty} w e^{-w}\,dw=\left[-w e^{-w}\right]_z^{\infty}+\int_z^{\infty} e^{-w}\,dw=(1+z) e^{-z}$ for $z \ge 0$, and $\mathbb{E}(1/Z)=\int_0^{\infty}(1/z)\, z e^{-z}\,dz=\int_0^{\infty} e^{-z}\,dz=1$.
(b)(iii) $\mathbb{P}\left(X_0/Z>v\right)=\int_0^{\infty} \int_{v z}^{\infty} e^{-x} z e^{-z}\,dx\,dz=\int_0^{\infty} z e^{-z(1+v)}\,dz=(1+v)^{-2}$ for $v>0$, by comparison with the $\operatorname{Gamma}(2,1+v)$ pdf. By differentiation, $V=X_0/Z$ has pdf $f_V(v)=2(1+v)^{-3}$, $v>0$.
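The formulas in (ii) and (iii) can both be checked by simulation; a sketch assuming NumPy, with arbitrary test points $z = 1.7$ and $v = 0.8$:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10**6
Z = rng.gamma(2.0, 1.0, n)     # pdf z e^{-z}, i.e. Gamma(2, 1)
X0 = rng.exponential(1.0, n)

z, v = 1.7, 0.8                              # arbitrary test points
print((Z > z).mean(), (1 + z) * np.exp(-z))  # P(Z > z) vs (1+z)e^{-z}
print((1 / Z).mean())                        # E(1/Z) ~ 1
print((X0 / Z > v).mean(), (1 + v) ** -2)    # P(V > v) vs (1+v)^{-2}
```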
(c)(i) By (b)(i), $X_1+X_2$ has the distribution of $Z$ for $\lambda=1$, and by (b)(ii), $\mathbb{E}(1/Z)=1$. By independence, $$ \mathbb{E}\left(R_1\right)=\mathbb{E}\left(\frac{X_0}{X_1+X_2}\right)=\mathbb{E}\left(X_0\right) \mathbb{E}\left(\frac{1}{Z}\right)=1. $$ By (b)(i), $X_2/\left(X_1+X_2\right)$ has the uniform distribution of $U$, with mean $1/2$. Hence, since $X_0$, $X_2/(X_1+X_2)$ and $1/(X_3+X_4)$ are functions of disjoint subfamilies and therefore independent, $$ \operatorname{Cov}\left(R_1, R_3\right)=\mathbb{E}\left(\frac{X_0 X_2}{\left(X_1+X_2\right)\left(X_3+X_4\right)}\right)-1=\mathbb{E}\left(X_0\right)\, \mathbb{E}(U)\, \mathbb{E}(1/Z)-1=-1/2, $$ and $\operatorname{Cov}\left(R_n, R_{n+2}\right)=-1/2$ for all $n \ge 1$, since $\left(X_{n-1}, \dots, X_{n+3}\right)$ has the same joint distribution as $\left(X_0, \dots, X_4\right)$.
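The value $-1/2$ can be estimated by Monte Carlo; a sketch assuming NumPy (the estimate converges somewhat slowly, as the ratios are heavy-tailed):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10**6
X = rng.exponential(1.0, (5, n))  # rows X_0, ..., X_4; one column per replication

R1 = X[0] / (X[1] + X[2])
R3 = X[2] / (X[3] + X[4])
print(np.cov(R1, R3)[0, 1])       # ~ -1/2
```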
(c)(ii) Note that $R_{3n}$ depends only on $X_{3n-1}, X_{3n}, X_{3n+1}$, so $R_{3n}$, $n \ge 1$, are independent with $\mathbb{E}\left(R_{3n}\right)=1$, by (i). By the SLLN of (a)(i), $$ \frac{1}{n} \sum_{k=1}^n R_{3k} \to 1 \qquad \text{with probability } 1. $$ Similarly, we have $$ \frac{1}{n} \sum_{k=1}^n R_{3k-1} \to 1 \quad \text{and} \quad \frac{1}{n} \sum_{k=1}^n R_{3k-2} \to 1 \qquad \text{with probability } 1. $$ By the algebra of limits for almost sure convergence, we have, with probability 1, $$ \frac{1}{3n} \sum_{k=1}^{3n} R_k=\frac{1}{3n} \sum_{k=1}^n R_{3k-2}+\frac{1}{3n} \sum_{k=1}^n R_{3k-1}+\frac{1}{3n} \sum_{k=1}^n R_{3k} \to 1. $$ A simple sandwiching argument, using that the $R_k$ are non-negative, allows us to replace $3n$ by a general $n$.
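Finally, the almost sure convergence is visible along a single simulated trajectory; a minimal sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10**6
X = rng.exponential(1.0, n + 2)   # X_0, ..., X_{n+1}
R = X[:-2] / (X[1:-1] + X[2:])    # R_k = X_{k-1} / (X_k + X_{k+1}), k = 1..n

avg = np.cumsum(R) / np.arange(1, n + 1)
for k in [10**2, 10**4, 10**6]:
    print(k, avg[k - 1])          # running averages approach 1
```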