Question 1
- (a)(i) Let $V$ be a random variable. Define its moment generating function $M_V$.
- (a)(ii) Derive the moment generating functions of random variables $Y$ and $Z$, where $Y$ is Poisson distributed with parameter $\lambda \in (0, \infty)$ and $Z$ is normally distributed with zero mean and variance $\sigma^2 \in (0, \infty)$.
- (b) Consider a random variable $Y$ taking values in the non-negative integers which is independent of a sequence $Z_n$, $n \geq 1$, of independent identically distributed random variables. Let
$$
R=\sum_{n=1}^Y Z_n
$$
with the convention that $R=0$ if $Y=0$.
- (b)(i) Show that $M_R(t)=G_Y\left(M_{Z_1}(t)\right)$, where $G_Y(s)=M_Y(\log (s))$.
- (b)(ii) Suppose further that the $Z_n$ are normally distributed with zero mean and variance $\sigma^2 \in (0, \infty)$ and that $Y$ is Poisson distributed with parameter $\lambda \in (0, \infty)$. Determine the moment generating function of $R$.
- (c)(i) Define the notion of convergence in distribution for real-valued random variables.
- (c)(ii) State the convergence theorem for moment generating functions.
- (c)(iii) Consider a sequence $R_n$, $n \geq 1$, of independent random variables with the same distribution as $R$ in (b)(ii), and consider $S_n=R_1+\cdots+R_n$, $n \geq 1$. Show that $S_n / \sqrt{n}$ converges in distribution, and determine the limiting distribution. If you use the Central Limit Theorem, you are expected to prove it.
- (c)(iv) Let $Y$ be Poisson distributed with parameter $\lambda \in (0, \infty)$ and let $q \in (0,1)$. Show that there is $c \in (0, \infty)$ such that $\mathbb{P}(Y \geq m) \leq c e^{-q m \log (m)}$ for all $m \geq 1$.
Question 2
- (a)(i) State, without proof, the Strong Law of Large Numbers.
- (a)(ii) State carefully, without proof, the transformation formula for bivariate probability density functions.
- (b)(i) Let $X$ and $Y$ be independent exponentially distributed with parameter $\lambda \in (0, \infty)$. Show that $W=X+Y$ and $U=X /(X+Y)$ are independent, and determine their marginal distributions.
- (b)(ii) Let $Z$ be Gamma distributed with probability density function $z e^{-z}$ for $z>0$. Calculate $\mathbb{P}(Z>z)$ and $\mathbb{E}(1 / Z)$.
- (b)(iii) Let $Z$ be as in (b)(ii) and let $X_0$ be independent exponentially distributed with parameter 1. By evaluating $\mathbb{P}\left(X_0 / Z>v\right)$, or otherwise, find the probability density function of $V=X_0 / Z$.
- (c) Consider a sequence $\left(X_n,\ n \geq 0\right)$ of independent exponentially distributed random variables with parameter 1. Let
$$
R_n=\frac{X_{n-1}}{X_n+X_{n+1}}, \quad n \geq 1.
$$
- (c)(i) For $n \geq 1$, calculate the covariance of $R_n$ and $R_{n+2}$.
- (c)(ii) Show that $$ \frac{1}{n} \sum_{k=1}^n R_k \to 1 \quad \text{almost surely, as } n \to \infty. $$
Solution
Question 1
- (a)(i) $M_V: \mathbb{R} \to [0, \infty]$ is given by $M_V(t)=\mathbb{E}\left(e^{t V}\right)$.
- (a)(ii) $M_Y(t)=\sum_{n=0}^{\infty} e^{t n} \frac{\lambda^n}{n!} e^{-\lambda}=\exp \left(\lambda\left(e^t-1\right)\right)$ by the exponential series. Also,
$$
\begin{aligned}
M_Z(t) &=\int_{-\infty}^{\infty} e^{t z} \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-z^2 / 2 \sigma^2}\, dz \\
&=e^{\sigma^2 t^2 / 2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-\left(z-t \sigma^2\right)^2 / 2 \sigma^2}\, dz=e^{\sigma^2 t^2 / 2},
\end{aligned}
$$
as the pdf of the normal distribution with parameters $t \sigma^2$ and $\sigma^2$ integrates to 1.
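- As a quick consistency check (not asked for in the question), differentiating at $t=0$ recovers the first moments:
$$
M_Y'(0)=\lambda e^{0} \exp\left(\lambda\left(e^{0}-1\right)\right)=\lambda=\mathbb{E}(Y), \qquad M_Z'(0)=0=\mathbb{E}(Z), \quad M_Z''(0)=\sigma^2=\mathbb{E}\left(Z^2\right).
$$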
- (b)(i) Conditioning on $Y$, we get
$$
\begin{aligned}
\mathbb{E}\left(e^{t R}\right) &=\sum_{n=0}^{\infty} \mathbb{E}\left(\exp \left(t \sum_{k=1}^n Z_k\right)\right) \mathbb{P}(Y=n) \\
&=\sum_{n=0}^{\infty}\left(M_{Z_1}(t)\right)^n \mathbb{P}(Y=n)=G_Y\left(M_{Z_1}(t)\right)
\end{aligned}
$$
by independence of $Y$ from the $Z_n$ and by independence of $Z_1, \ldots, Z_n$ for each $n \geq 1$, and where
$$
G_Y(s)=\mathbb{E}\left(s^Y\right)=\mathbb{E}\left(e^{Y \log (s)}\right)=M_Y(\log (s)).
$$
- (b)(ii) By (b)(i) and (a)(ii), $M_R(t)=\exp \left(\lambda\left(e^{\sigma^2 t^2 / 2}-1\right)\right)$.
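- For later reference (a short check, not required by the question), expanding $M_R$ around $t=0$ gives the mean and variance of $R$:
$$
M_R(t)=1+\lambda \sigma^2 t^2 / 2+O(t^4), \quad \text{so } \mathbb{E}(R)=M_R'(0)=0 \text{ and } \operatorname{Var}(R)=M_R''(0)=\lambda \sigma^2,
$$
matching the limiting variance found in (c)(iii) below.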
- (c)(i) $W_n \to W$ in distribution if $\mathbb{P}\left(W_n \leq w\right) \to \mathbb{P}(W \leq w)$ for all $w \in \mathbb{R}$ with $\mathbb{P}(W=w)=0$.
- (c)(ii) If there is $t_0>0$ such that $M_{W_n}(t)$, $n \geq 1$, and $M_W(t)$ are finite for all $t \in\left(-t_0, t_0\right)$, and if $M_{W_n}(t) \to M_W(t)$ for all $t \in\left(-t_0, t_0\right)$, then $W_n \to W$ in distribution.
- (c)(iii) By (b)(ii) and by independence, we have for all $t \in \mathbb{R}$
$$
\log \left(M_{S_n / \sqrt{n}}(t)\right)=\log \left(\left(M_R(t / \sqrt{n})\right)^n\right)=n \lambda\left(e^{\sigma^2 t^2 /(2 n)}-1\right),
$$
and Taylor expansion (spelled out below) yields
$$
\log \left(M_{S_n / \sqrt{n}}(t)\right)=\lambda \sigma^2 t^2 / 2+o(1) \to \lambda \sigma^2 t^2 / 2,
$$
which is the logarithm of the moment generating function of the normal distribution with zero mean and variance $\lambda \sigma^2$, by (a)(ii). By the convergence theorem in (c)(ii), we conclude that $S_n / \sqrt{n}$ converges in distribution to this normal distribution.
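- Explicitly, the Taylor step uses $e^x=1+x+O(x^2)$ as $x \to 0$ with $x=\sigma^2 t^2 /(2 n)$:
$$
n \lambda\left(e^{\sigma^2 t^2 /(2 n)}-1\right)=n \lambda\left(\frac{\sigma^2 t^2}{2 n}+O(n^{-2})\right)=\frac{\lambda \sigma^2 t^2}{2}+O(n^{-1}).
$$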
- (c)(iv) This part is new; students have seen bounds for simple random walks based on the same technique. We apply Markov's inequality and the moment generating function of $Y$ from (a)(ii):
$$
\mathbb{P}(Y \geq m)=\mathbb{P}\left(e^{t Y} \geq e^{t m}\right) \leq e^{-t m+\lambda\left(e^t-1\right)} \quad \text{for all } t \geq 0.
$$
Optimising the exponent over $t \geq 0$, we find the minimal bound when
$$
m=\lambda e^t \Longleftrightarrow e^t=m / \lambda \Longleftrightarrow t=\log (m / \lambda)=\log (m)-\log (\lambda).
$$
Hence $\mathbb{P}(Y \geq m) \leq e^{-m \log (m)+m \log (\lambda)+m-\lambda} \leq e^{-q m \log (m)}$ for $m$ sufficiently large (see below), and a suitable factor $c \in[1, \infty)$ covers the finitely many smaller $m$.
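- To make "sufficiently large" explicit (a small added computation): since $e^{-\lambda} \leq 1$, it suffices that
$$
-m \log (m)+m \log (\lambda)+m \leq -q m \log (m) \Longleftrightarrow (1-q) \log (m) \geq \log (\lambda)+1 \Longleftrightarrow m \geq (e \lambda)^{1 /(1-q)}.
$$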
Question 2
- (a)(i) Let $X_n$, $n \geq 1$, be independent identically distributed with finite mean $\mu=\mathbb{E}\left(X_1\right)$. Then $\mathbb{P}\left(n^{-1} \sum_{1 \leq k \leq n} X_k \to \mu\right)=1$.
- (a)(ii) Let $D, R \subseteq \mathbb{R}^2$ and let $T: D \to R$ be a bijection whose inverse $(u, v) \mapsto (x(u, v), y(u, v))$ is continuously differentiable, with Jacobian
$$
J(u, v)=\frac{\partial x}{\partial u} \frac{\partial y}{\partial v}-\frac{\partial x}{\partial v} \frac{\partial y}{\partial u}.
$$
If $(X, Y)$ has joint pdf $f_{X, Y}: D \to[0, \infty)$, then $(U, V)=T(X, Y)$ is jointly continuous with pdf
$$
f_{U, V}(u, v)=f_{X, Y}(x(u, v), y(u, v))\,|J(u, v)|, \quad (u, v) \in R.
$$
- (b)(i) The transformation formula of (a)(ii) applies with $D=(0, \infty)^2$, $R=(0, \infty) \times(0,1)$ and $f_{X, Y}(x, y)=\lambda^2 e^{-\lambda(x+y)}$. The inverse of $T(x, y)=(x+y, x /(x+y))$ is $T^{-1}(w, u)=(w u, w(1-u))$ with Jacobian $J(w, u)=-u w-w(1-u)=-w$, so
$$
f_{W, U}(w, u)=\lambda^2 w e^{-\lambda w}, \quad w \in(0, \infty),\ u \in(0,1).
$$
This factorises into $f_W(w)=\lambda^2 w e^{-\lambda w}$ and $f_U(u)=1$, i.e. $W$ and $U$ are independent with $W \sim \operatorname{Gamma}(2, \lambda)$ and $U \sim \operatorname{Uniform}(0,1)$.
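- As a quick check that this is a genuine joint pdf (an added verification):
$$
\int_0^1 \int_0^{\infty} \lambda^2 w e^{-\lambda w}\, dw\, du=\int_0^{\infty} \lambda^2 w e^{-\lambda w}\, dw=\frac{\lambda^2\, \Gamma(2)}{\lambda^2}=1.
$$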
- (b)(ii) $\mathbb{P}(Z>z)=\int_z^{\infty} w e^{-w}\, dw=\left[-w e^{-w}\right]_z^{\infty}+\int_z^{\infty} e^{-w}\, dw=e^{-z}(1+z)$, $z \geq 0$. Also $\mathbb{E}(1 / Z)=\int_0^{\infty}(1 / z)\, z e^{-z}\, dz=\int_0^{\infty} e^{-z}\, dz=1$.
- (b)(iii) $\mathbb{P}\left(X_0 / Z>v\right)=\int_0^{\infty} \int_{v z}^{\infty} e^{-x} z e^{-z}\, dx\, dz=\int_0^{\infty} z e^{-z(1+v)}\, dz=(1+v)^{-2}$, $v>0$, by comparison with the $\operatorname{Gamma}(2,1+v)$ pdf. Differentiating $\mathbb{P}(V \leq v)=1-(1+v)^{-2}$, we find that $V=X_0 / Z$ has pdf $f_V(v)=2(1+v)^{-3}$, $v>0$.
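- As a consistency check (added), the mean of $V$ computed from this pdf agrees with independence of $X_0$ and $Z$ together with (b)(ii):
$$
\mathbb{E}(V)=\int_0^{\infty} 2 v(1+v)^{-3}\, dv=\left[-2(1+v)^{-1}+(1+v)^{-2}\right]_0^{\infty}=1=\mathbb{E}\left(X_0\right) \mathbb{E}(1 / Z).
$$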
- (c)(i) By (b)(i), $X_1+X_2$ has the distribution of $Z$ for $\lambda=1$. By (b)(ii), $\mathbb{E}(1 / Z)=1$. By independence,
$$
\mathbb{E}\left(R_1\right)=\mathbb{E}\left(\frac{X_0}{X_1+X_2}\right)=\mathbb{E}\left(X_0\right) \mathbb{E}\left(\frac{1}{Z}\right)=1.
$$
By (b)(i), $X_2 /\left(X_1+X_2\right)$ has the uniform distribution of $U$, with mean $1/2$. Hence, for $n=1$ (and, since the $X_n$ are i.i.d., the same holds for all $n \geq 1$),
$$
\operatorname{Cov}\left(R_n, R_{n+2}\right)=\mathbb{E}\left(\frac{X_0 X_2}{\left(X_1+X_2\right)\left(X_3+X_4\right)}\right)-1=\mathbb{E}\left(X_0\right)\, \mathbb{E}(U)\, \mathbb{E}(1 / Z)-1=-1/2,
$$
where the expectation factorises as spelled out below.
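- In detail (an added step), the factorisation uses that the blocks $X_0$, $\left(X_1, X_2\right)$ and $\left(X_3, X_4\right)$ are independent:
$$
\mathbb{E}\left(\frac{X_0 X_2}{\left(X_1+X_2\right)\left(X_3+X_4\right)}\right)=\mathbb{E}\left(X_0\right) \mathbb{E}\left(\frac{X_2}{X_1+X_2}\right) \mathbb{E}\left(\frac{1}{X_3+X_4}\right)=1 \cdot \frac{1}{2} \cdot 1=\frac{1}{2}.
$$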
- (c)(ii) Note that $R_{3 n}$ depends only on $X_{3 n-1}, X_{3 n}, X_{3 n+1}$, so $R_{3 n}$, $n \geq 1$, are independent and identically distributed with $\mathbb{E}\left(R_{3 n}\right)=1$, by (c)(i). By the SLLN of (a)(i),
$$
\frac{1}{n} \sum_{k=1}^n R_{3 k} \to 1 \quad \text{with probability } 1.
$$
Similarly, we have
$$
\frac{1}{n} \sum_{k=1}^n R_{3 k-1} \to 1 \quad \text{and} \quad \frac{1}{n} \sum_{k=1}^n R_{3 k-2} \to 1 \quad \text{with probability } 1.
$$
By the Algebra of Limits for almost sure convergence, we have, with probability 1,
$$
\frac{1}{3 n} \sum_{k=1}^{3 n} R_k=\frac{1}{3 n} \sum_{k=1}^n R_{3 k-2}+\frac{1}{3 n} \sum_{k=1}^n R_{3 k-1}+\frac{1}{3 n} \sum_{k=1}^n R_{3 k} \to 1.
$$
A simple sandwiching argument, spelled out below, allows us to replace $3 n$ by a general index $n$.
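- For completeness (spelling out the sandwich, using $R_k \geq 0$): for $3 n \leq m<3(n+1)$,
$$
\frac{3 n}{m} \cdot \frac{1}{3 n} \sum_{k=1}^{3 n} R_k \leq \frac{1}{m} \sum_{k=1}^m R_k \leq \frac{3(n+1)}{m} \cdot \frac{1}{3(n+1)} \sum_{k=1}^{3(n+1)} R_k,
$$
and since $3 n / m \to 1$ and $3(n+1) / m \to 1$ as $m \to \infty$, both bounds converge to 1 almost surely.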