Assignment 2
Assigned 2024-01-24, due 2024-01-31 at noon on Gradescope
Question 1
Let $X, Y$ be two random variables.
- Show by example that if both $X$ and $Y$ are normally distributed and uncorrelated, they need not be independent.
- Show that if $(X, Y)$ is jointly normal and $X$ and $Y$ are uncorrelated, then $X$ and $Y$ are independent.
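If you want a numerical sanity check of a counterexample for the first part, here is one possible simulation (this uses the classical sign-flip construction as an illustrative assumption; a simulation is of course not a proof):

```python
import numpy as np

# Sanity check (not a proof) of the sign-flip construction:
# X ~ N(0,1), S = +/-1 with probability 1/2 each, independent of X, and Y = S*X.
# Both X and Y are standard normal and cov(X, Y) = 0, yet |Y| = |X| always,
# so X and Y are certainly not independent.
rng = np.random.default_rng(0)
n = 200_000
X = rng.standard_normal(n)
S = rng.choice([-1.0, 1.0], size=n)
Y = S * X

print("cov(X, Y) ~", np.cov(X, Y)[0, 1])                  # should be near 0
print("corr(|X|, |Y|):", np.corrcoef(np.abs(X), np.abs(Y))[0, 1])  # exactly 1
```

The point of the second print is that $|Y| = |X|$ holds pointwise, which rules out independence despite the vanishing covariance.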
Question 2
- Suppose $X_n$ is a sequence of identically distributed random variables which are not necessarily independent. Suppose further $\E |X_1| < \infty$, $\E |X_1|^2 < \infty$, and $$ \sum_{m = 1}^\infty \sum_{n = 1}^{m-1} |\operatorname{cov}(X_m, X_n)| < \infty . $$ Show that for every $\epsilon > 0$, \begin{equation} \lim_{N \to \infty} \P \Bigl( \Bigl| \frac{1}{N} \sum_{n=1}^N X_n - \E X_1 \Bigr| > \epsilon \Bigr) = 0 \,. \end{equation} If possible, also find the rate at which the above probability converges to $0$ as $N \to \infty$.
Hint: In your first probability class you should have seen a proof of the weak law of large numbers assuming the variance is finite. If you’re having trouble with this question, remind yourself of that proof (there’s one on Wikipedia, in Feller, or any standard reference). Then see if you can adapt it to this situation.
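To build intuition for the conclusion, here is a quick Monte Carlo illustration in the i.i.d. special case (where the double covariance sum is zero, so the hypothesis holds trivially); the choice of exponential variables and of $\epsilon$ is arbitrary:

```python
import numpy as np

# Empirical frequencies of the event |S_N/N - E X_1| > eps for increasing N,
# in the i.i.d. special case.  Chebyshev's bound suggests decay at rate O(1/N);
# the printed values should shrink accordingly.
rng = np.random.default_rng(1)
eps, trials = 0.1, 20_000
probs = []
for N in (10, 100, 1000):
    # Exponential(1) samples, so E X_1 = 1.
    means = rng.exponential(scale=1.0, size=(trials, N)).mean(axis=1)
    p_hat = float(np.mean(np.abs(means - 1.0) > eps))
    probs.append(p_hat)
    print(N, p_hat)
```

Comparing the printed frequencies for successive $N$ against the Chebyshev bound $\operatorname{var}(X_1)/(N\epsilon^2)$ is a reasonable way to guess the rate asked for above.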
- Suppose now $X_n$ is an independent, identically distributed sequence of random variables with $\E \abs{X_1} < \infty$. (The variance of $X_1$, however, may be infinite.) Let $S_N = \sum_{n=1}^N X_n$. Must $S_N / N$ converge in distribution? If yes, prove it. If no, find a counterexample.
Hint: Remember convergence in distribution is equivalent to pointwise convergence of the characteristic functions.
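The hint can be made concrete. By independence and identical distribution, the characteristic function of $S_N/N$ factors:

```latex
% Characteristic function of S_N / N, using independence and
% identical distribution of the X_n:
\varphi_{S_N/N}(t)
  = \E e^{i t S_N / N}
  = \prod_{n=1}^N \E e^{i (t/N) X_n}
  = \bigl( \varphi_{X_1}(t/N) \bigr)^N .
```

So the question reduces to understanding the behavior of $\varphi_{X_1}$ near $0$.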
Question 3
Let $W$ be a Brownian motion and let $0 \leq s < t$. Show that $(W_s, W_t)$ is jointly normal, and find its covariance matrix.
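Once you have computed the covariance matrix, you can check it numerically. The sketch below simulates $(W_s, W_t)$ directly from the independent-increments definition of Brownian motion (the particular values $s = 0.5$, $t = 2$ are arbitrary):

```python
import numpy as np

# Monte Carlo check for the covariance matrix of (W_s, W_t).
# By independent increments, W_s = sqrt(s) Z1 and W_t = W_s + sqrt(t - s) Z2
# with Z1, Z2 independent standard normals.
rng = np.random.default_rng(2)
s, t, n = 0.5, 2.0, 200_000
Z1, Z2 = rng.standard_normal(n), rng.standard_normal(n)
Ws = np.sqrt(s) * Z1
Wt = Ws + np.sqrt(t - s) * Z2
print(np.cov(Ws, Wt))  # compare with your closed-form answer
```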
Question 4
- (Chebyshev's inequality) For any $p, \lambda > 0$, prove $\P( X > \lambda ) \leq \E (\abs{X}^p) / \lambda^p$.
Hint: For $p = 1$, verify and use the fact that $\lambda \one_{\set{X > \lambda}} \leq \abs{X}$.
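A quick empirical check of the inequality (for a standard normal sample and a few arbitrary $(p, \lambda)$ pairs; the hint's pointwise bound $\lambda^p \one_{\set{X > \lambda}} \leq \abs{X}^p$ in fact makes the sample version hold exactly, not just approximately):

```python
import numpy as np

# Check P(X > lam) <= E|X|^p / lam^p on a large standard normal sample.
rng = np.random.default_rng(3)
X = rng.standard_normal(500_000)
for p in (1, 2, 4):
    for lam in (0.5, 1.0, 2.0):
        lhs = np.mean(X > lam)                 # empirical P(X > lam)
        rhs = np.mean(np.abs(X) ** p) / lam ** p  # empirical E|X|^p / lam^p
        assert lhs <= rhs
        print(f"p={p}, lam={lam}: {lhs:.4f} <= {rhs:.4f}")
```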
- (Jensen's inequality) If $\varphi: \R \to \R$ is a convex function, $t \geq 0$, and $X$ is a random variable, show that $\varphi( \E_t X ) \leq \E_t \varphi(X)$.
Hint: Use the fact that convex functions always lie above their tangent lines. Namely, for any $a, x \in \R$, we have $\varphi(a) + (x - a)\varphi'(a) \leq \varphi(x)$. If this hint isn’t sufficient, this should be done in most standard references.
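A numerical illustration of the unconditional case (taking $\E_t$ to be ordinary expectation, e.g. $t = 0$; the convex function $\varphi(x) = e^x$ and the normal distribution here are arbitrary choices):

```python
import numpy as np

# Unconditional Jensen: phi(E X) <= E phi(X) for convex phi(x) = exp(x)
# and X standard normal.  The gap here is substantial: exp(E X) is about
# exp(0) = 1, while E exp(X) is about exp(1/2) for a standard normal.
rng = np.random.default_rng(4)
X = rng.standard_normal(1_000_000)
lhs = np.exp(X.mean())   # phi(E X)
rhs = np.exp(X).mean()   # E phi(X)
print(lhs, rhs)
assert lhs <= rhs
```

For the conditional version asked for in the problem, the same tangent-line bound from the hint can be applied under $\E_t$.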