Assignment 3

Assigned 2024-01-31, due 2024-02-07 at noon on Gradescope

Question 1

  1. If $X$ is a continuous random variable with density~$p$, we know $\E X = \int_{-\infty}^\infty x p(x) \, dx$. If $X$ is also nonnegative, use the above formula to derive the layer cake formula \begin{equation} \E X = \int_0^\infty \P(X \geq t) \, dt\,, \end{equation} in this special case.

  2. Let $X$ be a nonnegative random variable (which may or may not have a density), and let $\varphi$ be a differentiable, nonnegative, increasing function with $\varphi(0) = 0$. Use the layer cake formula to show that \begin{equation*} \E \varphi(X) = \int_0^\infty \varphi'(t) \, \P(X \geq t) \, dt\,. \end{equation*}

  3. Is this formula still valid if $\varphi(0) \neq 0$?
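Before proving the identity in part 1, a quick Monte Carlo sanity check can be reassuring: for $X \sim \mathrm{Exp}(1)$ both sides should be close to $1$. A minimal sketch (the grid cutoff at $t = 20$ and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Exp(1), for which E[X] = 1 and P(X >= t) = e^{-t}.
samples = np.sort(rng.exponential(scale=1.0, size=200_000))

# Left-hand side: the ordinary sample mean.
lhs = samples.mean()

# Right-hand side: trapezoid rule applied to the empirical tail
# probability P(X >= t); searchsorted counts samples below each t.
t = np.linspace(0.0, 20.0, 2001)
tail = 1.0 - np.searchsorted(samples, t, side="left") / samples.size
dt = t[1] - t[0]
rhs = ((tail[:-1] + tail[1:]) / 2.0).sum() * dt

print(lhs, rhs)  # both should be close to 1
```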

Question 2

  1. If $s < t$, compute $\E_s W_t^4$. (Here $W$ is a standard Brownian motion, and $\E_s$ denotes the conditional expectation given the information available at time~$s$.)

  2. Given $\lambda \in \R$, find $\alpha$ so that the process $M_t = \exp(\lambda W_t - \alpha t)$ is a martingale.
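For part 1, the Gaussian moment formulas $\E W_t^2 = t$ and $\E W_t^4 = 3t^2$ (using $W_t \sim N(0, t)$) are useful; they can be checked numerically with a small sketch, where the value $t = 2$ is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# W_t ~ N(0, t) for a standard Brownian motion; t = 2 is arbitrary.
t = 2.0
w = rng.normal(loc=0.0, scale=np.sqrt(t), size=1_000_000)

second = (w**2).mean()  # should be close to t = 2
fourth = (w**4).mean()  # should be close to 3 t^2 = 12

print(second, fourth)
```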

Question 3

True or false: If $M$ is a martingale and $s < t$, then $\E(M_t - M_s)^2 = \E M_t^2 - \E M_s^2$. Prove it, or find a counterexample.
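A simulation cannot settle a true/false question, but it can suggest which way to lean before you attempt a proof. A sketch using a simple symmetric random walk as the martingale (the times $s = 5$, $t = 12$ and the number of paths are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simple symmetric random walk M_n = X_1 + ... + X_n with X_i = +/-1,
# a discrete-time martingale with E[M_n^2] = n.
n_paths, s, t = 500_000, 5, 12
steps = rng.choice([-1, 1], size=(n_paths, t))
M = steps.cumsum(axis=1)
Ms, Mt = M[:, s - 1], M[:, t - 1]

lhs = ((Mt - Ms) ** 2).mean()
rhs = (Mt**2).mean() - (Ms**2).mean()

print(lhs, rhs)  # compare the two sides for this one example
```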

Question 4

Let $Y$ be a standard normal random variable, and let $K \in \R$.

  1. For any $x \in \R$ let $g(x) \defeq \E\paren{ (e^{x+Y}-K)^+ }$. Express $g$ explicitly in terms of the cumulative normal distribution function \begin{equation*} N(d) \defeq \frac{1}{\sqrt{2\pi}} \int_{-\infty}^d e^{-\frac12\xi^2}\,d\xi, \end{equation*} for two different values of $d$.

    [Note: Your answer will look something like the Black-Scholes formula.]

  2. Suppose now $X$ is another standard normal random variable that is independent of $Y$. Compute $\E \paren[\big]{ (e^{X+Y} - K)^+ \given[\big] X }(\omega)$.

    [Even though the variable $\omega$ is usually suppressed from all formulae, include it explicitly in this problem for clarity. Here $\E \paren{ (e^{X+Y} - K)^+ \given X }$ is shorthand for $\E \paren{ (e^{X+Y} - K)^+ \given \sigma(X) }$, where $\sigma(X)$ is the $\sigma$-algebra generated by all events that can be observed using only the random variable $X$. As a result, when computing this conditional expectation, you can treat $Y$ as independent of $\sigma(X)$ and $X$ as measurable with respect to $\sigma(X)$.]
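
Once you have a closed form for $g$ in part 1, you can check it against a Monte Carlo estimate of the defining expectation. A minimal sketch (the test point $x = 0$, $K = 1$ and the sample size are arbitrary choices, and `g_mc` is a hypothetical helper name):

```python
import numpy as np

rng = np.random.default_rng(3)

def g_mc(x, k, n=2_000_000):
    """Monte Carlo estimate of g(x) = E[(e^(x+Y) - K)^+] with Y ~ N(0, 1)."""
    y = rng.normal(size=n)
    return np.maximum(np.exp(x + y) - k, 0.0).mean()

# Compare this estimate against your closed-form answer at the same (x, K).
print(g_mc(0.0, 1.0))
```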