Assignment 5
In light of your midterm on 2024-02-21, this homework is not due. Many of the problems cover material on the midterm and are good practice. Some of the problems will be on your regular homework (due 2024-02-28).
Question 1
Let $0 = t_0 < t_1 < t_2 < \cdots$ be an increasing sequence of times, and $D$ be an adapted process. Given $T \geq 0$, let $n \in \N$ be the unique number such that $T \in [t_n, t_{n+1})$, and define \begin{equation} I_P(T) = \sum_{k =0}^{n-1} D_{t_k} (W_{t_{k+1}} - W_{t_k}) + D_{t_n} (W_T - W_{t_n}) \,. \end{equation} Show that $I_P$ is a martingale.
[Note: We showed in class that if $s = t_m < t_n = t$ then $\E_s I_P(t) = I_P(s)$. For this question you need to show $\E_s I_P(t) = I_P(s)$ without assuming $s = t_m$ and $t=t_n$.]
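[Note: the following sketch may help make the construction concrete. It simulates $W$ at the partition points, takes the illustrative (and adapted) choice $D_{t_k} = W_{t_k}$, and checks numerically that $\E I_P(T) \approx 0 = I_P(0)$. The partition, the horizon $T$, and the sample size are assumptions made purely for illustration.]

```python
import numpy as np

rng = np.random.default_rng(0)

t = np.array([0.0, 0.3, 0.7, 1.1, 1.6, 2.2])   # partition times t_0 < t_1 < ...
T = 1.9                                         # horizon; here T lies in [t_4, t_5)
n = np.searchsorted(t, T, side="right") - 1     # the unique n with t_n <= T < t_{n+1}

N = 200_000                                     # number of simulated paths

# Sample W at t_0, ..., t_n and at T by summing independent Gaussian increments.
times = np.concatenate([t[: n + 1], [T]])
dW = rng.normal(0.0, np.sqrt(np.diff(times)), size=(N, len(times) - 1))
W = np.concatenate([np.zeros((N, 1)), np.cumsum(dW, axis=1)], axis=1)

D = W[:, :-1]   # illustrative adapted integrand: D_{t_k} = W_{t_k}
# I_P(T) = sum_{k<n} D_{t_k}(W_{t_{k+1}} - W_{t_k}) + D_{t_n}(W_T - W_{t_n})
I_P_T = np.sum(D * np.diff(W, axis=1), axis=1)

print("sample mean of I_P(T):", I_P_T.mean())   # should be near 0 = I_P(0)
```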
Question 2
Express the process $X_t = e^{-t W_t^2}$ as the sum of a martingale and a process of finite first variation.
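[Note: one natural tool here is the Itô formula for a smooth function of $t$ and $W_t$: for $f \in C^{1,2}$, \begin{equation*} df(t, W_t) = \partial_t f(t, W_t) \, dt + \partial_x f(t, W_t) \, dW_t + \frac12 \partial_x^2 f(t, W_t) \, dt \,. \end{equation*} The $dW$ term integrates to a martingale (modulo integrability), and the $dt$ terms integrate to a process of finite first variation.]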
Question 3
Let $\displaystyle M_t = \int_0^t s W_s \, ds$. Find $\displaystyle \E \paren[\big]{M_t^2 - \qv{M}_t}$.
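[Note: if you want to check your answer numerically, here is a rough Monte Carlo sketch. The horizon $t = 1$, the grid, and the sample size are illustrative assumptions; $\qv{M}_t$ is approximated by summing squared increments of $M$ along the grid.]

```python
import numpy as np

rng = np.random.default_rng(1)

t_final, m, N = 1.0, 1_000, 5_000       # horizon, grid size, paths (all illustrative)
dt = t_final / m
s = np.linspace(0.0, t_final, m + 1)

dW = rng.normal(0.0, np.sqrt(dt), size=(N, m))
W = np.concatenate([np.zeros((N, 1)), np.cumsum(dW, axis=1)], axis=1)

# M_t = \int_0^t s W_s ds, approximated by a left-endpoint Riemann sum on the grid.
M = np.concatenate([np.zeros((N, 1)), np.cumsum(s[:-1] * W[:, :-1] * dt, axis=1)], axis=1)

# Quadratic variation approximated by summing squared increments of M along the grid;
# since M has finite first variation, this should shrink as the grid is refined.
QV = np.sum(np.diff(M, axis=1) ** 2, axis=1)

print("E[ M_t^2 ]         ~", (M[:, -1] ** 2).mean())
print("E[ <M>_t ]         ~", QV.mean())
print("E[ M_t^2 - <M>_t ] ~", (M[:, -1] ** 2 - QV).mean())
```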
Question 4
Suppose $\sigma_t$ is a non-random function. Find the distribution of the process $\displaystyle X_t = \int_0^t \sigma_s \, dW_s$.
[Hint: Compute the characteristic function.]
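[Note: whatever distribution you conjecture, you can test it numerically. In the sketch below the particular choice $\sigma_s = \cos s$, the horizon, the grid, and the sample size are purely illustrative assumptions.]

```python
import numpy as np

rng = np.random.default_rng(2)

sigma = np.cos                          # a concrete non-random sigma_s, for illustration only
t_final, m, N = 2.0, 500, 20_000        # horizon, grid size, paths (all illustrative)
dt = t_final / m
s = np.linspace(0.0, t_final, m + 1)

# X_t = \int_0^t sigma_s dW_s, approximated by sum_k sigma_{s_k} (W_{s_{k+1}} - W_{s_k}).
dW = rng.normal(0.0, np.sqrt(dt), size=(N, m))
X_t = (sigma(s[:-1]) * dW).sum(axis=1)

print("sample mean           :", X_t.mean())
print("sample variance       :", X_t.var())
print("sample excess kurtosis:", ((X_t - X_t.mean()) ** 4).mean() / X_t.var() ** 2 - 3)
# Compare these with the corresponding quantities for the distribution you conjecture.
```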
Question 5
Suppose $b_t$ is a non-random function. Find the distribution of the process $\displaystyle B_t = \int_0^t b_s \, W_s \, ds$.
[Hint: Don’t compute the characteristic function.]
Question 6
- Suppose $(X_1, X_2)$ is jointly Gaussian with $\E X_i = 0$, $\E X_i^2 = \sigma_i^2$, and $\E X_1 X_2 = \rho$. Find $\E \paren{ X_1 \given X_2}$ (recall from your previous homework that $\E\paren{X_1 \given X_2}$ is shorthand for $\E\paren{X_1 \given \sigma(X_2)}$). Express your answer in the form $g(X_2)$, where $g$ is some function you have an explicit formula for.
[Hint: Let $Y = X_1 - \alpha X_2$, and choose $\alpha \in \R$ so that $\E Y X_2 = 0$. By the normal correlation theorem we know $Y$ is independent of $X_2$. Now use the fact that $X_1 = Y + \alpha X_2$ to compute $\E( X_1 \given X_2 )$.]
- Use the previous part to compute $\E( W_s \given W_t )$ when $s < t$.
[This was asked in a job interview.]
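[Note: this one is easy to sanity check by simulation: estimate $\E( W_s \given W_t = x)$ by averaging $W_s$ over paths whose $W_t$ lands in a narrow bin around $x$. The choices $s = 1$, $t = 2$, the bin width, and the sample size below are illustrative assumptions.]

```python
import numpy as np

rng = np.random.default_rng(3)

s, t, N = 1.0, 2.0, 500_000                            # illustrative choices
W_s = rng.normal(0.0, np.sqrt(s), size=N)
W_t = W_s + rng.normal(0.0, np.sqrt(t - s), size=N)    # independent increment

# Estimate E(W_s | W_t = x) by averaging W_s over paths with W_t in a narrow bin around x.
for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    mask = np.abs(W_t - x) < 0.05
    print(f"x = {x:+.1f}:  E(W_s | W_t ~ x) ≈ {W_s[mask].mean():+.3f}")
# Compare the printed values with the g(x) you obtained in the previous part.
```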
Question 7
Let $\alpha, \sigma \in \R$ and define $\displaystyle S_t = S_0 \exp\paren[\Big]{ \paren[\Big]{\alpha - \frac{\sigma^2}{2}} t + \sigma W_t }$.
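[Note: to get a feel for the process $S$ before working the parts below, here is a short simulation sketch; the parameter values $S_0 = 1$, $\alpha = 0.05$, $\sigma = 0.2$, the grid, and the sample size are illustrative assumptions.]

```python
import numpy as np

rng = np.random.default_rng(4)

S0, alpha, sigma = 1.0, 0.05, 0.2       # illustrative parameters
t_final, m, N = 1.0, 250, 50_000        # horizon, grid size, paths (illustrative)
dt = t_final / m
times = np.linspace(0.0, t_final, m + 1)

# Simulate W on the grid and evaluate S_t = S_0 exp((alpha - sigma^2/2) t + sigma W_t) directly.
W = np.concatenate(
    [np.zeros((N, 1)), np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(N, m)), axis=1)], axis=1
)
S = S0 * np.exp((alpha - sigma**2 / 2) * times + sigma * W)

print("E[S_1] ≈", S[:, -1].mean())      # compare with your formula for mu_t = E[S_t] below
```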
- Given a function $f\colon \R \to \R$, find a function $g\colon \R \to \R$ so that \begin{equation*} \E \paren[\big]{ f(S_t) \given \mathcal F_s } = g(S_s)\,. \end{equation*} Your formula for $g$ will involve $f$ and an integral against the density of the normal distribution.
[Hint: Let $Y = \exp( (\alpha - \frac{\sigma^2}{2}) (t - s) + \sigma (W_t - W_s) )$, and note $S_t = S_s Y$ where $S_s$ is $\mathcal F_s$-measurable and $Y$ is independent of $\mathcal F_s$. Use this to compute $\E \paren{ f(S_s Y) \given \mathcal F_s}$.]
- Find functions $f, g\colon \R^2 \to \R$ so that \begin{equation*} S_t = S_0 + \int_0^t f(s, S_s) \, ds + \int_0^t g(s, S_s) \, dW_s\,. \end{equation*}
[Hint: Use the Itô formula to compute $dS_t = S_0 \, d( \exp(\cdots) )$. If you get the right answer you’ll realize the importance of the process $S$ to financial mathematics. The fact that I called it $S$ and not $X$ might have already given you a clue…]
- Using the previous part, find all $\alpha \in \R$ for which $S$ is a martingale.
- Let $\mu_t = \E S_t$. Find a function $h$ so that $\partial_t \mu_t = h( t, \mu_t)$.
[Note: You can do this directly using the formula for $S$, of course. But it might be easier (and more instructive) to use your answer to part 2 instead.]
- Find a function $h$ so that $\displaystyle \qv{S}_t = \int_0^t h(s, S_s) \, ds$.
Question 8
In the previous question we observed that if we apply any function $f$ to the process $S$ at time $t$ and condition on $\mathcal F_s$ (the whole history up to time $s$), we get something that depends only on $S_s$ (the “state” at time $s$) and not on anything before. This is called the Markov property. Explicitly, a process $X$ is called Markov if for any function $f\colon \R \to \R$ and any $s < t$ we have $\E( f(X_t) \given \mathcal F_s ) = g(X_s)$ for some function $g$.
Is Brownian motion a Markov process? Justify.
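[Note: you should of course justify your answer with a proof, but the property can also be illustrated numerically: fix a test function (below $f(x) = x^2$, a hypothetical choice) and check whether averaging $f(W_t)$ over paths with $W_s$ in a fixed bin changes when you additionally condition on an earlier value $W_r$. The times, bin widths, and sample size below are illustrative assumptions.]

```python
import numpy as np

rng = np.random.default_rng(5)

r, s, t, N = 0.5, 1.0, 2.0, 2_000_000   # illustrative times and sample size
W_r = rng.normal(0.0, np.sqrt(r), size=N)
W_s = W_r + rng.normal(0.0, np.sqrt(s - r), size=N)
W_t = W_s + rng.normal(0.0, np.sqrt(t - s), size=N)

def f(x):                               # hypothetical test function
    return x ** 2

near = np.abs(W_s - 0.5) < 0.05         # paths with W_s near 0.5
print("conditioning on W_s alone   :", f(W_t[near]).mean())
for a in (-0.5, 0.25, 1.0):             # additionally condition on the earlier value W_r
    both = near & (np.abs(W_r - a) < 0.1)
    print(f"  ... and W_r near {a:+.2f}  :", f(W_t[both]).mean(), f"({both.sum()} paths)")
# If the extra conditioning on W_r does not move the average (beyond sampling noise),
# that is consistent with E(f(W_t) | F_s) depending only on W_s.
```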