Assignment 3

Assigned 2024-09-10, due 2024-09-17.

The written and coding portions of this assignment are two separate assignments on Gradescope. Either typeset or scan your solution to the written portion, and upload it. For the coding portion, you should upload a Jupyter notebook with all output included. Make sure you have clear sections for each question.

For all questions on Markov Chains, we will implicitly assume that the state space $\mathcal X$ is finite, and the chain is time homogeneous.

Question 1

Suppose $X$ is an aperiodic Markov chain. Show that there exists $N \in \N$ such that for all $n \geq N$ and $x \in \mathcal X$ we have $P^n(x, x) > 0$.

Note: Here’s an elementary fact about GCDs you may find handy. If $S \subseteq \N$ is non-empty and $\gcd(S) = 1$, then there exists $N \in \N$ such that for every $n \geq N$ there exist $k \in \N$, $s_1, \dots, s_k \in S$ and $m_1, \dots, m_k \in \N$ such that $n = \sum_{i=1}^k m_i s_i$.
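
For instance, if $S = \{3, 5\}$ then $\gcd(S) = 1$, and every $n \geq 8$ admits such a representation, e.g. \begin{equation} 8 = 3 + 5 \,, \qquad 9 = 3 \cdot 3 \,, \qquad 10 = 5 \cdot 2 \,, \qquad 11 = 3 \cdot 2 + 5 \,, \end{equation} while $n = 7$ does not.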

Question 2

Let $P(x, y)$ be the transition matrix of a Markov chain, and let $v \colon \mathcal X \to \C$ be any function. If $v$ is a left eigenvector of $P$ with eigenvalue $\lambda$, show that $\abs{\lambda} \leq 1$. Recall that $v \neq 0$ is a left eigenvector of $P$ with eigenvalue $\lambda$ if $vP = \lambda v$; that is, for every $y \in \mathcal X$ we have \begin{equation} \sum_{x \in \mathcal X} v(x) P(x, y) = \lambda v(y) \,. \end{equation}
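
As an optional sanity check (not part of the written problem), here is a minimal numpy sketch that builds a random row-stochastic matrix and verifies that its left eigenvalues all have modulus at most 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random row-stochastic matrix P on a small state space:
# nonnegative entries, each row summing to 1.
n = 5
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)

# Left eigenvectors of P are right eigenvectors of P.T, so the
# left eigenvalues of P are exactly the eigenvalues of P.T.
eigenvalues = np.linalg.eigvals(P.T)
print(np.abs(eigenvalues))

# Every eigenvalue should have modulus at most 1 (up to round-off).
assert np.all(np.abs(eigenvalues) <= 1 + 1e-10)
```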

Question 3

Let $G = (V, E)$ be a finite connected graph, and consider the following Markov chain: Given $X_n = x$, we flip an (independent) coin that lands heads with probability $p \in (0, 1)$. If the coin lands heads we jump to a neighbor of $x$, chosen uniformly at random. Otherwise we stay at $x$. (Note: The graph need not be regular – that is, different vertices may have different degrees.)

  1. Decide whether or not this chain is irreducible, and prove it.

  2. Find the stationary distribution of this chain. (You don’t have to show how you found the stationary distribution; but you do have to verify that the distribution you found is stationary.)
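
For concreteness, one step of this chain could be simulated as in the minimal sketch below; the adjacency-list dictionary `neighbors` is an assumed representation of $G$, not something specified in the problem.

```python
import random

def step(x, neighbors, p):
    """One step of the chain: with probability p jump to a uniformly
    chosen neighbor of x; otherwise stay at x."""
    if random.random() < p:
        return random.choice(neighbors[x])
    return x

# Example (assumed adjacency-list representation): a path graph on
# three vertices, which is connected but not regular.
neighbors = {0: [1], 1: [0, 2], 2: [1]}
x = 0
for _ in range(10):
    x = step(x, neighbors, p=0.5)
print(x)
```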

Question 4

Implement an MCMC algorithm that samples from the Gibbs distribution in the Ising model. Assume $J = 1$ and $B = 0$. You might find it useful to download this notebook used to generate all the figures on this page. The function implementing the MCMC algorithm has been removed from the notebook. If you implement it correctly and run it, you should get output that looks like this.

When turning in your assignment, upload the whole notebook with all output included, except for the video; you may comment out or remove the video from your notebook.