Midterm (Monte Carlo Methods and Applications)
Assigned 2024-10-01, due 2024-10-08.
- This midterm is not collaborative. You must solve it on your own, without help from anyone else (in person or online).
- You may use whatever resources you can find online or in books; however, you may not post on discussion boards.
- If you use AI, please acknowledge it. (I do not think it will help you; I’d wager it will spit out something that looks correct but, when checked carefully, has a fundamental flaw. I’m not forbidding it on this exam; but if you use it, please acknowledge exactly where you used it.)
- There are no late days for this exam. It must be turned in by 11:30 AM on October 8th.
Question 1.
Define $R \subseteq \R^2$ by \begin{equation} R \defeq \set{ (x, y) \st x^4 + y^4 < 1 } \,. \end{equation} Describe one algorithm that can be used to sample from the uniform distribution on the region $R$. (You may assume you can already sample from standard distributions.) Your algorithm should be explicit and detailed enough that it can be implemented easily.
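For concreteness, one standard approach is rejection sampling: draw uniform points from the bounding square $[-1, 1]^2$ and keep only those that land inside $R$. A minimal sketch (the function name is illustrative, not part of the exam):

```python
import random

def sample_R():
    """Sample uniformly from R = {(x, y) : x^4 + y^4 < 1} by rejection.

    Draw (x, y) uniformly from the bounding square [-1, 1]^2 and accept
    only if the point lies inside R; accepted points are uniform on R.
    """
    while True:
        x = random.uniform(-1.0, 1.0)
        y = random.uniform(-1.0, 1.0)
        if x**4 + y**4 < 1.0:
            return (x, y)
```

Since $R$ fills most of the square, the expected number of proposals per accepted sample is small (the acceptance probability is $\mathrm{area}(R)/4$).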
Question 2.
Let $G = (V, E)$ be a finite graph, and $N \geq 2$. A coloring of $G$ is a function $\sigma \colon V \to \set{1, \dots, N} $. (For $v \in V$, we think of $\sigma(v)$ as the color of the vertex $v$.) The cost of a coloring is proportional to the number of times neighboring vertices have the same color. Explicitly, given a coloring $\sigma \colon V \to \set{1, \dots, N}$ we define the cost $C(\sigma)$ by \begin{equation} C(\sigma) = \sum_{v \in V} \sum_{w \in N(v)} \delta( \sigma(v), \sigma(w) ) \,, \end{equation} where $\delta(x, y) = 1$ if $x = y$ and $\delta(x, y) = 0$ otherwise.
Let $\mathcal X$ be the set of all possible colorings and $\beta > 0$. Define \begin{equation} \pi(\sigma) = \frac{1}{Z} e^{-\beta C(\sigma) } \,, \quad\text{where}\quad Z = \sum_{\sigma \in \mathcal X} e^{-\beta C(\sigma) } \,. \end{equation} Describe an algorithm that randomly generates colorings of the graph with distribution $\pi$. (That is, the probability of obtaining a coloring $\sigma$ from your algorithm should be exactly $\pi(\sigma)$. Alternately, the probability of obtaining a coloring $\sigma$ from your algorithm should approach $\pi(\sigma)$ as some parameter in your algorithm is made sufficiently large/small.)
Note: Your algorithm should not have an impractically large runtime when $G$ has $100$ vertices and $N = 2$. (Note, computing $Z$ takes an impractically large amount of time, so don’t propose an algorithm that uses the value of $Z$.) Also, if your algorithm relies on an algorithm we’ve done in class / homework, you only need to state the conditions required for the algorithm to work; you don’t have to check that those conditions apply to this particular situation.
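One algorithm meeting these requirements is a Metropolis chain: repeatedly propose recoloring a single uniformly chosen vertex with a uniformly chosen color, and accept with probability $\min(1, e^{-\beta \Delta C})$. A minimal sketch, assuming the graph is given as an adjacency-list dictionary (all names here are illustrative):

```python
import math
import random

def metropolis_coloring(adj, N, beta, n_steps, rng=random):
    """Metropolis sampler targeting pi(sigma) proportional to exp(-beta * C(sigma)).

    adj     : dict mapping each vertex to a list of its neighbors.
    N       : number of colors.
    n_steps : number of single-vertex update steps to run.

    The proposal (uniform vertex, uniform new color) is symmetric, so the
    acceptance ratio is exp(-beta * delta), where delta is the cost change.
    The cost change only involves the chosen vertex's neighborhood, so each
    step is cheap; Z is never needed.
    """
    vertices = list(adj)
    sigma = {v: rng.randrange(1, N + 1) for v in vertices}
    for _ in range(n_steps):
        v = rng.choice(vertices)
        new = rng.randrange(1, N + 1)
        old = sigma[v]
        # Each conflicting edge is counted twice in C (once from each endpoint).
        delta = 2 * sum((new == sigma[w]) - (old == sigma[w]) for w in adj[v])
        if delta <= 0 or rng.random() < math.exp(-beta * delta):
            sigma[v] = new
    return sigma
```

As $n\_steps \to \infty$ the distribution of the returned coloring approaches $\pi$, provided the chain is irreducible and aperiodic on $\mathcal X$.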
Question 3.
Download this data file and this notebook.
- Implement the missing function and turn in the completed notebook with the coding portion of this midterm (different submission link). If you’d like to check that you got it right, the final output looks like this.
- Explain the algorithm here. Be sure to explain how the things you will likely need (e.g. the number of iterations / realizations) are chosen in terms of the input parameters $α$ and $ε$.
Hint: Something you did on this homework helps figure out the number of iterations / realizations needed.
Question 4.
Let $X$ be a Markov chain on a finite state space $\mathcal X$, let $P$ be its transition matrix, and let $\pi$ be its stationary distribution. Fix $\epsilon \in (0, 1)$ and let \begin{equation} N = \min\set[\Big]{ n \in \N \st \abs[\Big]{ \frac{P^n(x, y)}{\pi(y)} - 1} \leq \epsilon \text{ for every } x, y \in \mathcal X } \,. \end{equation}
- Find constants $a, b > 0$ such that $\tmix(a \epsilon) \leq b N$, and prove your answer.
- (This doesn’t rely on the previous part.) Show that for every $k, j \in \N$, and every $x, y \in \mathcal X$, we have \begin{equation} \abs[\Big]{ \frac{P^{k N + j}(x, y)}{\pi(y)} - 1} \leq \epsilon^k \,. \end{equation}
Hint: First verify it for $k = 2$, $j = 0$.
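As a numerical sanity check (not a proof), the claimed inequality can be tested on a small concrete chain: compute $N$ directly from the definition and verify the bound for a few values of $k$ and $j$. The transition matrix below is an arbitrary illustrative example:

```python
import numpy as np

# An arbitrary 3-state irreducible, aperiodic chain (illustrative only).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

eps = 0.5

def max_ratio_error(n):
    """max over x, y of |P^n(x, y) / pi(y) - 1|."""
    return np.max(np.abs(np.linalg.matrix_power(P, n) / pi - 1.0))

# N as defined in the problem statement.
N = next(n for n in range(1, 1000) if max_ratio_error(n) <= eps)

# Check the claimed bound |P^{kN+j}(x,y)/pi(y) - 1| <= eps^k numerically.
for k in range(1, 4):
    for j in range(3):
        assert max_ratio_error(k * N + j) <= eps**k
```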