diff --git a/_psets/2.md b/_psets/2.md
index 16b433cab4b1705b9a12ba5b8e2114f05f0d99ad..90ad9f8b11f6c0a100638d2269f8b6411776133d 100644
--- a/_psets/2.md
+++ b/_psets/2.md
@@ -111,9 +111,11 @@ The wavelength of visible light is about $$\num{500e-9} \si{m}$$, so the energy
 be
 
 $$
-E = \frac{h c}{\lambda} = \frac{\num{6.626e-34} \si{J.s} \cdot \num{3e8} \si{m/s}}
-                               {\num{500e-9} \si{m}}
-= \num{3.8e-19} \si{J}
+\begin{align*}
+E &= \frac{h c}{\lambda} \\
+&= \frac{\num{6.626e-34} \si{J.s} \cdot \num{3e8} \si{m/s}} {\num{500e-9} \si{m}} \\
+&= \num{3.8e-19} \si{J}
+\end{align*}
 $$
 
 Thus $$10^4$$ photons per second is $$\num{3.8e-15} \si{W}$$, and $$10^{12}$$ photons per second is
@@ -206,6 +208,7 @@ $$
 
 ## 3.4
 
+{:.question}
 This problem is much harder than the others. Consider a stochastic process $$x(t)$$ that randomly
 switches between x = 0 and x = 1. Let $$\alpha \mathrm{d}t$$ be the probability that it makes a
 transition from 0 to 1 during the interval $$\mathrm{d}t$$ if it starts in x = 0, and let $$\beta
@@ -218,29 +221,204 @@ starts in x = 1.
 Write a matrix differential equation for the change in time of the probability $$p_0(t)$$ to be in
 the 0 state and the probability $$p_1(t)$$ to be in the 1 state.
 
+$$
+\frac{d}{dt} \begin{bmatrix} p_0(t) \\ p_1(t) \end{bmatrix}
+= \begin{bmatrix} -\alpha & \beta \\ \alpha & -\beta \end{bmatrix}
+\begin{bmatrix} p_0(t) \\ p_1(t) \end{bmatrix}
+$$
+
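+As a quick check (not asked for by the problem), the columns of this matrix sum to zero, so total
+probability is conserved:
+
+$$
+\frac{d}{dt} \left( p_0(t) + p_1(t) \right) = (-\alpha + \alpha) p_0(t) + (\beta - \beta) p_1(t) = 0
+$$
+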
 ### (b)
 
 {:.question}
 Solve this by diagonalizing the 2 × 2 matrix.
 
+Solving a system of ODEs isn't necessary here, since $$p_1(t) = 1 - p_0(t)$$. So we just need to
+solve
+
+$$
+\begin{align*}
+\frac{d}{dt} p_0(t) &= -\alpha p_0(t) + \beta (1 - p_0(t)) \\
+&= \beta -(\alpha + \beta) p_0(t)
+\end{align*}
+$$
+
+Since the derivative is a constant minus a term proportional to the function, the general solution
+is a constant plus a decaying exponential
+
+$$
+p_0(t) = A + B e^{-(\alpha + \beta) t}
+$$
+
+Differentiating this ansatz and rewriting,
+
+$$
+\begin{align*}
+\frac{d}{dt} p_0(t) &= -(\alpha + \beta) B e^{-(\alpha + \beta) t} \\
+&= (\alpha + \beta) A - (\alpha + \beta) (A + B e^{-(\alpha + \beta) t}) \\
+&= (\alpha + \beta) A - (\alpha + \beta) p_0(t)
+\end{align*}
+$$
+
+Comparing with the ODE above, matching the constant terms implies that
+$$A = \beta / (\alpha + \beta)$$. $$B$$ is determined by the initial condition $$p_0(0)$$:
+
+$$
+p_0(0) = \frac{\beta}{\alpha + \beta} + B
+$$
+
+So putting everything together we have
+
+$$
+\begin{align*}
+p_0(t) &= \frac{\beta}{\alpha + \beta}
+          + \left(p_0(0) - \frac{\beta}{\alpha + \beta} \right) e^{-(\alpha + \beta) t} \\
+p_1(t) &= \frac{\alpha}{\alpha + \beta}
+          - \left(p_0(0) - \frac{\beta}{\alpha + \beta} \right) e^{-(\alpha + \beta) t} \\
+&= \frac{\alpha}{\alpha + \beta}
+   + \left(p_1(0) - \frac{\alpha}{\alpha + \beta} \right) e^{-(\alpha + \beta) t}
+\end{align*}
+$$
+
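+For completeness, here is a quick sketch of how this connects to the diagonalization the problem
+asks for. The transition matrix has trace $$-(\alpha + \beta)$$ and determinant
+$$\alpha \beta - \alpha \beta = 0$$, so its eigenvalues are
+
+$$
+\lambda_1 = 0, \qquad \lambda_2 = -(\alpha + \beta)
+$$
+
+with the $$\lambda_1 = 0$$ eigenvector proportional to $$(\beta, \alpha)$$. These give exactly the
+constant term (the stationary distribution) and the decaying exponential found above.
+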
 ### (c)
 
 {:.question}
-Use this solution to find the autocorrelation function $$hx(t)x(t + \tau)i$$.
+Use this solution to find the autocorrelation function $$\langle x(t)x(t + \tau) \rangle$$.
+
+For positive $$\tau$$, and taking the process to be in its steady state so that
+$$p_1(t) = \alpha / (\alpha + \beta)$$,
+
+$$
+\begin{align*}
+\langle x(t) x(t + \tau) \rangle
+&= \sum_{(i, j) \in \{0, 1\} \times \{0, 1\}} p(x(t) = i \cap x(t + \tau) = j) \, i j \\
+&= p(x(t) = 1 \cap x(t + \tau) = 1) \\
+&= p(x(t + \tau) = 1 | x(t) = 1) \, p_1(t) \\
+&= p_1(\tau | p_1(0) = 1) p_1(t) \\
+&= \left( \frac{\alpha}{\alpha + \beta} + \left(1 - \frac{\alpha}{\alpha + \beta} \right)
+   e^{-(\alpha + \beta) \tau} \right) \frac{\alpha}{\alpha + \beta} \\
+&= \left( \frac{\alpha}{\alpha + \beta} + \frac{\beta}{\alpha + \beta}
+   e^{-(\alpha + \beta) \tau} \right) \frac{\alpha}{\alpha + \beta} \\
+&= \frac{\alpha}{(\alpha + \beta)^2} \left( \alpha + \beta e^{-(\alpha + \beta) \tau} \right)
+\end{align*}
+$$
+
+Since the process is stationary, this is an even function of $$\tau$$: substituting
+$$t' = t - \tau$$ gives $$\langle x(t) x(t - \tau) \rangle = \langle x(t') x(t' + \tau) \rangle$$,
+so the expression above holds for all $$\tau$$ with $$\tau$$ replaced by $$|\tau|$$.
 
 ### (d)
 
 {:.question}
 Use the autocorrelation function to show that the power spectrum is a Lorentzian.
 
+By the Wiener-Khinchin theorem, the power spectral density is the Fourier transform of the
+autocorrelation function:
+
+$$
+\begin{align*}
+S(f) &= \int_\mathbb{R} \frac{\alpha}{(\alpha + \beta)^2}\left( \alpha + \beta e^{-(\alpha + \beta)
+        |\tau|} \right) e^{-2 \pi i f \tau} \mathrm{d} \tau \\
+&= \frac{\alpha \beta}{(\alpha + \beta)^2} \left( \int_\mathbb{R} \frac{\alpha}{\beta} e^{-2 \pi i f
+   \tau} \mathrm{d} \tau + \int_\mathbb{R} e^{-(\alpha + \beta) |\tau| -2 \pi i f \tau} \mathrm{d}
+   \tau \right)
+\end{align*}
+$$
+
+The first integral evaluates to a delta function, $$(\alpha / \beta) \, \delta(f)$$. The second can
+be broken into a sum of two integrals over the positive and negative halves of $$\mathbb{R}$$:
+
+$$
+\begin{align*}
+\int_0^\infty e^{-\tau ((\alpha + \beta) + 2 \pi i f)} \mathrm{d} \tau
+&= \left[ \frac{-1}{(\alpha + \beta) + 2 \pi i f}
+   e^{-\tau ((\alpha + \beta) + 2 \pi i f)} \right]_0^\infty \\
+&= \frac{1}{(\alpha + \beta) + 2 \pi i f} \\
+\int_{-\infty}^0 e^{\tau ((\alpha + \beta) - 2 \pi i f)} \mathrm{d} \tau
+&= \left[ \frac{1}{(\alpha + \beta) - 2 \pi i f}
+   e^{\tau ((\alpha + \beta) - 2 \pi i f)} \right]_{-\infty}^0 \\
+&= \frac{1}{(\alpha + \beta) - 2 \pi i f}
+\end{align*}
+$$
+
+Putting everything together,
+
+$$
+\begin{align*}
+S(f) &= \frac{\alpha \beta}{(\alpha + \beta)^2} \left( \frac{\alpha}{\beta} \delta(f)
+        + \frac{1}{(\alpha + \beta) + 2 \pi i f} + \frac{1}{(\alpha + \beta) - 2 \pi i f} \right) \\
+&= \frac{\alpha \beta}{(\alpha + \beta)^2} \left( \frac{\alpha}{\beta} \delta(f)
+   + \frac{2 (\alpha + \beta)}{(\alpha + \beta)^2 + (2 \pi f)^2} \right) \\
+&= \frac{\alpha^2}{(\alpha + \beta)^2} \delta(f) + \frac{\alpha \beta}{(\alpha + \beta)^2}
+   \frac{2 (\alpha + \beta)}{(\alpha + \beta)^2 + (2 \pi f)^2} \\
+&= \frac{\alpha^2}{(\alpha + \beta)^2} \delta(f) + \frac{\alpha \beta}{(\alpha + \beta)^2}
+   \frac{2 (\alpha + \beta)^{-1}}{1 + \left( \frac{2 \pi f}{\alpha + \beta} \right)^2}
+\end{align*}
+$$
+
+The delta function is the DC term from the nonzero mean: its coefficient is
+$$\langle x \rangle^2 = \alpha^2 / (\alpha + \beta)^2$$. Ignoring it, up to a constant factor this
+is a Lorentzian with time constant $$\tau_c = 1 / (\alpha + \beta)$$.
+
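+As a consistency check, integrating $$S(f)$$ over all frequencies should recover the
+autocorrelation at $$\tau = 0$$:
+
+$$
+\int_\mathbb{R} S(f) \, \mathrm{d} f
+= \frac{\alpha^2}{(\alpha + \beta)^2} + \frac{\alpha \beta}{(\alpha + \beta)^2}
+  \underbrace{\int_\mathbb{R} \frac{2 (\alpha + \beta)}{(\alpha + \beta)^2 + (2 \pi f)^2}
+  \, \mathrm{d} f}_{= 1}
+= \frac{\alpha}{\alpha + \beta}
+$$
+
+which is indeed $$\langle x(t)^2 \rangle$$ from part (c).
+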
 ### (e)
 
 {:.question}
 At what frequency is the magnitude of the Lorentzian reduced by half relative to its low-frequency
 value?
 
+Looking only at the Lorentzian portion,
+
+$$
+S(0) = \frac{2 \alpha \beta}{(\alpha + \beta)^3}
+$$
+
+Setting the Lorentzian term equal to half this value and solving for $$f$$:
+
+$$
+\begin{align*}
+\frac{\alpha \beta}{(\alpha + \beta)^2}
+\frac{2 (\alpha + \beta)^{-1}}{1 + \left( \frac{2 \pi f}{\alpha + \beta} \right)^2}
+&= \frac{1}{2} \frac{2 \alpha \beta}{(\alpha + \beta)^3} \\
+\frac{1}{1 + \left( \frac{2 \pi f}{\alpha + \beta} \right)^2} &= \frac{1}{2} \\
+1 &= \left( \frac{2 \pi f}{\alpha + \beta} \right)^2 \\
+f &= \frac{\alpha + \beta}{2 \pi}
+\end{align*}
+$$
+
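+Equivalently, writing $$f_{1/2}$$ for this half-power frequency and using the time constant
+$$\tau_c = 1 / (\alpha + \beta)$$ from part (d),
+
+$$
+f_{1/2} = \frac{1}{2 \pi \tau_c}
+$$
+
+so faster switching pushes the knee of the spectrum to higher frequencies.
+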
 ### (f)
 
 {:.question}
 For a thermally activated process, show that a flat distribution of barrier energies leads to a
-distribution of switching times $$p(\tau) \propto 1/\tau$$, and in turn to $$S(f) \propto 1/\nu$$.
+distribution of switching times $$p(\tau) \propto 1/\tau$$, and in turn to $$S(f) \propto 1/f$$.
+
+We are assuming the distribution of barrier energies $$p(E)$$ is constant. According to (3.37), for
+a thermally activated process the characteristic switching time is a function of the energy: $$\tau
+= \tau_0 e^{E/kT}$$. So to get the distribution $$p(\tau)$$, we just need to transform $$p(E)$$
+accordingly. In particular,
+
+$$
+p(\tau) = p(E) \frac{k T}{\tau}
+$$
+
+Let's prove that this is the case. Given any random variable $$X$$ and monotonically increasing
+function $$g$$, let $$Y = g(X)$$. Then the cumulative distribution functions are related by
+
+$$
+\begin{align*}
+p(Y \leq y) &= p(g(X) \leq y) \\
+&= p(X \leq g^{-1}(y))
+\end{align*}
+$$
+
+By the fundamental theorem of calculus, the probability density function is the derivative of the
+cumulative distribution function. So by employing the chain rule we find that
+
+$$
+\begin{align*}
+p(y) &= \frac{d}{dy} p(Y \leq y) \\
+&= \frac{d}{dy} p(X \leq g^{-1}(y)) \\
+&= p(g^{-1}(y)) \frac{d}{dy} g^{-1}(y)
+\end{align*}
+$$
+
+So our result above for $$p(\tau)$$ follows because $$E = k T \log(\tau / \tau_0)$$, so
+$$dE / d\tau = k T / \tau$$. Since $$p(E)$$ is constant, this gives the desired
+$$p(\tau) \propto 1 / \tau$$.
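+
+To see how this leads to $$S(f) \propto 1 / f$$, here is a sketch of the standard argument,
+assuming an ensemble of many independent switchers whose spectra add, each symmetric
+($$\alpha = \beta$$) so that it contributes a Lorentzian $$\propto \tau / (1 + (2 \pi f \tau)^2)$$
+with $$\tau = 1 / (\alpha + \beta)$$, and with $$\tau$$ distributed as $$p(\tau) \propto 1 / \tau$$
+between cutoffs $$\tau_{\min}$$ and $$\tau_{\max}$$:
+
+$$
+\begin{align*}
+S(f) &\propto \int_{\tau_{\min}}^{\tau_{\max}} \frac{1}{\tau}
+        \frac{\tau}{1 + (2 \pi f \tau)^2} \, \mathrm{d} \tau \\
+&= \frac{1}{2 \pi f} \left[ \arctan(2 \pi f \tau) \right]_{\tau_{\min}}^{\tau_{\max}} \\
+&\approx \frac{1}{2 \pi f} \cdot \frac{\pi}{2} \propto \frac{1}{f}
+\qquad \text{for} \quad \frac{1}{\tau_{\max}} \ll 2 \pi f \ll \frac{1}{\tau_{\min}}
+\end{align*}
+$$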