# [Trivial Matters] Change of Variables in MCMC

This post is about change of variables in Markov chain Monte Carlo (MCMC), a trick that comes up often when the target distribution is supported on a proper subset of ${\mathbb{R}^n}$. For example, the Exponential and Log-Normal distributions are supported only on the positive reals.

Consider a target distribution ${\pi(x)}$ supported on a subset ${S \subset \mathbb{R}^n}$. If we use a random walk proposal ${q_X(x' \mid x) = \mathrm{MVN}(x' ; x,\Sigma)}$, we might end up with a proposal ${x'}$ such that ${\pi(x') = 0}$, and this can lead to too many rejections in the MCMC chain. If instead we can find a transformation ${h : S \to \mathbb{R}^n}$ that is one-to-one, differentiable, and onto ${\mathbb{R}^n}$, then we can propose ${x' = h^{-1}(y')}$ where ${y' \sim q_Y(\cdot \mid y = h(x))}$. This always yields a proposal ${x'}$ such that ${\pi(x') > 0}$.
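
To make this concrete, here is a minimal sketch of such a proposal step in one dimension, assuming ${S = \mathbb{R}_+}$ and ${h = \log}$ (the names `propose`, `h`, and `h_inv` are my own illustration, not anything standard):

```python
import numpy as np

rng = np.random.default_rng(0)

h = np.log      # maps the support (0, inf) one-to-one onto all of R
h_inv = np.exp  # maps any real y' back into the support

def propose(x, sigma=0.5):
    """Draw x' = h_inv(y') where y' ~ Normal(h(x), sigma^2)."""
    y_new = rng.normal(h(x), sigma)  # random walk on the transformed space
    return h_inv(y_new)              # always positive, so pi(x') > 0
```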

Of course, when we employ such a transformation in the proposal kernel, we need to be careful about evaluating the proposal densities. The acceptance probability is ${\alpha(x,x') = 1 \wedge \frac{\pi(x') q_X(x \mid x')}{\pi(x) q_X(x' \mid x)}}$, where ${q_X}$ is the proposal density in the original ${x}$-space, and it should be no surprise that ${q_X(x' \mid x) \neq q_Y(y' \mid y)}$ unless ${h}$ is the identity map.

Let’s work out the acceptance ratio carefully. Recall how change of variables works: if ${Y \sim f_Y(y)}$ and we set ${X = h^{-1}(Y)}$ (equivalently, ${y = h(x)}$), then the pdf of ${X}$ is

$\displaystyle f_X(x) = f_Y(h(x))|J_{h}(x)|.$

where ${J_{h}(x)}$ denotes the Jacobian determinant of ${h}$ at ${x}$. Applying this to the kernels ${q_Y}$ and ${q_X}$, we get

$\displaystyle q_X(x' \mid x ) = q_Y(h(x') \mid h(x)) \cdot |J_{h}(x')|.$
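
As a quick numerical sanity check of the change-of-variables formula (my own sketch, using SciPy): take ${f_Y}$ standard normal and ${h = \log}$, so that ${X = \exp(Y)}$ is Log-Normal, and ${f_Y(h(x))|J_h(x)|}$ should match the known Log-Normal density.

```python
import numpy as np
from scipy import stats

# Y ~ N(0, 1), X = h^{-1}(Y) = exp(Y) with h = log, so |J_h(x)| = 1/x.
grid = np.linspace(0.1, 5.0, 50)
pdf_via_formula = stats.norm.pdf(np.log(grid)) / grid  # f_Y(h(x)) |J_h(x)|
pdf_reference = stats.lognorm.pdf(grid, s=1.0)         # standard Log-Normal pdf

assert np.allclose(pdf_via_formula, pdf_reference)
```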

Example 1 (Symmetric proposal on transformed space) If ${q_Y(y' \mid y)}$ is a symmetric proposal, then ${q_Y(h(x) \mid h(x')) = q_Y(h(x') \mid h(x))}$, so these terms cancel in the ratio and the acceptance probability becomes

$\displaystyle \alpha(x,x') = 1 \wedge \frac{\pi(x') |J_{h}(x)|}{\pi(x)|J_{h}(x')|} .$
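
Here is a minimal Metropolis-Hastings sketch implementing this, assuming a one-dimensional target and a Gaussian (hence symmetric) random walk on the transformed space; the function and argument names (`mh_transformed`, `log_pi`, `log_abs_jac`, etc.) are my own choices:

```python
import numpy as np

def mh_transformed(log_pi, h, h_inv, log_abs_jac, x0,
                   sigma=0.5, n_steps=10_000, seed=0):
    """Random-walk Metropolis-Hastings on the h-transformed space.

    Accepts with probability min(1, pi(x') |J_h(x)| / (pi(x) |J_h(x')|)),
    computed on the log scale for numerical stability.
    """
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        y_new = rng.normal(h(x), sigma)  # symmetric proposal in y-space
        x_new = h_inv(y_new)             # always lands inside the support
        log_alpha = (log_pi(x_new) + log_abs_jac(x)
                     - log_pi(x) - log_abs_jac(x_new))
        if np.log(rng.uniform()) < log_alpha:
            x = x_new
        samples[t] = x
    return samples
```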

Here are two common transformations.

Example 2 (Log transformation for ${x}$ supported on ${\mathbb{R}_+}$)

If ${h = \log}$, then ${|J_{h}(x)| = 1/x}$, and (with a symmetric ${q_Y}$, as in Example 1) the acceptance probability is

$\displaystyle \alpha(x,x') = 1 \wedge \frac{\pi(x')x'}{\pi(x)x}.$
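
Using the sketch above (again, my own illustration) with an Exponential(1) target:

```python
# Exponential(1) target: log pi(x) = -x for x > 0, up to a constant.
samples = mh_transformed(
    log_pi=lambda x: -x,
    h=np.log,
    h_inv=np.exp,
    log_abs_jac=lambda x: -np.log(x),  # |J_h(x)| = 1/x
    x0=1.0,
)
print(samples.mean())  # should be close to 1, the Exponential(1) mean
```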

Example 3 (Logit transformation for ${x}$ supported on ${(0,1)}$) If ${h(x) = \log\left(\frac{x}{1 - x}\right)}$, then the inverse transformation is ${h^{-1}(y) = \frac{\exp(y)}{1 + \exp(y)}}$ and ${|J_{h}(x)| = \frac{1}{x(1-x)}}$, so (again with a symmetric ${q_Y}$) the acceptance probability is

$\displaystyle \alpha(x,x') = 1 \wedge \frac{\pi(x')x'(1-x')}{\pi(x)x(1-x)}.$
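
And the analogous usage with a Beta(2, 5) target (my own illustrative choice):

```python
from scipy.special import expit  # expit(y) = exp(y) / (1 + exp(y))

# Beta(2, 5) target on (0, 1): log pi(x) = log(x) + 4 log(1 - x) + const.
samples = mh_transformed(
    log_pi=lambda x: np.log(x) + 4.0 * np.log1p(-x),
    h=lambda x: np.log(x / (1.0 - x)),
    h_inv=expit,
    log_abs_jac=lambda x: -np.log(x) - np.log1p(-x),  # |J_h(x)| = 1/(x(1-x))
    x0=0.5,
)
print(samples.mean())  # should be close to 2 / (2 + 5)
```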

This post is the first one in the category ‘trivial matters’, where I formally write down some notes to myself about tricks and facts that I repeatedly use but (unfortunately) need to re-derive every time I use them.
