

In the RStudio editor and in the R console, ‘Alt’ + ‘-’ gives us ‘<-’.
This is indeed very trivial but really deserves a post!
Such great news for my fingers!
This post is about Stein’s lemma, which first appeared in Stein’s landmark 1981 paper. The lemma leads to Stein’s unbiased risk estimator (SURE) and is useful for proving central limit theorems. It has also been called `Gaussian integration by parts’, which is in fact a high-level description of its proof.
Lemma 1 (Stein’s lemma) If $X$ follows the standard normal distribution, then
$$\mathbb{E}[X f(X)] = \mathbb{E}[f'(X)],$$
if the expectations are well-defined.
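As a quick numerical sanity check, here is a small Monte Carlo verification of the lemma; the choice $f(x) = x^3$ (so that both sides equal $\mathbb{E}[X^4] = 3$) is my own illustrative one, not from the original source.

```python
import numpy as np

# Monte Carlo check of Stein's lemma E[X f(X)] = E[f'(X)]
# with f(x) = x^3, so both sides should be close to E[X^4] = 3.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
lhs = np.mean(x * x**3)  # E[X f(X)]
rhs = np.mean(3 * x**2)  # E[f'(X)]
print(lhs, rhs)  # both approximately 3
```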
Proof: If $f$ is `nice’, then
$$\mathbb{E}[f'(X)] = \int f'(x)\,\phi(x)\,dx = \Big[f(x)\,\phi(x)\Big]_{-\infty}^{\infty} - \int f(x)\,\phi'(x)\,dx = \int x\,f(x)\,\phi(x)\,dx = \mathbb{E}[X f(X)].$$
It is also convenient to denote $\phi(x) = (2\pi)^{-1/2} e^{-x^2/2}$ as the standard normal density and remember that $\phi'(x) = -x\,\phi(x)$.
Stein’s lemma can be generalized to exponential family distributions. In particular, for multivariate normals, if $y \sim \mathcal{N}_p(\mu, \sigma^2 I)$ and $\hat{\mu} = t(y)$ is any differentiable estimator, then we have
$$\operatorname{cov}(\hat{\mu}_i, y_i) = \sigma^2\, \mathbb{E}\!\left[ \frac{\partial \hat{\mu}_i}{\partial y_i} \right].$$
This is Equation (12.56) in `Computer Age Statistical Inference’ by Efron and Hastie.
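A minimal simulation check of this covariance identity in one dimension, using the linear shrinkage estimator $\hat{\mu} = c\,y$ (my own illustrative choice), for which both sides equal $c\,\sigma^2$:

```python
import numpy as np

# Check cov(mu_hat, y) = sigma^2 * E[d mu_hat / d y] in one dimension
# for the linear estimator mu_hat = c * y; both sides equal c * sigma^2.
rng = np.random.default_rng(1)
mu, sigma, c = 2.0, 1.5, 0.5
y = mu + sigma * rng.standard_normal(1_000_000)
mu_hat = c * y
lhs = np.mean((mu_hat - mu_hat.mean()) * (y - y.mean()))  # cov(mu_hat, y)
rhs = sigma**2 * c  # sigma^2 * E[d mu_hat / d y], since d(c*y)/dy = c
print(lhs, rhs)  # both approximately 1.125
```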
This post is the second in the category `trivial matters’, where I formally write down some notes to myself about identities, tricks, and facts that I repeatedly use but (unfortunately) need to re-derive every time I use them. Although these posts are short, they discuss important topics. The term `trivial matters’ is meant sarcastically, because my blood boils every time I see terms like `obviously’ or `it is trivial to show that …’ when I grade students’ homework.
This post is about change of variables in Markov chain Monte Carlo (MCMC), which is used quite often when the target distribution is supported on a subset of $\mathbb{R}$. For example, the Exponential distribution and the Log-Normal distribution are only supported on the positive reals.
Consider a target distribution $\pi$ that is supported on a subset $\mathcal{X} \subset \mathbb{R}$. If we use a random walk proposal $x' = x + \epsilon$ with $\epsilon \sim \mathcal{N}(0, \sigma^2)$, then we might end up with a proposal $x'$ such that $x' \notin \mathcal{X}$, and this might cause too few acceptances in the MCMC chain. If we can find a transformation $T : \mathcal{X} \to \mathbb{R}$ that is one-to-one, differentiable and spans $\mathbb{R}$, then we can consider a proposal $y' = T(x) + \epsilon$ where $\epsilon \sim \mathcal{N}(0, \sigma^2)$. This always yields a proposal $x' = T^{-1}(y')$ such that $x' \in \mathcal{X}$.
Of course, when we employ such a transformation in the proposal kernel, we need to be careful about evaluating the proposal densities. We know that the acceptance probability is $\min\!\left(1, \frac{\pi(x')\, q(x \mid x')}{\pi(x)\, q(x' \mid x)}\right)$, and it should be no surprise that $q(x' \mid x) \neq q_Y(y' \mid y)$ unless the transformation $T$ is the identity map.
Let’s work out the acceptance ratio together carefully. Recall that change of variables proceeds as follows: when $Y \sim p_Y$ and we consider the transformation $X = T^{-1}(Y)$, the pdf of $X$ is
$$p_X(x) = p_Y(T(x))\, |T'(x)|.$$
When we apply this to the kernels $q(x' \mid x)$ and $q(x \mid x')$, we get that the acceptance probability is
$$\min\!\left(1, \frac{\pi(x')\, q_Y(y \mid y')\, |T'(x)|}{\pi(x)\, q_Y(y' \mid y)\, |T'(x')|}\right).$$
Example 1 (Symmetric proposal on transformed space) If $q_Y(y' \mid y) = q_Y(y \mid y')$ is a symmetric proposal, then the acceptance probability becomes
$$\min\!\left(1, \frac{\pi(x')\, |T'(x)|}{\pi(x)\, |T'(x')|}\right).$$
Here are two common transformations.
Example 2 (Log transformation for $\pi$ supported on $(0, \infty)$) If $T(x) = \log x$, then $T^{-1}(y) = e^y$ and $|T'(x)| = 1/x$, so the acceptance probability is
$$\min\!\left(1, \frac{\pi(x')\, x'}{\pi(x)\, x}\right).$$
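To make this concrete, here is a minimal sketch of a random-walk Metropolis sampler on the log scale; the Exponential(1) target and all tuning constants are my own illustrative choices, not from the original post.

```python
import numpy as np

# Random-walk Metropolis on y = log(x), targeting the Exponential(1)
# distribution pi(x) = exp(-x) on x > 0. The acceptance ratio on the
# original scale is pi(x') x' / (pi(x) x).
rng = np.random.default_rng(2)
n_iter, step = 50_000, 1.0
x = 1.0
samples = np.empty(n_iter)
for i in range(n_iter):
    y_prop = np.log(x) + step * rng.standard_normal()  # propose on log scale
    x_prop = np.exp(y_prop)                            # always positive
    # log of the acceptance ratio: log pi(x') + log x' - log pi(x) - log x
    log_ratio = (-x_prop + np.log(x_prop)) - (-x + np.log(x))
    if np.log(rng.uniform()) < log_ratio:
        x = x_prop
    samples[i] = x
print(samples.mean())  # should be close to 1, the mean of Exponential(1)
```

Note that every proposed $x'$ is automatically positive, so no proposals are wasted outside the support.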
Example 3 (Logit transformation for $\pi$ supported on $(0, 1)$) If $T(x) = \log \frac{x}{1 - x}$, then the inverse transformation is $T^{-1}(y) = \frac{1}{1 + e^{-y}}$ and $|T'(x)| = \frac{1}{x(1 - x)}$. The acceptance probability is
$$\min\!\left(1, \frac{\pi(x')\, x'(1 - x')}{\pi(x)\, x(1 - x)}\right).$$
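Analogously, here is a sketch of the logit version, targeting a Beta(2, 3) distribution (my own illustrative choice, with unnormalized density $\pi(x) \propto x(1-x)^2$):

```python
import numpy as np

# Random-walk Metropolis on y = logit(x), targeting Beta(2, 3),
# i.e. pi(x) proportional to x (1 - x)^2 on (0, 1). The acceptance
# ratio on the original scale is pi(x') x'(1 - x') / (pi(x) x(1 - x)).
rng = np.random.default_rng(3)
n_iter, step = 50_000, 1.5
x = 0.5
samples = np.empty(n_iter)
for i in range(n_iter):
    y_prop = np.log(x / (1 - x)) + step * rng.standard_normal()
    x_prop = 1.0 / (1.0 + np.exp(-y_prop))  # inverse logit, stays in (0, 1)
    # log of pi(x') x'(1 - x') / (pi(x) x(1 - x))
    # = 2 log x' + 3 log(1 - x') - 2 log x - 3 log(1 - x)
    log_ratio = (2 * np.log(x_prop) + 3 * np.log(1 - x_prop)
                 - 2 * np.log(x) - 3 * np.log(1 - x))
    if np.log(rng.uniform()) < log_ratio:
        x = x_prop
    samples[i] = x
print(samples.mean())  # should be close to 2 / (2 + 3) = 0.4
```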
This post is the first one in the category `trivial matters’, where I formally write down some notes to myself about tricks and facts that I repeatedly use but (unfortunately) need to re-derive every time I use them.