# Tuning an MCMC sampler

April 30, 2020 β April 30, 2020

Designing the MCMC transition density, often via the proposal density in a Metropolis–Hastings accept/reject step, by optimising some criterion of mixing.

The simplest way to do this is a "pilot" run to estimate a good proposal, then a fresh run using the tuned proposal, discarding the pilot samples as suspect. This wastes some effort but is theoretically simple. Alternatively you can adapt dynamically, online, which is called adaptive MCMC. There are then some theoretical wrinkles, since ongoing adaptation can disturb the stationary distribution unless the adaptation diminishes appropriately.
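The pilot-run strategy can be sketched as follows. This is a minimal illustration, not any particular paper's algorithm: a random-walk Metropolis pilot run with a crude proposal, whose sample covariance (scaled by the classic 2.38²/d rule of Roberts and Rosenthal) becomes the proposal covariance for the main run. All function names are mine.

```python
import numpy as np

def rw_metropolis(logpdf, x0, n_steps, cov, rng):
    """Random-walk Metropolis with Gaussian proposals of covariance `cov`."""
    d = len(x0)
    chol = np.linalg.cholesky(cov)
    samples = np.empty((n_steps, d))
    x, lp = np.asarray(x0, float), logpdf(np.asarray(x0, float))
    for i in range(n_steps):
        prop = x + chol @ rng.standard_normal(d)
        lp_prop = logpdf(prop)
        # Metropolis accept/reject step (symmetric proposal).
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

def pilot_tuned_sampler(logpdf, x0, n_pilot, n_main, rng):
    """Pilot run with an arbitrary proposal; then a main run whose proposal
    covariance is the pilot sample covariance scaled by 2.38^2/d.
    The pilot samples are discarded as suspect."""
    d = len(x0)
    pilot = rw_metropolis(logpdf, x0, n_pilot, 0.1 * np.eye(d), rng)
    tuned_cov = (2.38**2 / d) * np.cov(pilot.T) + 1e-9 * np.eye(d)
    return rw_metropolis(logpdf, pilot[-1], n_main, tuned_cov, rng)
```

The small jitter term on the tuned covariance guards against a degenerate pilot sample covariance.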

I wish to maximise the mixing rate by some criterion. But if I already know my mixing is bad without tuning, which is why I am tuning, where do I get the simulations against which to conduct the optimisation? And how do I optimise simultaneously for large moves and a low rejection rate? Fearnhead and Taylor (2013) summarise some options for an objective function. One that seems sufficient for typical MCMC papers is the Expected Squared Jump Distance, ESJD (more precisely an expected squared Mahalanobis distance) between successive samples; maximising it minimises the lag-1 autocorrelation, which in practice is most of what we want.
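A minimal sketch of ESJD as a tuning objective, with hypothetical function names of my choosing: a grid search over proposal step sizes, each scored by the ESJD of a short pilot run. Note how ESJD already balances the two competing goals above: rejected moves contribute zero jumps, so overly bold proposals score poorly, while timid proposals make only tiny jumps.

```python
import numpy as np

def esjd(samples):
    """Expected squared jump distance: mean squared Euclidean distance between
    successive samples (substitute a Mahalanobis metric if coordinates differ
    in scale)."""
    return float(np.mean(np.sum(np.diff(samples, axis=0) ** 2, axis=1)))

def metropolis(logpdf, x0, n, step, rng):
    """Random-walk Metropolis with isotropic Gaussian proposals of scale `step`."""
    x, lp = np.asarray(x0, float), logpdf(np.asarray(x0, float))
    out = np.empty((n, x.size))
    for i in range(n):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = logpdf(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        out[i] = x
    return out

def tune_step_by_esjd(logpdf, x0, steps, n_pilot, rng):
    """Pick the proposal step size maximising ESJD over short pilot runs."""
    scores = [esjd(metropolis(logpdf, x0, n_pilot, s, rng)) for s in steps]
    return steps[int(np.argmax(scores))]
```

For a standard 1-d Gaussian target, a grid like `[0.1, 2.4, 50.0]` should select the middle value: the small step mixes slowly and the huge step is almost always rejected.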

## 1 Proposal density

Designing the proposal density is often easy for an independent rejection sampler; that is precisely the cross-entropy method. For a Markov chain, though, the success criterion is muddier. AFAICT the cross-entropy trick does not apply directly to non-i.i.d. samples.
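For the independent case, the cross-entropy idea can be sketched like this (a toy 1-d version under my own assumed names, not a reference implementation): iteratively fit a Gaussian proposal by importance-weighted moment matching, which minimises the cross-entropy from the target to the Gaussian family.

```python
import numpy as np

def ce_fit_gaussian_proposal(log_target, mu0, sigma0, n=2000, iters=20, rng=None):
    """Cross-entropy-style fit of a 1-d Gaussian proposal to an unnormalised
    log target, via self-normalised importance-weighted moment matching."""
    rng = rng or np.random.default_rng()
    mu, sigma = float(mu0), float(sigma0)
    for _ in range(iters):
        x = rng.normal(mu, sigma, n)
        # Log density of the current proposal (up to a constant).
        log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
        # Self-normalised importance weights for the target.
        log_w = log_target(x) - log_q
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Weighted moment matching: the KL-optimal Gaussian fit.
        mu = float(np.sum(w * x))
        sigma = float(np.sqrt(np.sum(w * (x - mu) ** 2)))
    return mu, sigma
```

Starting from a broad proposal, this converges to the target's own mean and scale when the target is itself Gaussian; the failure mode to watch for is weight degeneracy when the initial proposal misses the target's mass.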

## 2 Transition density


## 3 Sequential Monte Carlo

In Sequential Monte Carlo, which is not MCMC, we do not need to be so sensitive to adapting the proposal parameters, since there is no stationary distribution argument to preserve. See Fearnhead and Taylor (2013).

## 4 Variational inference

What is Hamiltonian Variational Inference? Does that fit under this heading?

## 5 References

Caterini, Doucet, and Sejdinovic. 2018. In Advances in Neural Information Processing Systems.
Fearnhead, and Taylor. 2013. Bayesian Analysis.
Mathew, Bauer, Koistinen, et al. 2012. Heredity.
Norton, and Fox. 2016. arXiv:1610.00781 [Math, Stat].
Roberts, and Rosenthal. 2009. "Examples of Adaptive MCMC." Journal of Computational and Graphical Statistics.
Roberts, and Rosenthal. 2014. Annals of Applied Probability.
Salimans, Kingma, and Welling. 2015. In Proceedings of the 32nd International Conference on Machine Learning (ICML-15). ICMLβ15.
Schuster, Strathmann, Paige, et al. 2017. In ECML-PKDD 2017.