MCMC Methods for Functions: Modifying Old Algorithms to Make Them Faster
Open Access
- Journal article, published 1 August 2013
- Institute of Mathematical Statistics, Statistical Science, Vol. 28 (3), 424–446
- https://doi.org/10.1214/13-STS421
Abstract
Many problems arising in applications result in the need to probe a probability distribution for functions. Examples include Bayesian nonparametric statistics and conditioned diffusion processes. Standard MCMC algorithms typically become arbitrarily slow under the mesh refinement dictated by nonparametric description of the unknown function. We describe an approach to modifying a whole range of MCMC methods, applicable whenever the target measure has density with respect to a Gaussian process or Gaussian random field reference measure, which ensures that their speed of convergence is robust under mesh refinement. Gaussian processes or random fields are fields whose marginal distributions, when evaluated at any finite set of $N$ points, are $\mathbb{R}^{N}$-valued Gaussians. The algorithmic approach that we describe is applicable not only when the desired probability measure has density with respect to a Gaussian process or Gaussian random field reference measure, but also to some useful non-Gaussian reference measures constructed through random truncation. In the applications of interest the data is often sparse and the prior specification is an essential part of the overall modelling strategy. These Gaussian-based reference measures are a very flexible modelling tool, finding wide-ranging application. Examples are shown in density estimation, data assimilation in fluid mechanics, subsurface geophysics and image registration. The key design principle is to formulate the MCMC method so that it is, in principle, applicable for functions; this may be achieved by use of proposals based on carefully chosen time-discretizations of stochastic dynamical systems which exactly preserve the Gaussian reference measure. Taking this approach leads to many new algorithms which can be implemented via minor modification of existing algorithms, yet which show enormous speed-up on a wide range of applied problems.
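The "carefully chosen time-discretizations of stochastic dynamical systems which exactly preserve the Gaussian reference measure" mentioned above lead, in the simplest case, to a preconditioned Crank–Nicolson (pCN) style proposal, whose acceptance probability involves only the density of the target with respect to the Gaussian reference. A minimal sketch in Python/NumPy, assuming the target has density proportional to exp(−Φ(u)) with respect to a Gaussian reference N(0, C); the names `pcn_mcmc`, `phi` and `xi_sampler` are illustrative, not taken from the paper:

```python
import numpy as np

def pcn_mcmc(phi, xi_sampler, u0, beta=0.2, n_iter=5000, rng=None):
    """Sketch of a pCN-type Metropolis-Hastings sampler.

    Targets a measure with density proportional to exp(-phi(u)) with
    respect to a Gaussian reference N(0, C).  `xi_sampler(rng)` must
    draw one sample from N(0, C).  The proposal
        v = sqrt(1 - beta^2) * u + beta * xi,   xi ~ N(0, C),
    exactly preserves N(0, C), so the accept/reject step depends only
    on phi, independent of any mesh used to discretize the function u.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    u, phi_u = u0, phi(u0)
    samples = []
    for _ in range(n_iter):
        # Crank-Nicolson style proposal: an autoregression toward 0
        v = np.sqrt(1.0 - beta**2) * u + beta * xi_sampler(rng)
        phi_v = phi(v)
        # Accept with probability min(1, exp(phi(u) - phi(v)))
        if np.log(rng.uniform()) < phi_u - phi_v:
            u, phi_u = v, phi_v
        samples.append(u)
    return np.array(samples)
```

As a toy scalar check, taking the reference measure N(0, 1) and Φ(u) = (u − 1)²/2 gives a Gaussian target with mean 1/2, which the chain recovers; in the function-space applications of the paper, u would instead be a mesh discretization of the unknown function and `xi_sampler` a draw from the Gaussian process prior.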