Decoupling for Markov Chains
How can we rigorously quantify Monte Carlo error and assess convergence in modern MCMC methods such as the No-U-Turn Sampler? This question motivates new joint work with Victor de la …
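As background on what "Monte Carlo error" means here (standard material, not the decoupling results of the preprint): for an ergodic average of a Markov chain (X_k) with stationary distribution π, the error is usually measured through the asymptotic variance in the Markov chain central limit theorem, which under suitable conditions reads

\[
\sqrt{n}\,\Big(\tfrac{1}{n}\textstyle\sum_{k=1}^{n} f(X_k) - \pi(f)\Big) \;\Longrightarrow\; \mathcal{N}\big(0, \sigma_f^2\big),
\qquad
\sigma_f^2 \;=\; \operatorname{Var}_{\pi}\!\big(f(X_0)\big) \;+\; 2\sum_{k \ge 1} \operatorname{Cov}_{\pi}\!\big(f(X_0), f(X_k)\big).
\]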
In many Bayesian inference problems, the geometry of the posterior distribution can vary dramatically in scale. A classic example is Neal’s funnel, where the state-of-the-art algorithm, the No-U-Turn Sampler (NUTS), …
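For readers who have not seen it, here is a minimal sketch of Neal’s funnel (the standard construction from Neal, 2003); the function name and code are illustrative, not taken from the post:

```python
import numpy as np

def neals_funnel_logpdf(v, x):
    """Log-density of Neal's funnel (Neal, 2003): v ~ N(0, 3^2) and,
    conditionally, x_i | v ~ N(0, e^v).  The conditional scale exp(v/2)
    ranges over several orders of magnitude as v varies, which is what
    makes the geometry multi-scale."""
    x = np.atleast_1d(x)
    log_p_v = -0.5 * v**2 / 9.0                       # N(0, 9) marginal on v
    log_p_x = (-0.5 * np.sum(x**2) * np.exp(-v)       # N(0, e^v) for each x_i
               - 0.5 * x.size * v)
    return log_p_v + log_p_x                          # up to an additive constant
```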
Markov Chain Monte Carlo (MCMC) methods are fundamental for sampling from complex probability distributions, but many widely used algorithms rely on gradients (like NUTS) and/or struggle with high-dimensional, multi-scale …
Traditional methods like Gibbs sampling or randomized Kaczmarz rely heavily on specific coordinate systems, which can limit their efficiency, especially in ill-conditioned settings. But what happens when we step away from …
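To make the coordinate dependence concrete, here is a textbook sketch of randomized Kaczmarz for a linear system Ax = b (Strohmer–Vershynin row sampling); the variable names are illustrative and this is not code from the post:

```python
import numpy as np

def randomized_kaczmarz(A, b, n_iters=1000, rng=None):
    """Textbook randomized Kaczmarz for Ax = b: at each step, project the
    current iterate onto the hyperplane defined by one row of A, chosen with
    probability proportional to its squared norm.  Each update acts along a
    single row direction, so performance hinges on how the problem is
    expressed in coordinates."""
    rng = np.random.default_rng(rng)
    x = np.zeros(A.shape[1])
    row_norms2 = np.sum(A**2, axis=1)
    probs = row_norms2 / row_norms2.sum()
    for _ in range(n_iters):
        i = rng.choice(A.shape[0], p=probs)
        residual = b[i] - A[i] @ x
        x += (residual / row_norms2[i]) * A[i]
    return x
```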
The No-U-Turn Sampler (NUTS) is the go-to method for sampling in probabilistic programming languages like Stan, PyMC3, NIMBLE, Turing, and NumPyro. However, due to its recursive architecture, even proving its …
Imagine a No-U-Turn Sampler that can adapt both its path-length and step-size on the fly, responding to the local geometry of the target distribution, while still preserving detailed balance. In …
Anyone who has ever used Hamiltonian Monte Carlo samplers has probably encountered the so-called tuning problem: in short, it is not at all obvious how to pick the algorithm’s …
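For context, here is a minimal sketch of one standard HMC transition, just to make explicit which knobs the tuning problem refers to, namely the step size and the number of leapfrog steps; this is generic textbook HMC, not any particular implementation discussed in the post:

```python
import numpy as np

def hmc_step(x, log_prob, log_prob_grad, eps, n_steps, rng):
    """One standard HMC transition with a leapfrog integrator.  The step
    size `eps` and the path length `n_steps * eps` are exactly the
    parameters that are hard to pick a priori."""
    p = rng.standard_normal(x.shape)                  # fresh Gaussian momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics
    p_new += 0.5 * eps * log_prob_grad(x_new)
    for _ in range(n_steps - 1):
        x_new += eps * p_new
        p_new += eps * log_prob_grad(x_new)
    x_new += eps * p_new
    p_new += 0.5 * eps * log_prob_grad(x_new)
    # Metropolis accept/reject based on the change in the Hamiltonian
    log_accept = (log_prob(x_new) - 0.5 * p_new @ p_new) \
               - (log_prob(x)     - 0.5 * p @ p)
    if np.log(rng.uniform()) < log_accept:
        return x_new
    return x
```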
Together with Tore Selland Kleppe (Stavanger, Norway), released a preprint presenting 2.5/3.5-order L^2-accurate Randomized Runge-Kutta-Nyström methods to approximate the Hamiltonian flow within unadjusted Hamiltonian Monte Carlo and unadjusted kinetic Langevin chains.
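To illustrate the general idea of randomized time integrators for Hamiltonian flow, here is a sketch of the randomized-midpoint construction in the spirit of Shen and Lee (2019); it is not the 2.5/3.5-order Randomized Runge-Kutta-Nyström schemes of the preprint, only a simpler scheme built on the same random-evaluation-point idea:

```python
import numpy as np

def randomized_midpoint_step(q, p, force, h, rng):
    """One step of a generic randomized integrator for the Hamiltonian system
    q' = p, p' = force(q) (unit mass).  The force is evaluated at a uniformly
    random intermediate time, making the quadrature unbiased; this random
    evaluation point is the basic mechanism behind the L^2-accuracy gains of
    randomized schemes.  NOT the preprint's Runge-Kutta-Nystrom methods."""
    u = rng.uniform()                                     # random time u * h
    q_star = q + u * h * p + 0.5 * (u * h)**2 * force(q)  # cheap predictor for q(u*h)
    f_star = force(q_star)
    q_new = q + h * p + h**2 * (1.0 - u) * f_star  # unbiased for integral of (h-s)*F(q(s)) ds
    p_new = p + h * f_star                         # unbiased for integral of F(q(s)) ds
    return q_new, p_new
```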
Together with Katharina Schuh (TU Wien), released a preprint on a nonlinear generalization of Hamiltonian Monte Carlo for sampling from nonlinear probability measures of mean-field type.
Together with Stefan Oberdörster (Bonn), released a preprint presenting a new tool for total variation mixing time analysis of Metropolis-adjusted Markov kernels via couplings without restrictive assumptions on either the …
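As background on the coupling approach to total variation mixing (standard material, distinct from the preprint's new tool): if (X_t, Y_t) is any coupling of the chain started from x with a stationary copy, then

\[
\big\| \delta_x P^t - \pi \big\|_{\mathrm{TV}} \;\le\; \mathbb{P}\big( X_t \neq Y_t \big),
\]

so total variation mixing-time bounds follow from bounds on the time at which the two coupled chains meet.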