Tuesday 18 November 2025

Speakers: Nikolay Malkin, Sanghyeok Choi, Kirill Tamogashev

Title: How to sample (and more) with diffusions (and more)

Abstract: This three-part presentation will be about sampling Bayesian posterior distributions using Monte Carlo and learning-based methods. This ubiquitous problem in statistics, machine learning, and scientific applications (inverse imaging, molecular dynamics, uncertainty quantification using Bayesian models, etc.) can be formulated as that of drawing samples from a target distribution given only an unnormalised density function that can be queried at any point. NM will present background on the sampling problem and solutions using generative models, particularly those using a reverse diffusion process as a variational approximation to an intractable target. SC will show how such generative models can act as adaptive proposal kernels for Monte Carlo sampling algorithms and, conversely, how Monte Carlo algorithms can assist the training of these proposals, the subject of his recent paper. KT will present generalisations to learning transports between a pair of arbitrary distributions, allowing inference of stochastic dynamics without data samples and, when applied in the latent spaces of pretrained generative models, data-free style transfer for images.
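As a minimal illustration of the problem setup described above (not material from the talk itself), the sketch below draws samples from a target known only through an unnormalised log-density, using unadjusted Langevin dynamics, a classical Monte Carlo baseline that the learning-based samplers discussed in the abstract aim to improve on. The toy two-mode target, step size, and chain count are illustrative choices of ours.

    # Sketch: sampling from an unnormalised density via unadjusted Langevin
    # dynamics. The target, step size, and chain count are illustrative only.
    import torch

    def log_density(x):
        # Unnormalised log-density of a toy 2-D mixture of two unit Gaussians.
        centres = torch.tensor([[-2.0, 0.0], [2.0, 0.0]])
        d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        return torch.logsumexp(-0.5 * d2, dim=-1)

    def langevin_sample(n_steps=1000, n_chains=512, step=0.05):
        x = torch.randn(n_chains, 2)  # initialise chains from a standard normal
        for _ in range(n_steps):
            x = x.detach().requires_grad_(True)
            # The score (gradient of the log-density) only needs query access
            # to the unnormalised density, not its normalising constant.
            score = torch.autograd.grad(log_density(x).sum(), x)[0]
            # Euler-Maruyama step of the Langevin SDE dX = grad log p(X) dt + sqrt(2) dW
            x = x.detach() + step * score + (2 * step) ** 0.5 * torch.randn_like(x)
        return x

    samples = langevin_sample()
    print(samples.mean(0), samples.std(0))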

Biographies: Sanghyeok Choi is a first-year PhD student in Informatics supervised by Nikolay Malkin and Henry Gouk. He is focusing on machine learning algorithms for probabilistic inference, with applications in structure learning and generative modelling. Before starting his PhD, he worked on improved exploration methods in reinforcement learning (mainly GFlowNets) and their application to optimisation problems. He earned his MS in Industrial and Systems Engineering from KAIST and his BSc in Industrial Engineering from Seoul National University. To our great misfortune, he thinks amortised structure inference will satisfy safety regulations.

Kirill Tamogashev is a first-year PhD student supervised by Nikolay Malkin, focusing on Bayesian posterior inference and diffusion-based generative modelling. From his BSc in Economics he remembers two things: marginalism and the Vandermonde determinant. Although he struggles with applications of the latter, his knowledge of the former convinced him that the principles of Marshall should be applied not only to commodities, but also to latent variables and probability distributions. This led him to an MSc in Applied Mathematics from Skoltech and HSE, where he learnt that distributions can be not only marginalised, but also inferred – Thomas Bayes taught him that. This spurred his interest in a research career and brought him to Edinburgh, where he now focuses on learning marginal distributions using Bayesian inference — presumably to find the marginal utility of a latent variable.

Nikolay Malkin is a Chancellor's Fellow, working on algorithms for probabilistic inference and Bayesian machine learning. Before the University of Edinburgh, they were a postdoctoral researcher at Mila — Québec AI Institute in Montréal, which reinforced their belief — formed in the course of PhD research in algebraic geometry at Yale University — that Jensen's inequality is a historical injustice that must be rectified by variational methods. Despite never having proved an inequality without the aid of superhuman inspiration, they're open to collaboration on this front.