Reading Seminar · Fall 2026

Flows, Diffusions, and Bayesian Inference

Organized by Justin Domke and Joohwan Ko · UMass Amherst
This seminar will cover recent advances at the intersection of flow-based generative modeling, diffusion processes, and Bayesian methods. Topics include normalizing flows, diffusion models, diffusion-based inference methods, flow matching, and neural samplers. We will focus on recent research with state-of-the-art results, paying particular attention to the underlying fundamental concepts (e.g., stochastic differential equations). By the end of the semester, participants should have a unified mathematical picture of how continuous-time dynamics can be learned to transport between distributions, for both generation and inference.
Prerequisites
There are no strict course prerequisites, but we assume attendees are familiar with most of the topics covered in CS 688, CS 689, or equivalent courses, especially probabilistic inference methods. Some basic knowledge of deep generative models, including normalizing flows and diffusion models, will also help in following the papers.
Below is a tentative list of topics and papers we may cover over the semester.
Flow Matching Foundations
Flow Matching for Generative Modeling · Lipman, Chen, Ben-Hamu, Nickel, Le (2022)
Simulation-free training of continuous normalizing flows (CNFs) by regressing onto conditional vector fields.
Also see: Flow Matching on General Geometries (Chen & Lipman, 2024) · Flow Matching Guide and Code (Lipman et al., 2024)
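The core recipe can be sketched in a few lines: sample a noise/data pair, interpolate, and regress a vector field onto the conditional velocity, with no ODE simulation during training. A minimal NumPy sketch (the function names and the toy 2-D data are our own, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_training_pair(x0, x1, t):
    """Linear (optimal-transport) conditional path: x_t = (1-t) x0 + t x1.
    The regression target is the constant conditional velocity u_t = x1 - x0."""
    xt = (1.0 - t)[:, None] * x0 + t[:, None] * x1
    ut = x1 - x0
    return xt, ut

def cfm_loss(vector_field, x0, x1, t):
    """Monte Carlo estimate of the conditional flow matching objective
    E || v(x_t, t) - (x1 - x0) ||^2 (simulation-free: no ODE solve)."""
    xt, ut = cfm_training_pair(x0, x1, t)
    pred = vector_field(xt, t)
    return np.mean(np.sum((pred - ut) ** 2, axis=-1))

# Toy check: noise x0 ~ N(0, I), "data" x1 shifted by +3.  The mean OT
# velocity is then close to 3, so a constant field v = 3 should score
# better than v = 0.
x0 = rng.standard_normal((256, 2))
x1 = rng.standard_normal((256, 2)) + 3.0
t = rng.uniform(size=256)
loss_const3 = cfm_loss(lambda x, t: np.full_like(x, 3.0), x0, x1, t)
loss_zero = cfm_loss(lambda x, t: np.zeros_like(x), x0, x1, t)
assert loss_const3 < loss_zero
```

In practice the lambda is replaced by a neural network and the loss is minimized by stochastic gradient descent; the construction above is exactly what makes training "simulation-free".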
Building Normalizing Flows with Stochastic Interpolants · Albergo & Vanden-Eijnden (2022)
An independent, complementary derivation of flow matching via stochastic interpolants.
Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow · Liu, Gong, Liu (2022)
Straightening transport paths for fewer-step ODE sampling.
Also see: Scaling Rectified Flow Transformers for High-Resolution Image Synthesis (Esser et al., 2024 — Stable Diffusion 3)
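The payoff of straightening is that coarse Euler discretization becomes cheap, and in the limit of perfectly straight paths, exact. A toy NumPy sketch (the constant "straight" field here is a hypothetical stand-in for a trained rectified flow):

```python
import numpy as np

def euler_sample(vector_field, x0, n_steps):
    """Integrate dx/dt = v(x, t) from t = 0 to t = 1 with Euler steps."""
    x, dt = x0.copy(), 1.0 / n_steps
    for i in range(n_steps):
        x = x + dt * vector_field(x, i * dt)
    return x

rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 2))

# If the field is constant along each trajectory (perfectly "straight"
# paths, the goal of rectification), one Euler step is already exact.
v = lambda x, t: np.full_like(x, 3.0)   # hypothetical straight field
one_step = euler_sample(v, x0, 1)
many_steps = euler_sample(v, x0, 100)
assert np.allclose(one_step, many_steps)   # step count no longer matters
assert np.allclose(one_step, x0 + 3.0)
```

Curved learned fields accumulate discretization error per step, which is why rectification (re-fitting on the model's own couplings) reduces the number of function evaluations needed at sampling time.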
Stochastic Interpolants: A Unifying Framework for Flows and Diffusions · Albergo, Boffi, Vanden-Eijnden (2023)
Unifies deterministic flows and stochastic diffusions under one construction.
Diffusion & Score Connections
Score-Based Generative Modeling through Stochastic Differential Equations · Song, Sohl-Dickstein, Kingma, Kumar, Ermon, Poole (2021)
The SDE/ODE duality unifying score-based and diffusion models.
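The duality can be checked numerically on a Gaussian toy case where the score is known in closed form: the noisy forward SDE and the deterministic probability-flow ODE transport the same marginals. A NumPy sketch (the choice g = 1 and the Gaussian initial law are our own toy assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, steps = 100_000, 200
dt = 1.0 / steps

# Forward VE-style SDE dx = dW (g = 1), starting from x0 ~ N(0, 1).
# Its marginals are N(0, 1 + t).
x_sde = rng.standard_normal(n)
for _ in range(steps):
    x_sde += np.sqrt(dt) * rng.standard_normal(n)

# Probability-flow ODE with the known Gaussian score s(x, t) = -x / (1 + t):
# dx/dt = -(1/2) g^2 s(x, t) = x / (2 (1 + t)).  Same marginals, no noise.
x_ode = rng.standard_normal(n)
for i in range(steps):
    t = i * dt
    x_ode += dt * x_ode / (2.0 * (1.0 + t))

# Both transport N(0, 1) to (approximately) N(0, 2) at t = 1.
assert abs(np.var(x_sde) - 2.0) < 0.05
assert abs(np.var(x_ode) - 2.0) < 0.05
```

The same score function drives both processes, which is what lets one model support stochastic (SDE) sampling, deterministic (ODE) sampling, and exact likelihood evaluation.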
Elucidating the Design Space of Diffusion-Based Generative Models · Karras, Aittala, Aila, Laine (2022)
Practical anatomy of diffusion: noise schedules, preconditioning, discretization.
Tong, Malkin, Fatras, Atanackovic, Zhang, Bengio, Wolf (2023)
Minibatch OT couplings for straighter, more efficient flow matching paths.
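The coupling step can be illustrated exactly on a tiny batch by brute-forcing the assignment problem (real implementations use a Hungarian or entropic solver; the helper below is our own toy, not the paper's code):

```python
import itertools
import numpy as np

def minibatch_ot_pairs(x0, x1):
    """Exact OT re-pairing of a (tiny) minibatch: brute-force the
    permutation minimizing total squared-Euclidean assignment cost."""
    cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)
    idx = list(range(len(x0)))
    best = min(itertools.permutations(idx),
               key=lambda p: cost[idx, list(p)].sum())
    return x0, x1[list(best)]

rng = np.random.default_rng(0)
x0 = rng.standard_normal((6, 2))           # noise batch
x1 = rng.standard_normal((6, 2)) + 4.0     # "data" batch

a, b = minibatch_ot_pairs(x0, x1)
ot_cost = ((a - b) ** 2).sum(-1).mean()
rand_cost = ((x0 - x1) ** 2).sum(-1).mean()
# The OT pairing never has larger mean cost than the independent pairing;
# shorter, non-crossing pairs give straighter probability paths.
assert ot_cost <= rand_cost + 1e-12
```

Training with these re-paired minibatches (rather than independent noise/data draws) is what yields straighter flow matching paths and fewer sampling steps.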
Fast Generation & Consistency
Consistency Models · Song, Dhariwal, Chen, Sutskever (2023)
One-step generation by mapping trajectory points to endpoints.
Also see: Improved Techniques for Training Consistency Models (Song & Dhariwal, 2024) · Consistency Trajectory Models (Kim et al., 2024)
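A key ingredient is a parameterization that enforces the boundary condition f(x, eps) = x by construction, so the model is forced to map every point on a trajectory to the same endpoint. A NumPy sketch (constants follow EDM-style choices; the tanh "network" is a hypothetical stand-in):

```python
import numpy as np

SIGMA_DATA, EPS = 0.5, 0.002   # EDM-style constants

def consistency_fn(F, x, t):
    """f(x, t) = c_skip(t) * x + c_out(t) * F(x, t), chosen so that
    c_skip(eps) = 1 and c_out(eps) = 0, i.e. f(x, eps) = x exactly."""
    c_skip = SIGMA_DATA ** 2 / ((t - EPS) ** 2 + SIGMA_DATA ** 2)
    c_out = SIGMA_DATA * (t - EPS) / np.sqrt(SIGMA_DATA ** 2 + t ** 2)
    return c_skip * x + c_out * F(x, t)

x = np.array([1.0, -2.0, 3.0])
F = lambda x, t: np.tanh(x)    # stand-in for the free network output

# At t = eps the skip path dominates completely: the boundary holds
# regardless of what the network outputs.
assert np.allclose(consistency_fn(F, x, EPS), x)
```

Training then only needs to enforce self-consistency between adjacent noise levels, since the boundary condition is already built in.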
(2024)
Consistency models as flow map learning via stochastic interpolants.
Sampling & Stochastic Optimal Control
Denoising Diffusion Samplers · Vargas, Grathwohl, Doucet (2023)
Diffusion for sampling from unnormalized densities, not just generation.
Path Integral Sampler: A Stochastic Control Approach for Sampling · Zhang & Chen (2021)
Sampling as stochastic optimal control via Feynman–Kac path integrals.
Also see: Stochastic Optimal Control Matching (Domingo-Enrich et al., 2024)
Adjoint Matching: Fine-Tuning Flow and Diffusion Generative Models with Memoryless Stochastic Optimal Control · Domingo-Enrich, Drozdzal, Karrer, Chen (2024)
A memoryless SOC objective that avoids backpropagating through the simulated dynamics.
(2025)
Extends adjoint matching to sampling from unnormalized targets at scale.
Also see: Iterated Denoising Energy Matching (Akhound-Sadegh et al., 2024)
Bridges, Discrete Domains & Broader Connections
Shi, De Bortoli, Campbell, Deligiannidis (2023)
Entropy-regularized optimal transport via iterative matching.
Also see: I²SB: Image-to-Image Schrödinger Bridge (Liu et al., 2023)
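The static analogue of the Schrödinger bridge problem is entropy-regularized optimal transport, which classic Sinkhorn iterations solve. A self-contained NumPy sketch with uniform marginals (illustrative only; this is the static problem, not the paper's dynamic matching algorithm):

```python
import numpy as np

def sinkhorn(cost, reg, n_iters=200):
    """Entropy-regularized OT via Sinkhorn scaling: alternately rescale
    the Gibbs kernel K = exp(-cost/reg) to match uniform marginals."""
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)
    K = np.exp(-cost / reg)
    v = np.ones(m)
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]   # the coupling matrix

cost = np.array([[0.0, 1.0],
                 [1.0, 0.0]])
P = sinkhorn(cost, reg=0.1)
assert np.allclose(P.sum(axis=1), 0.5, atol=1e-6)   # marginals respected
assert P[0, 0] > P[0, 1]                             # mass prefers low cost
```

The iterative projections here mirror the alternating half-bridge updates in iterative proportional fitting, which the matching formulation in the paper makes scalable in continuous time.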
Generator Matching: Generative Modeling with Arbitrary Markov Processes · Holderrieth et al. (2024)
Generalizes flow/diffusion matching to arbitrary Markov processes.
Also see: Discrete Flow Matching (Campbell et al., 2024)
(2025)
Symmetry-aware learned proposals for discrete sampling.