This seminar will cover recent advances at the intersection of flow-based generative modeling, diffusion processes, and Bayesian methods. Topics will include normalizing flows, diffusion models, diffusion-based inference methods, flow matching, and neural samplers. We will focus on recent research with state-of-the-art results, paying particular attention to the underlying fundamental concepts (e.g., stochastic differential equations). By the end of the semester, participants should have a unified mathematical picture of how continuous-time dynamics can be learned to transport between distributions, for both generation and inference.
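To make the idea of "continuous-time dynamics transporting between distributions" concrete, here is a minimal sketch in one dimension. In the papers we will read, the velocity field is a learned neural network; here, as a toy assumption, we use the analytically known field for the straight-line (flow-matching-style) interpolation between a standard Gaussian and a target Gaussian, and integrate the resulting ODE with forward Euler. All variable names (`m`, `s`, `velocity`) are ours, not from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Source distribution: N(0, 1). Target: N(m, s^2).
m, s = 2.0, 0.5

def velocity(x, t):
    # Velocity of the straight-line path x_t = (1 - t) * x0 + t * x1
    # under the monotone coupling x1 = m + s * x0. Eliminating x0 gives
    # an affine field in x; its flow maps N(0, 1) onto N(m, s^2) at t = 1.
    return m + (s - 1.0) * (x - t * m) / (1.0 - t + t * s)

# Integrate dx/dt = velocity(x, t) from t = 0 to t = 1 with forward Euler,
# starting from samples of the source distribution.
n_steps = 100
dt = 1.0 / n_steps
x = rng.standard_normal(10_000)
for k in range(n_steps):
    x = x + dt * velocity(x, k * dt)

print(f"mean ~ {x.mean():.3f} (target {m}), std ~ {x.std():.3f} (target {s})")
```

The learning problem in flow matching is essentially to recover such a field by regression when it is not available in closed form; diffusion models play the same game with stochastic rather than deterministic dynamics.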
Prerequisites
There are no strict prerequisite courses, but we assume attendees are familiar with most of the topics covered in CS 688, CS 689, or equivalent, especially probabilistic inference methods. Some basic knowledge of flow-based models, including normalizing flows and diffusion models, would also be helpful for following the papers.
Below are topics and papers we may cover throughout the semester.