Neural denoising diffusion models
Denoising diffusion probabilistic models (DDPMs), score-based generative models, generative diffusion processes, neural energy models…
November 11, 2021 — December 6, 2023
Placeholder.
AFAICS, these are generative models that use score matching to learn the data distribution and Langevin MCMC to sample from it. Various tricks are needed to make this work via successive denoising steps, with an interpretation in terms of diffusion SDEs. I am vaguely aware that this oversimplifies a rich and interesting history in which many useful techniques converged, but I have not invested enough time to claim actual expertise.
1 Training: score matching
Denoising score matching (Hyvärinen 2005). See score matching or McAllester (2023) for an introduction to the general idea.
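A minimal sketch of the idea, on a toy 1-D Gaussian where everything can be checked in closed form: corrupt data with Gaussian noise, then fit a score model by regressing onto the (scaled) noise. Here the score model is deliberately just linear, `s(x) = a * x`, so the denoising score-matching objective reduces to least squares; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5       # corruption noise level
n = 100_000

x = rng.standard_normal(n)      # data ~ N(0, 1)
eps = rng.standard_normal(n)    # corruption noise
x_tilde = x + sigma * eps       # noised samples

# Denoising score matching minimises E[(s(x_tilde) + eps / sigma)^2].
# For a linear model s(x) = a * x this is ordinary least squares in a:
a = -np.mean(x_tilde * eps / sigma) / np.mean(x_tilde**2)

# The noised density is N(0, 1 + sigma^2), whose true score is
# -x / (1 + sigma^2), so the fitted slope should recover that coefficient.
print(a, -1 / (1 + sigma**2))
```

In a real diffusion model the linear map is replaced by a neural network and the objective is averaged over a schedule of noise levels, but the regression target is the same.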
2 Sampling: Langevin dynamics
See Langevin samplers.
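Once we have a score estimate, sampling can proceed by (unadjusted) Langevin dynamics: repeatedly step along the score plus injected Gaussian noise. A minimal sketch targeting a standard normal, whose score we know exactly (parameter names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x):
    # Score of the standard normal target: grad log p(x) = -x.
    # In a diffusion model this would be the learned score network.
    return -x

eta = 0.1          # step size
n_steps = 1_000
n_chains = 5_000

x = 3.0 * rng.standard_normal(n_chains)   # deliberately dispersed initialisation
for _ in range(n_steps):
    x = x + 0.5 * eta * score(x) + np.sqrt(eta) * rng.standard_normal(n_chains)

# Chains should settle near the target: mean ~ 0, variance ~ 1,
# up to O(eta) discretisation bias since there is no Metropolis correction.
print(x.mean(), x.var())
```

Diffusion-model samplers refine this basic loop by annealing the noise level, so early steps use a heavily smoothed score and later steps a sharper one.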
3 Image generation in particular
See image generation with diffusion.
4 Latent diffusion
4.1 Generic
4.2 CLIP
Radford et al. (2021)
5 Diffusion on weird spaces
5.1 Proteins
Baker Lab (Torres et al. 2022; Watson et al. 2022)
6 Incoming
- Lilian Weng, What are Diffusion Models?
- Yang Song, Generative Modeling by Estimating Gradients of the Data Distribution
- Sander Dieleman, Diffusion models are autoencoders
- CVPR tutorial, Denoising Diffusion-based Generative Modeling: Foundations and Applications, with accompanying video
- What’s the score? (Review of latest Score Based Generative Modeling papers.)
- Anil Ananthaswamy, The Physics Principle That Inspired Modern AI Art
Suggestive connection to thermodynamics (Sohl-Dickstein et al. 2015).