Neural denoising diffusion models with non-Gaussian increments
2024-04-16 — 2025-03-16
Wherein Poisson‑like increments are considered, and connections to Poisson Flow Generative Models via an extra‑dimension embedding parameter D are drawn, while perturbation‑based objectives are described.
approximation
Bayes
generative
Monte Carlo
neural nets
optimization
probabilistic algorithms
probability
score function
statistics
Placeholder. Notes on diffusion models whose forward noising processes do not assume Gaussian increments. Are these useful, and when?
Related but distinct: Diffusion models on discrete state spaces where the terminal distribution is categorical.
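On the PFGM++ connection mentioned above: a minimal sketch, not taken from any of the cited codebases, of how the extra-dimension parameter D interpolates between heavy-tailed Poisson-field perturbations and Gaussian ones. The PFGM++ perturbation kernel p_r(x|y) ∝ (‖x−y‖² + r²)^(−(N+D)/2) is a multivariate Student-t with D degrees of freedom and scale r/√D, so it can be sampled with one Gaussian draw and one chi-squared draw; as D → ∞ it tends to N(y, (r²/D)I), recovering the Gaussian kernel of ordinary diffusion with σ = r/√D. The function name is my own.

```python
import numpy as np

def pfgmpp_perturb(y, r, D, rng=None):
    """Sample from the PFGM++ perturbation kernel
        p_r(x | y) ∝ (||x - y||^2 + r^2)^{-(N+D)/2},
    i.e. a multivariate Student-t with D degrees of freedom and
    scale r / sqrt(D).  Small D gives heavy tails (the Poisson-field
    regime); D -> infinity recovers the Gaussian kernel of standard
    diffusion with sigma = r / sqrt(D)."""
    rng = np.random.default_rng(rng)
    y = np.asarray(y, dtype=float)
    g = rng.standard_normal(y.shape)   # isotropic Gaussian direction
    chi2 = rng.chisquare(D)            # shared radial mixing variable
    # Gaussian scale mixture: dividing by sqrt(chi2) yields Student-t.
    return y + r * g / np.sqrt(chi2)
```

For small D the draws are occasionally enormous (Cauchy-like tails at D = 1), while for large D the sample standard deviation concentrates around r/√D, which is the sense in which PFGM++ nests Gaussian diffusion as a limit.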
References
Chung, Kim, McCann, et al. 2023. “Diffusion Posterior Sampling for General Noisy Inverse Problems.” In.
Han, Zheng, and Zhou. 2022. “CARD: Classification and Regression Diffusion Models.”
Hoogeboom, Nielsen, Jaini, et al. 2021. “Argmax Flows and Multinomial Diffusion: Learning Categorical Distributions.” In Proceedings of the 35th International Conference on Neural Information Processing Systems. NIPS ’21.
Kim, and Ye. 2021. “Noise2Score: Tweedie’s Approach to Self-Supervised Image Denoising Without Clean Images.” In.
Liu, Luo, Xu, et al. 2023. “GenPhys: From Physical Processes to Generative Models.”
Nie, Zhu, You, et al. 2025. “Large Language Diffusion Models.”
Xu, Liu, Tegmark, et al. 2022. “Poisson Flow Generative Models.” In Proceedings of the 36th International Conference on Neural Information Processing Systems. NIPS ’22.
Xu, Liu, Tian, et al. 2023. “PFGM++: Unlocking the Potential of Physics-Inspired Generative Models.” In Proceedings of the 40th International Conference on Machine Learning. ICML’23.
Yang, Zhang, Song, et al. 2023. “Diffusion Models: A Comprehensive Survey of Methods and Applications.” ACM Computing Surveys.
