Variational message passing with high-dimensional and functional nodes

November 11, 2022 — November 2, 2024

Tags: algebra, approximation, Bayes, distributed, dynamical systems, generative, graphical models, machine learning, Monte Carlo, networks, neural nets, optimization, particle, probabilistic algorithms, probability, signal processing, state space models, statistics, stochastic processes, swarm, time series
How can we generalise variational message passing to models with high-dimensional and functional nodes?
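For orientation, here is a minimal sketch of what ordinary variational message passing looks like when a node is a single low-dimensional exponential-family variable: messages are natural-parameter vectors, and the posterior update is just their sum. The model (a Gaussian mean with known noise and a Gaussian prior) and all the numbers are my own toy assumptions, not taken from the references.

```python
# Toy variational message passing on a conjugate model:
# y_i ~ N(mu, sigma^2) with sigma known and a Gaussian prior on mu.
# Messages are natural-parameter pairs (precision, precision * mean);
# the variational posterior over mu is their sum, in closed form.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed for illustration).
mu_true, sigma = 2.0, 0.5
y = rng.normal(mu_true, sigma, size=20)

# Prior factor -> mu message, in natural parameters.
prior_prec, prior_mean = 1.0, 0.0
msg_prior = np.array([prior_prec, prior_prec * prior_mean])

# Likelihood factor -> mu messages: each observation contributes
# (1 / sigma^2, y_i / sigma^2).
msg_lik = np.stack([np.full_like(y, sigma**-2), y * sigma**-2], axis=1)

# Posterior natural parameters are the sum of the incoming messages.
post_prec, post_shift = msg_prior + msg_lik.sum(axis=0)
post_mean, post_var = post_shift / post_prec, 1.0 / post_prec
print(f"posterior over mu: mean={post_mean:.3f}, var={post_var:.4f}")
```

Nothing so tidy is available when the node is a dense random field or an unknown function: the sufficient statistics become enormous covariance matrices or infinite-dimensional objects, which is what the ensemble and particle approximations below try to work around.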

1 Ensemble methods
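One option, in the ensemble Kalman tradition, is to represent each node's belief by an ensemble of samples and estimate message moments empirically rather than forming and inverting full covariance matrices. Below is a minimal stochastic ensemble-Kalman-style update for a single linear-Gaussian observation factor. It is my own toy illustration of the flavour of this approach, not the algorithm of MacKinlay, Tsuchida, Pagendam, et al. (2024); all dimensions and model choices are assumptions.

```python
# Ensemble approximation of a Gaussian message from a linear-Gaussian
# observation factor y = H x + noise: carry samples per node and estimate
# the message moments empirically, instead of exact d x d covariances.
import numpy as np

rng = np.random.default_rng(1)

d, m, n_ens = 50, 10, 200            # state dim, obs dim, ensemble size
H = rng.normal(size=(m, d)) / np.sqrt(d)
R = 0.1 * np.eye(m)                  # observation noise covariance

x_true = rng.normal(size=d)
y = H @ x_true + rng.multivariate_normal(np.zeros(m), R)

# Prior belief at the node, represented by an ensemble (rows are members).
X = rng.normal(size=(n_ens, d))

# Empirical cross-covariance and observation-space covariance.
Xc = X - X.mean(axis=0)
Yp = X @ H.T                         # predicted observations per member
Yc = Yp - Yp.mean(axis=0)
P_xy = Xc.T @ Yc / (n_ens - 1)
P_yy = Yc.T @ Yc / (n_ens - 1) + R

# Kalman-gain-style update, applied member by member with perturbed obs.
K = P_xy @ np.linalg.solve(P_yy, np.eye(m))
Y_obs = y + rng.multivariate_normal(np.zeros(m), R, size=n_ens)
X_post = X + (Y_obs - Yp) @ K.T

print("posterior ensemble mean error:",
      np.linalg.norm(X_post.mean(axis=0) - x_true))
```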

2 Particle-filter-style methods
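The other obvious route is particle-style messages: represent beliefs by weighted samples, so that arbitrary nonlinear factors can be handled, at the cost of weight degeneracy as dimension grows (the obstacle documented by Snyder, Bengtsson, Bickel, et al. 2008). Here is a minimal bootstrap particle filter on a 1-D toy state-space model, just to fix ideas; the model is my own assumption, not taken from the references.

```python
# Bootstrap particle filter on a 1-D nonlinear state-space model:
# propagate particles through the transition, weight by the observation
# likelihood, then resample. Exact in the limit of many particles, but the
# effective sample size collapses in high-dimensional states.
import numpy as np

rng = np.random.default_rng(2)

T, n_part = 50, 500
sigma_x, sigma_y = 0.5, 0.3

def step(x):                          # nonlinear transition (assumed)
    return 0.9 * x + np.sin(x)

# Simulate a trajectory and noisy observations.
x, xs, ys = 0.0, [], []
for _ in range(T):
    x = step(x) + rng.normal(0, sigma_x)
    xs.append(x)
    ys.append(x + rng.normal(0, sigma_y))

# Filter: propagate, weight, estimate, resample.
particles = rng.normal(0, 1, n_part)
means = []
for y in ys:
    particles = step(particles) + rng.normal(0, sigma_x, n_part)
    logw = -0.5 * ((y - particles) / sigma_y) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    means.append(np.sum(w * particles))
    idx = rng.choice(n_part, size=n_part, p=w)   # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((np.array(means) - np.array(xs)) ** 2))
print(f"filter RMSE: {rmse:.3f}")
```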

3 References

Bao, Chipilski, Liang, et al. 2024. “Nonlinear Ensemble Filtering with Diffusion Models: Application to the Surface Quasi-Geostrophic Dynamics.”
Bao, Zhang, and Zhang. 2024. “An Ensemble Score Filter for Tracking High-Dimensional Nonlinear Dynamical Systems.”
Beskos, Crisan, Jasra, et al. 2017. “A Stable Particle Filter for a Class of High-Dimensional State-Space Models.” Advances in Applied Probability.
Boopathy, Muppidi, Yang, et al. 2024. “Resampling-Free Particle Filters in High-Dimensions.”
MacKinlay, Tsuchida, Pagendam, et al. 2024. “Gaussian Ensemble Belief Propagation for Efficient Inference in High-Dimensional Systems.”
Ranganathan. 2007. “A Unifying View of Message Passing Algorithms for Gaussian MRFs.”
Rombach, Blattmann, Lorenz, et al. 2022. “High-Resolution Image Synthesis with Latent Diffusion Models.” In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR).
Rozet, and Louppe. 2023a. “Score-Based Data Assimilation.”
———. 2023b. “Score-Based Data Assimilation for a Two-Layer Quasi-Geostrophic Model.”
Snyder, Bengtsson, Bickel, et al. 2008. “Obstacles to High-Dimensional Particle Filtering.” Monthly Weather Review.
Vahdat, Kreis, and Kautz. 2021. “Score-Based Generative Modeling in Latent Space.”
Wüthrich, Bohg, Kappler, et al. 2015. “The Coordinate Particle Filter - A Novel Particle Filter for High Dimensional Systems.” In 2015 IEEE International Conference on Robotics and Automation (ICRA).
Zhuo, Liu, Shi, et al. 2018. “Message Passing Stein Variational Gradient Descent.” In Proceedings of the 35th International Conference on Machine Learning.