Particle variational message passing

Graphical inference using empirical distribution estimates

July 25, 2014 — November 14, 2024

Tags: approximation, Bayes, distributed, dynamical systems, generative, graphical models, machine learning, Monte Carlo, networks, optimization, particle, probabilistic algorithms, probability, sciml, signal processing, state space models, statistics, stochastic processes, swarm, time series

Empirical CDFs: Can they be used to approximate belief propagation updates, or any other kind of variational message passing algorithm?

Several variational methods can be understood as using empirical distribution estimates to do message passing. One obvious candidate is Stein variational gradient descent message passing, which constructs the ensemble by solving an optimisation problem. Another is ensemble Kalman filtering, which stochastically perturbs a fixed-size population to approximate the posterior; run over a general graph, that becomes Gaussian Ensemble Belief Propagation (MacKinlay et al. 2024).
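To make the "stochastic perturbation" idea concrete, here is a minimal sketch of the generic perturbed-observation EnKF analysis step (not the GEBP algorithm itself); the function name and interface are mine, assuming a linear observation operator `H` and Gaussian observation noise `R`.

```python
import numpy as np

def enkf_update(X, H, R, y, rng):
    """Perturbed-observation EnKF analysis step (hypothetical interface).

    X: (n_ens, d) forecast ensemble; H: (m, d) linear observation
    operator; R: (m, m) observation noise covariance; y: (m,) data.
    Each member is nudged toward its own stochastically perturbed copy
    of the observation, so the updated ensemble spread approximates the
    posterior covariance.
    """
    n = X.shape[0]
    A = X - X.mean(axis=0)                           # ensemble anomalies
    P = A.T @ A / (n - 1)                            # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    eps = rng.multivariate_normal(np.zeros(len(y)), R, size=n)
    return X + (y + eps - X @ H.T) @ K.T
```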

This page is about the particle-filter analogue, which would use an importance-sampling-like update. How does that work? TBD

1 Basic particle belief propagation

Ihler and McAllester (2009) introduce the basic scheme: each node keeps a set of particles drawn from some proposal distribution, and BP messages are estimated at those particles by importance-weighted sums over the neighbouring node's particles.
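In my notation, for a pairwise MRF with node potentials $\psi_t$ and edge potentials $\psi_{ts}$, the message from $t$ to $s$ evaluated at $s$'s particles is the importance-weighted average

$$\hat m_{t\to s}\big(x_s^{(i)}\big) = \frac{1}{M} \sum_{j=1}^{M} \frac{\psi_t\big(x_t^{(j)}\big)\,\psi_{ts}\big(x_t^{(j)}, x_s^{(i)}\big)}{q_t\big(x_t^{(j)}\big)} \prod_{u \in N(t)\setminus s} \hat m_{u\to t}\big(x_t^{(j)}\big).$$

A minimal sketch of that update (function names are mine, not from the paper):

```python
import numpy as np

def pbp_message(x_s, x_t, q_t, psi_t, psi_ts, incoming):
    """One particle-BP message update m_{t->s}, evaluated at s's particles.

    x_s      : (N,) particles at the receiving node s
    x_t      : (M,) particles at the sending node t, drawn from proposal q_t
    q_t      : (M,) proposal density values q_t(x_t^(j))
    psi_t    : callable, node potential psi_t(x)
    psi_ts   : callable, pairwise potential psi_ts(x_t, x_s)
    incoming : (M,) product of messages into t from neighbours other
               than s, evaluated at t's particles
    """
    # Importance weights for t's particles: potential times incoming
    # messages, divided by the proposal density.
    w = psi_t(x_t) * incoming / q_t                   # (M,)
    # Average the pairwise potential over t's weighted particles,
    # once for each of s's particles.
    pairwise = psi_ts(x_t[:, None], x_s[None, :])     # (M, N)
    return (w[:, None] * pairwise).mean(axis=0)       # (N,)
```

Iterating this around the graph, with occasional resampling of each node's particles from its estimated belief, gives the basic scheme.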

2 Expectation particle belief propagation

The expectation form reputedly works better (Lienart, Teh, and Doucet 2015): rather than drawing each node's particles from the current belief estimate, it adaptively fits tractable exponential-family proposals to the messages, expectation-propagation style, which keeps the importance weights better behaved.
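As a crude stand-in for that proposal-adaptation step, assuming everything is one-dimensional and the exponential family is Gaussian (the function name and interface are mine, not the paper's): fit a Gaussian to the weighted particle set by moment matching and redraw fresh particles from the fit.

```python
import numpy as np

def moment_matched_proposal(x, w, n_new, rng):
    """Fit a 1-d Gaussian to weighted particles, then redraw.

    A simplification of EPBP's proposal step: the paper fits
    exponential-family approximations to the messages themselves via
    expectation-propagation-like updates; here we just moment-match
    the weighted particle set (x, w) and sample n_new new particles.
    """
    w = w / w.sum()
    mu = np.sum(w * x)
    var = np.sum(w * (x - mu) ** 2) + 1e-9   # jitter to avoid degeneracy
    return rng.normal(mu, np.sqrt(var), size=n_new)
```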

3 Stein variational gradient descent

If we define the kernel so that it factorises over the graph, with each particle interacting only through its node's Markov blanket, then Stein variational gradient descent decomposes into local message updates. This was discovered simultaneously in 2018 by Wang, Zeng, and Liu (2018) and Zhuo et al. (2018).
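For reference, here is a minimal sketch of the global SVGD update that these papers localise, using an RBF kernel with the median-heuristic bandwidth (the function name is mine). The message-passing variants apply the same update per node, with the kernel restricted to each node's Markov blanket.

```python
import numpy as np

def svgd_step(x, grad_logp, step=0.1):
    """One (global) SVGD update on a particle ensemble x of shape (n, d).

    grad_logp: callable mapping the (n, d) ensemble to row-wise
    gradients of log p(x), shape (n, d).
    """
    n, _ = x.shape
    diff = x[:, None, :] - x[None, :, :]           # x_j - x_i, shape (n, n, d)
    sq = (diff ** 2).sum(axis=-1)                  # squared distances (n, n)
    h = np.median(sq) / np.log(n + 1) + 1e-12      # median-heuristic bandwidth
    k = np.exp(-sq / h)                            # RBF kernel matrix
    grad_k = -(2.0 / h) * diff * k[..., None]      # grad_{x_j} k(x_j, x_i)
    # phi(x_i) = mean_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k @ grad_logp(x) + grad_k.sum(axis=0)) / n
    return x + step * phi
```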

To read: Zhou and Qiu (2023).

4 References

Briers, Doucet, and Singh. 2005. “Sequential Auxiliary Particle Belief Propagation.” In 2005 7th International Conference on Information Fusion.
Grooms, and Robinson. 2021. “A Hybrid Particle-Ensemble Kalman Filter for Problems with Medium Nonlinearity.” PLOS ONE.
Ihler, and McAllester. 2009. “Particle Belief Propagation.” In Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics.
Lienart, Teh, and Doucet. 2015. “Expectation Particle Belief Propagation.” In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2. NIPS’15.
MacKinlay, Tsuchida, Pagendam, et al. 2024. “Gaussian Ensemble Belief Propagation for Efficient Inference in High-Dimensional Systems.”
Mueller, Yang, and Rosenhahn. 2018. “Slice Sampling Particle Belief Propagation.”
Naesseth, Christian, Lindsten, and Schön. 2015. “Nested Sequential Monte Carlo Methods.” In Proceedings of the 32nd International Conference on Machine Learning.
Naesseth, Christian Andersson, Lindsten, and Schön. 2014. “Sequential Monte Carlo for Graphical Models.” In Advances in Neural Information Processing Systems.
Paige, and Wood. 2016. “Inference Networks for Sequential Monte Carlo in Graphical Models.” In Proceedings of The 33rd International Conference on Machine Learning.
Wang, Zeng, and Liu. 2018. “Stein Variational Message Passing for Continuous Graphical Models.”
Zhou, and Qiu. 2023. “Augmented Message Passing Stein Variational Gradient Descent.”
Zhuo, Liu, Shi, et al. 2018. “Message Passing Stein Variational Gradient Descent.” In Proceedings of the 35th International Conference on Machine Learning.