Figure 1: Empirical CDFs. Can they be used to approximate belief propagation updates, or any other kind of variational message passing algorithm?

Several variational methods can be understood as using empirical CDFs for message passing. One candidate is Stein variational gradient descent message passing, which constructs the ensemble by solving an optimisation problem. Another is ensemble Kalman filtering, which uses a stochastic perturbation of a fixed-size population to approximate the posterior; combining that with belief propagation gives Gaussian Ensemble Belief Propagation (MacKinlay et al. 2025).
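For concreteness, here is a minimal sketch of the stochastic-EnKF analysis step, i.e. the "stochastic perturbation of a fixed population" just mentioned, assuming linear Gaussian observations. This is a generic textbook step, not the Gaussian Ensemble BP algorithm itself; the function name and API are my own.

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """Stochastic-EnKF analysis step on an ensemble X of shape (d, n).

    y: (m,) observation; H: (m, d) observation operator;
    R: (m, m) observation-noise covariance.
    """
    n = X.shape[1]
    Xm = X - X.mean(axis=1, keepdims=True)         # ensemble anomalies
    C = Xm @ Xm.T / (n - 1)                        # sample covariance
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)   # Kalman gain
    # perturbed observations: the stochastic perturbation of the population
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n).T
    return X + K @ (Y - H @ X)                     # updated ensemble

# rng = np.random.default_rng(0) gives a reproducible perturbation stream
```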

This page is about the particle-filter analogue, Particle Belief Propagation (Ihler and McAllester 2009), which uses an importance-sampling-style update. How does that work? TBD, but a sketch follows below.

1 Basic

  1. TBC.
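In the meantime, here is a minimal sketch of the basic importance-sampling message update in the style of Ihler and McAllester (2009), for a pairwise MRF with scalar states. The function name and API are assumptions of mine, not theirs: each node holds particles drawn from a proposal, and the message from node t to node s is a Monte Carlo mixture over t's weighted particles.

```python
import numpy as np

def pbp_message(x_t, x_s, b_t, psi_pair, psi_t, incoming):
    """One particle-BP message m_ts, evaluated at node s's particles.

    x_t: (N,) particles at source node t, drawn from proposal b_t
    x_s: (M,) particles at target node s
    b_t: (N,) proposal density values b_t(x_t)
    psi_pair: vectorised pairwise potential psi_ts(x_t, x_s)
    psi_t: vectorised local potential at node t
    incoming: list of (N,) message arrays m_ut(x_t), for u in N(t) \ {s}
    """
    w = psi_t(x_t) / b_t                           # importance weights
    for m in incoming:
        w = w * m                                  # fold in incoming messages
    Psi = psi_pair(x_t[:, None], x_s[None, :])     # (N, M) pairwise table
    msg = Psi.T @ w / len(x_t)                     # Monte Carlo mixture over x_t
    return msg / msg.sum()                         # normalise for stability
```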

2 Expectation

The Expectation form reputedly has advantages (Lienart, Teh, and Doucet 2015). TBC.

3 Stein variational gradient descent

Define a kernel that factorises over the graph's factors, and Stein variational gradient descent decomposes into local messages. This was discovered simultaneously in 2018 by Wang, Zeng, and Liu (2018) and Zhuo et al. (2018), and elaborated, expanded, and varied in subsequent works (Zhou and Qiu 2023; Pavlasek et al. 2024).
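For reference, here is a minimal sketch of the vanilla SVGD update that these methods localise: in the message-passing variants the single kernel over the full state is replaced by kernels on Markov blankets, so each coordinate is updated using only local factors. The function names and the median-heuristic bandwidth below are my assumptions, not any particular paper's API.

```python
import numpy as np

def rbf_kernel(X, h=None):
    """RBF kernel matrix and its gradient w.r.t. the first argument."""
    diff = X[:, None, :] - X[None, :, :]           # (n, n, d)
    sq = (diff ** 2).sum(axis=-1)                  # (n, n) squared distances
    if h is None:                                  # median heuristic bandwidth
        h = np.median(sq) / np.log(len(X) + 1) + 1e-8
    K = np.exp(-sq / h)                            # (n, n)
    dK = -2.0 / h * diff * K[..., None]            # dK[i, j] = grad_{x_i} k(x_i, x_j)
    return K, dK

def svgd_step(X, score, eps=0.1):
    """One SVGD update. X: (n, d) particles; score(X): (n, d) grad log p."""
    K, dK = rbf_kernel(X)
    # phi(x_i) = mean_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (K @ score(X) + dK.sum(axis=0)) / len(X)
    return X + eps * phi

# e.g. score = lambda X: -X drives the particles toward a standard normal
```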

I just met Joshua Mah, who introduced me to Stein Variational Belief Propagation for Multi-Robot Coordination (Pavlasek et al. 2024), but I have not digested it fully yet. TBC

4 References

Briers, Doucet, and Singh. 2005. “Sequential Auxiliary Particle Belief Propagation.” In 2005 7th International Conference on Information Fusion.
Grooms, and Robinson. 2021. “A Hybrid Particle-Ensemble Kalman Filter for Problems with Medium Nonlinearity.” PLOS ONE.
Ihler, and McAllester. 2009. “Particle Belief Propagation.” In Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics.
Lienart, Teh, and Doucet. 2015. “Expectation Particle Belief Propagation.” In Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2. NIPS’15.
MacKinlay, Tsuchida, Pagendam, et al. 2025. “Gaussian Ensemble Belief Propagation for Efficient Inference in High-Dimensional Systems.” In Proceedings of the International Conference on Learning Representations (ICLR).
Mueller, Yang, and Rosenhahn. 2018. “Slice Sampling Particle Belief Propagation.”
Naesseth, Christian Andersson, Lindsten, and Schön. 2015. “Nested Sequential Monte Carlo Methods.” In Proceedings of the 32nd International Conference on Machine Learning.
Naesseth, Christian Andersson, Lindsten, and Schön. 2014. “Sequential Monte Carlo for Graphical Models.” In Advances in Neural Information Processing Systems.
Paige, and Wood. 2016. “Inference Networks for Sequential Monte Carlo in Graphical Models.” In Proceedings of The 33rd International Conference on Machine Learning.
Pavlasek, Mah, Xu, et al. 2024. “Stein Variational Belief Propagation for Multi-Robot Coordination.”
Wang, Zeng, and Liu. 2018. “Stein Variational Message Passing for Continuous Graphical Models.”
Zhou, and Qiu. 2023. “Augmented Message Passing Stein Variational Gradient Descent.”
Zhuo, Liu, Shi, et al. 2018. “Message Passing Stein Variational Gradient Descent.” In Proceedings of the 35th International Conference on Machine Learning.