# Particle variational message passing

Graphical inference using empirical distribution estimates

July 25, 2014 — March 23, 2024

Tags: approximation, Bayes, distributed, dynamical systems, ensemble, generative, graphical models, machine learning, Monte Carlo, networks, optimization, probabilistic algorithms, probability, sciml, signal processing, state space models, statistics, stochastic processes, swarm, time series

Empirical CDFs: can they be used to approximate belief propagation updates, or any other kind of variational message passing algorithm?

We can imagine several variational methods that use empirical CDFs to do message passing. One obvious candidate is Stein variational gradient descent message passing, which constructs the ensemble by solving an optimisation problem. Another is ensemble Kalman filtering, which finds the posterior by applying a stochastic perturbation to a fixed-size population of particles.
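As a concrete illustration of the first candidate, here is a minimal 1D Stein variational gradient descent update against an assumed standard normal target. Everything here is a sketch for illustration, not any library's API: the fixed RBF bandwidth `h`, step size `eps`, and particle count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_log_p(x):
    # Score of the (assumed) standard normal target: d/dx log N(x; 0, 1) = -x.
    return -x

def svgd_step(x, eps=0.05, h=1.0):
    # Pairwise differences and an RBF kernel with fixed bandwidth h (illustrative choice).
    diff = x[:, None] - x[None, :]      # diff[j, i] = x_j - x_i
    k = np.exp(-diff**2 / h)
    # Gradient of k(x_j, x_i) with respect to x_j: the repulsive term
    # that keeps the ensemble spread out.
    grad_k = -2.0 * diff / h * k
    # SVGD update direction: kernel-smoothed score plus repulsion.
    phi = (k @ grad_log_p(x) + grad_k.sum(axis=0)) / len(x)
    return x + eps * phi

x = rng.normal(3.0, 0.5, size=100)  # ensemble initialised far from the target
for _ in range(1000):
    x = svgd_step(x)
# The ensemble drifts toward the target mode and spreads to cover its mass.
```

The ensemble here plays the role of a message: the optimisation, not resampling, is what moves the particles toward the target.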

This page is about the particle filter analogue, which would use an importance-sampling-like update.
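A minimal sketch of what such an importance-sampling message update could look like, on a hypothetical two-node Gaussian chain (the potentials, names, and particle count are all assumptions for illustration): particles for $x_s$ are drawn from its prior, weighted by the local evidence, and the outgoing message to $x_t$ is the resulting weighted mixture, whose moments we can read off the weighted particles.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy chain x_s -- x_t with a Gaussian observation y attached to x_s.
n = 100_000
y = 1.0

# 1. Sample particles for x_s from its prior N(0, 1) (the incoming "message").
xs = rng.normal(0.0, 1.0, size=n)

# 2. Importance-weight each particle by the local evidence N(y; x_s, 1),
#    working in log space and self-normalising for stability.
logw = -0.5 * (y - xs) ** 2
w = np.exp(logw - logw.max())
w /= w.sum()

# 3. The outgoing message m_{s->t}(x_t) is the weighted mixture
#    sum_i w_i * psi(x_s^i, x_t), with pairwise potential
#    psi(x_s, x_t) = N(x_t; x_s, 1). Since each mixture component is
#    centred on x_s^i, the message mean is the weighted particle mean.
message_mean = np.sum(w * xs)
```

In this conjugate toy example the message mean has a closed form (0.5), so the particle estimate can be checked against it; in the graphical-model settings of interest no such closed form exists, which is the point of the particle approximation.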

## 1 References

Briers, Doucet, and Singh. 2005. “Sequential Auxiliary Particle Belief Propagation.” In *2005 7th International Conference on Information Fusion*.

Grooms, and Robinson. 2021. “A Hybrid Particle-Ensemble Kalman Filter for Problems with Medium Nonlinearity.” *PLOS ONE*.

Ihler, and McAllester. 2009. “Particle Belief Propagation.” In *Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics*.

Lienart, Teh, and Doucet. 2015. “Expectation Particle Belief Propagation.” In *Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2*. NIPS’15.

Mueller, Yang, and Rosenhahn. 2018. “Slice Sampling Particle Belief Propagation.”

Naesseth, Lindsten, and Schön. 2015. “Nested Sequential Monte Carlo Methods.” In *Proceedings of the 32nd International Conference on Machine Learning*.

Naesseth, Lindsten, and Schön. 2014. “Sequential Monte Carlo for Graphical Models.” In *Advances in Neural Information Processing Systems*.

Paige, and Wood. 2016. “Inference Networks for Sequential Monte Carlo in Graphical Models.” In *Proceedings of The 33rd International Conference on Machine Learning*.