Particle filters

incorporating Interacting Particle Systems, Sequential Monte Carlo and a profusion of other simultaneous-discovery names

July 25, 2014 — March 24, 2023

Bayes
Monte Carlo
probabilistic algorithms
probability
sciml
signal processing
state space models
statistics
swarm
time series

A Monte Carlo algorithm which updates a population of samples with a nested update step. The easiest entry point is IMO to think of these as a random-sample generalisation of state-space filter models via importance sampling. They are classically considered cousins of the linear-Gaussian Kalman filter, applicable to more challenging models at the cost of Monte Carlo approximation.
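In the standard sequential-importance-sampling notation (see e.g. Doucet, Godsill, and Andrieu 2000), the nested update draws each particle from a proposal $q$ and reweights it:

$$
x_t^{(i)} \sim q\left(\cdot \mid x_{t-1}^{(i)}, y_t\right),
\qquad
w_t^{(i)} \propto w_{t-1}^{(i)}\,
\frac{p\left(y_t \mid x_t^{(i)}\right)\, p\left(x_t^{(i)} \mid x_{t-1}^{(i)}\right)}
     {q\left(x_t^{(i)} \mid x_{t-1}^{(i)}, y_t\right)}.
$$

The bootstrap filter is the special case $q = p(x_t \mid x_{t-1})$, where the update reduces to $w_t^{(i)} \propto w_{t-1}^{(i)}\, p(y_t \mid x_t^{(i)})$, with resampling thrown in whenever the weights degenerate.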

This has nothing to do with filters for particulate matter as seen in respirators.

There is too much confusing and unhelpful terminology here, and I am only at the fringe of this field, so I will not attempt a typology. Let us clear up the main stumbling block, though: somehow the theoretical side of the field has coalesced under the banner of interacting particle systems, which is an awful, unsearchable name that could mean anything, and indeed does mean other things in other disciplines. Wikipedia disambiguates this problem with the gritty and also abstruse Mean Field Particle Methods. In practical applications we talk about particle filters, or sequential Monte Carlo, or bootstrap filters, or iterated importance sampling, and these all mean confusingly similar things.

1 Introductions

Easy to explain with an example, such as this particle filter in Scala.
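For a self-contained version, here is a minimal bootstrap-filter sketch in plain NumPy on a made-up nonlinear toy model; the model, noise variances and particle count are illustrative choices, not anything taken from the linked Scala example.

```python
import numpy as np

rng = np.random.default_rng(0)

# An illustrative nonlinear state-space model (all constants are arbitrary).
def transition(x, t):
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * t)

def observe(x):
    return x**2 / 20.0

Q, R = 10.0, 1.0     # process and observation noise variances
T, N = 100, 1000     # time steps, number of particles

# Simulate a trajectory and observations from the model.
x_true, y = np.zeros(T), np.zeros(T)
x = rng.normal()
for t in range(T):
    x = transition(x, t) + rng.normal(0.0, np.sqrt(Q))
    x_true[t] = x
    y[t] = observe(x) + rng.normal(0.0, np.sqrt(R))

# Bootstrap particle filter: propose from the transition density,
# reweight by the observation likelihood, resample when weights degenerate.
particles = rng.normal(0.0, 1.0, size=N)
weights = np.full(N, 1.0 / N)
filt_mean = np.zeros(T)

for t in range(T):
    # 1. propagate every particle through the (stochastic) transition
    particles = transition(particles, t) + rng.normal(0.0, np.sqrt(Q), size=N)
    # 2. reweight by p(y_t | x_t); constants cancel after normalisation
    loglik = -0.5 * (y[t] - observe(particles))**2 / R
    weights *= np.exp(loglik - loglik.max())
    weights /= weights.sum()
    filt_mean[t] = np.sum(weights * particles)
    # 3. multinomial resampling when the effective sample size collapses
    if 1.0 / np.sum(weights**2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles = particles[idx]
        weights = np.full(N, 1.0 / N)

# The posterior over x is bimodal under this squared observation map,
# so the filtering mean is a crude summary; this is only a demo.
print("RMSE of filtering mean:", np.sqrt(np.mean((filt_mean - x_true)**2)))
```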

2 Feynman-Kac formulae

See Feynman-Kac.

3 System identification

Don’t know the parameters governing the system dynamics and need to learn those too? See System identification with particle filters.

4 Relation to Ensemble Kalman filters

Yes.

Figure 3: Various extensions of Kalman filters as per Katzfuss, Stroud, and Wikle (2016).

See ensemble Kalman filter.

5 Particle flow

Introduced to me by my colleague Adrian Bishop.

Bunch and Godsill (2014); Daum, Huang, and Noushin (2010); Daum and Huang (2010); Daum and Huang (2009); Daum and Huang (2008); Daum and Huang (2013)

6 Non-Gaussian evolution

6.1 Jump process

I am interested in jump-process SMC (to be defined). For those I should apparently consult Graham and Méléard (1997); Grünbaum (1971); Méléard (1996); Shiga and Tanaka (1985).

6.2 On weird graphs

Christian Andersson Naesseth, Lindsten, and Schön (2014)

7 Rao-Blackwellized particle filter

Particle filters in which part of the state is marginalised out analytically, so that each particle carries a closed-form conditional density (typically maintained by a Kalman filter) rather than a bare point sample (Murphy 2012, 23.6).
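As a minimal sketch of the idea, assume a toy switching linear-Gaussian model: the particle filter samples only the discrete regime, while each particle carries the Kalman sufficient statistics of the conditionally Gaussian state. All model choices and constants below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy model: Markov regime z_t in {0, 1}; conditionally linear state
# x_t = a[z_t] * x_{t-1} + w_t; observation y_t = x_t + v_t.
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])        # regime transition matrix
a = np.array([0.9, -0.7])           # AR coefficient per regime
q, r = 0.5, 1.0                     # process / observation noise variances

# Simulate data.
T = 200
z, x = 0, 0.0
y = np.zeros(T)
for t in range(T):
    z = rng.choice(2, p=P[z])
    x = a[z] * x + rng.normal(0.0, np.sqrt(q))
    y[t] = x + rng.normal(0.0, np.sqrt(r))

# Rao-Blackwellised particle filter: each particle = (regime z, Kalman mean m,
# Kalman variance V) of the conditionally Gaussian linear state.
N = 500
zs = rng.choice(2, size=N)
m, V = np.zeros(N), np.ones(N)
w = np.full(N, 1.0 / N)

for t in range(T):
    # 1. sample the next regime from each particle's transition row
    zs = (rng.random(N) > P[zs, 0]).astype(int)   # shortcut valid for 2 regimes
    # 2. Kalman prediction of the conditionally linear state
    m_pred = a[zs] * m
    V_pred = a[zs]**2 * V + q
    # 3. reweight by the marginal predictive likelihood of y_t (x integrated out)
    S = V_pred + r
    w *= np.exp(-0.5 * (y[t] - m_pred)**2 / S) / np.sqrt(S)
    w /= w.sum()
    # 4. Kalman update of each particle's conditional density
    K = V_pred / S
    m = m_pred + K * (y[t] - m_pred)
    V = (1.0 - K) * V_pred
    # 5. resample regimes together with their Kalman statistics when ESS drops
    if 1.0 / np.sum(w**2) < N / 2:
        idx = rng.choice(N, size=N, p=w)
        zs, m, V, w = zs[idx], m[idx], V[idx], np.full(N, 1.0 / N)

print("filtering mean of x at final step:", np.sum(w * m))
```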

8 Tooling

Some MCMC toolkits incorporate SMC too.

  • particles is a Python library for teaching DIY particle filtering, written to accompany the book Chopin and Papaspiliopoulos (2020); see the usage sketch after this list.
  • Johansen’s page, with C++ software
  • Dirk Eddelbuettel’s lab created RcppSMC for R integration of the Johansen stuff. Documentation is not great: it consists only of black-box toy problems without any hint of how you would construct, e.g., a likelihood function, so I can’t evaluate how easy this would be to use, as opposed to plain C++.
  • Most probabilistic programming languages include a particle filter example.
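For instance, a basic run with the particles library looks roughly like this, adapted from memory of its tutorial; the stochastic-volatility model and all parameter values are placeholders, and class and argument names should be checked against the current documentation.

```python
import numpy as np
import particles
from particles import state_space_models as ssm
from particles import distributions as dists

# A stochastic-volatility state-space model, specified by its initial,
# transition and observation distributions.
class StochVol(ssm.StateSpaceModel):
    default_params = {'mu': -1.0, 'rho': 0.9, 'sigma': 0.3}

    def PX0(self):
        return dists.Normal(loc=self.mu,
                            scale=self.sigma / np.sqrt(1.0 - self.rho**2))

    def PX(self, t, xp):
        return dists.Normal(loc=self.mu + self.rho * (xp - self.mu),
                            scale=self.sigma)

    def PY(self, t, xp, x):
        return dists.Normal(loc=0.0, scale=np.exp(x / 2.0))

model = StochVol()
x_true, y = model.simulate(200)          # synthetic data from the model

# Bootstrap Feynman-Kac representation of the model, then run SMC on it.
fk = ssm.Bootstrap(ssm=model, data=y)
pf = particles.SMC(fk=fk, N=1000, resampling='systematic')
pf.run()

print("log-likelihood estimate:", pf.logLt)
```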

9 References

Aldous. 2013. Interacting Particle Systems as Stochastic Social Dynamics.” Bernoulli.
Alexiadis, Alessio. 2019. Deep Multiphysics and Particle–Neuron Duality: A Computational Framework Coupling (Discrete) Multiphysics and Deep Learning.” Applied Sciences.
Alexiadis, A., Simmons, Stamatopoulos, et al. 2020. The Duality Between Particle Methods and Artificial Neural Networks.” Scientific Reports.
Ambrogioni, Guclu, and van Gerven. 2019. Wasserstein Variational Gradient Descent: From Semi-Discrete Optimal Transport to Ensemble Variational Inference.”
Andrews. 1994. Empirical Process Methods in Econometrics.” In Handbook of Econometrics.
Andrieu, Doucet, and Holenstein. 2010. Particle Markov Chain Monte Carlo Methods.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Arulampalam, Maskell, Gordon, et al. 2002. A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking.” IEEE Transactions on Signal Processing.
Ashton, Bernstein, Buchner, et al. 2022. Nested Sampling for Physical Scientists.” Nature Reviews Methods Primers.
Bretó, He, Ionides, et al. 2009. Time Series Analysis via Mechanistic Models.” The Annals of Applied Statistics.
Bunch, and Godsill. 2014. Approximations of the Optimal Importance Density Using Gaussian Particle Flow Importance Sampling.”
Cappe, Godsill, and Moulines. 2007. An Overview of Existing Methods and Recent Advances in Sequential Monte Carlo.” Proceedings of the IEEE.
Cérou, Frédéric, and Guyader. 2016. Fluctuation Analysis of Adaptive Multilevel Splitting.” The Annals of Applied Probability.
Cérou, F., Moral, Furon, et al. 2011. Sequential Monte Carlo for Rare Event Estimation.” Statistics and Computing.
Chen, and Hong. 2012. Testing for the Markov Property in Time Series.” Econometric Theory.
Chopin, and Papaspiliopoulos. 2020. An Introduction to Sequential Monte Carlo. Springer Series in Statistics.
Corenflos, Thornton, Deligiannidis, et al. 2021. Differentiable Particle Filtering via Entropy-Regularized Optimal Transport.” arXiv:2102.07850 [Cs, Stat].
Crisan, D, Del Moral, and Lyons. 1999. “Discrete Filtering Using Branching and Interacting Particle Systems.” Markov Processes and Related Fields.
Crisan, Dan, and Míguez. 2014. Particle-Kernel Estimation of the Filter Density in State-Space Models.” Bernoulli.
Daum, and Huang. 2008. Particle Flow for Nonlinear Filters with Log-Homotopy.” In Signal and Data Processing of Small Targets 2008.
———. 2009. Nonlinear Filters with Particle Flow.” In Signal and Data Processing of Small Targets 2009.
———. 2010. Generalized Particle Flow for Nonlinear Filters.” In Signal and Data Processing of Small Targets 2010.
———. 2013. Particle Flow with Non-Zero Diffusion for Nonlinear Filters.” In Signal Processing, Sensor Fusion, and Target Recognition XXII.
Daum, Huang, and Noushin. 2010. Exact Particle Flow for Nonlinear Filters.” In Signal Processing, Sensor Fusion, and Target Recognition XIX.
de Matos, and Fernandes. 2007. Testing the Markov Property with High Frequency Data.” Journal of Econometrics, Semiparametric methods in econometrics,.
Del Moral. 2004. Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications.
Del Moral, and Doucet. 2009. “Particle Methods: An Introduction with Applications.”
———. 2010. Interacting Markov Chain Monte Carlo Methods for Solving Nonlinear Measure-Valued Equations.” The Annals of Applied Probability.
Del Moral, Doucet, and Jasra. 2006. Sequential Monte Carlo Samplers.” Journal of the Royal Statistical Society: Series B (Statistical Methodology).
———. 2011. An Adaptive Sequential Monte Carlo Method for Approximate Bayesian Computation.” Statistics and Computing.
Del Moral, Hu, and Wu. 2011. On the Concentration Properties of Interacting Particle Processes.
Del Moral, and Lezaud. 2006. Branching and Interacting Particle Interpretations of Rare Event Probabilities.” In Stochastic Hybrid Systems. Lecture Notes in Control and Information Science, Volume 337.
Del Moral, and Miclo. 2000. Branching and Interacting Particle Systems Approximations of Feynman-Kac Formulae with Applications to Non-Linear Filtering.” In Séminaire de Probabilités XXXIV. Lecture Notes in Mathematics 1729.
Devlin, Horridge, Green, et al. 2021. “The No-U-Turn Sampler as a Proposal Distribution in a Sequential Monte Carlo Sampler with a Near-Optimal L-Kernel.”
Doucet, Freitas, and Gordon. 2001a. Sequential Monte Carlo Methods in Practice.
Doucet, Freitas, and Gordon. 2001b. An Introduction to Sequential Monte Carlo Methods.” In Sequential Monte Carlo Methods in Practice. Statistics for Engineering and Information Science.
Doucet, Godsill, and Andrieu. 2000. On Sequential Monte Carlo Sampling Methods for Bayesian Filtering.” Statistics and Computing.
Doucet, Jacob, and Rubenthaler. 2013. Derivative-Free Estimation of the Score Vector and Observed Information Matrix with Application to State-Space Models.” arXiv:1304.5768 [Stat].
Doucet, and Johansen. 2009. A Tutorial on Particle Filtering and Smoothing: Fifteen Years Later.” In Handbook of Nonlinear Filtering.
Drovandi, Pettitt, and McCutchan. 2016. Exact and Approximate Bayesian Inference for Low Integer-Valued Time Series Models with Intractable Likelihoods.” Bayesian Analysis.
Eddelbuettel, and François. 2011. Rcpp: Seamless R and C++ Integration.” Journal of Statistical Software.
Evensen. 2009. Data Assimilation - The Ensemble Kalman Filter.
Fearnhead, and Künsch. 2018. Particle Filters and Data Assimilation.” Annual Review of Statistics and Its Application.
Garbuno-Inigo, Hoffmann, Li, et al. 2020. Interacting Langevin Diffusions: Gradient Structure and Ensemble Kalman Sampler.” SIAM Journal on Applied Dynamical Systems.
Gland, and Mevel. 2000. Exponential Forgetting and Geometric Ergodicity in Hidden Markov Models.” Mathematics of Control, Signals and Systems.
Graham, and Méléard. 1997. Stochastic Particle Approximations for Generalized Boltzmann Models and Convergence Estimates.” The Annals of Probability.
Grünbaum. 1971. Propagation of Chaos for the Boltzmann Equation.” Archive for Rational Mechanics and Analysis.
Gu, Ghahramani, and Turner. 2015. Neural Adaptive Sequential Monte Carlo.” In Advances in Neural Information Processing Systems 28.
Gunawan, Dang, Quiroz, et al. 2018. Subsampling Sequential Monte Carlo for Static Bayesian Models.”
Hong, and Li. 2005. Nonparametric Specification Testing for Continuous-Time Models with Applications to Term Structure of Interest Rates.” Review of Financial Studies.
Hu, Schon, and Ljung. 2008. A Basic Convergence Result for Particle Filtering.” IEEE Transactions on Signal Processing.
Ionides, Bhadra, Atchadé, et al. 2011. Iterated Filtering.” The Annals of Statistics.
Johansen. 2009. SMCTC: Sequential Monte Carlo in C++.” Journal of Statistical Software.
Johansen, Del Moral, and Doucet. 2006. Sequential Monte Carlo Samplers for Rare Events.” In Proceedings of the 6th International Workshop on Rare Event Simulation.
Jonschkowski, Rastogi, and Brock. 2018. Differentiable Particle Filters: End-to-End Learning with Algorithmic Priors.” arXiv:1805.11122 [Cs, Stat].
Kantas, Doucet, Singh, et al. 2015. On Particle Methods for Parameter Estimation in State-Space Models.” Statistical Science.
Kappen, and Ruiz. 2016. Adaptive Importance Sampling for Control and Inference.” Journal of Statistical Physics.
Katzfuss, Stroud, and Wikle. 2016. Understanding the Ensemble Kalman Filter.” The American Statistician.
Kawamoto. 2007. Optical Flow–Driven Motion Model with Automatic Variance Adjustment for Adaptive Tracking.” In Computer Vision – ACCV 2007. Lecture Notes in Computer Science 4843.
Kim, and Mehta. 2019. An Optimal Control Derivation of Nonlinear Smoothing Equations.”
Künsch. 2005. Recursive Monte Carlo Filters: Algorithms and Theoretical Analysis.” The Annals of Statistics.
———. 2013. Particle Filters.” Bernoulli.
Kurle, Rangapuram, de Bezenac, et al. 2020. “Deep Rao-Blackwellised Particle Filters for Time Series Forecasting.” In Proceedings of the 34th International Conference on Neural Information Processing Systems. NIPS ’20.
Lai, Domke, and Sheldon. 2022. Variational Marginal Particle Filters.” In Proceedings of The 25th International Conference on Artificial Intelligence and Statistics.
Lee, and Whiteley. 2016. Variance Estimation in the Particle Filter.” arXiv:1509.00394 [Stat].
Léonard. 2014. A Survey of the Schrödinger Problem and Some of Its Connections with Optimal Transport.” Discrete & Continuous Dynamical Systems - A.
Lindsten. 2011. Rao-Blackwellised Particle Methods for Inference and Identification.”
Liu, Qiang, and Wang. 2019. Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm.” In Advances In Neural Information Processing Systems.
Liu, Chang, Zhuo, Cheng, et al. 2019. Understanding and Accelerating Particle-Based Variational Inference.” In Proceedings of the 36th International Conference on Machine Learning.
Maddison, Lawson, Tucker, et al. 2017. Filtering Variational Objectives.” arXiv Preprint arXiv:1705.09279.
Mandel. 2009. A Brief Tutorial on the Ensemble Kalman Filter.”
Maurais, and Marzouk. 2024. Sampling in Unit Time with Kernel Fisher-Rao Flow.”
Méléard. 1996. Asymptotic Behaviour of Some Interacting Particle Systems; McKean-Vlasov and Boltzmann Models.” In Probabilistic Models for Nonlinear Partial Differential Equations: Lectures Given at the 1st Session of the Centro Internazionale Matematico Estivo (C.I.M.E.) Held in Montecatini Terme, Italy, May 22–30, 1995. Lecture Notes in Mathematics.
Murphy. 2012. Machine learning: a probabilistic perspective. Adaptive computation and machine learning series.
Naesseth, Christian Andersson, Lindsten, and Schön. 2014. Sequential Monte Carlo for Graphical Models.” In Advances in Neural Information Processing Systems.
Naesseth, Christian A., Lindsten, and Schön. 2022. Elements of Sequential Monte Carlo.” arXiv:1903.04797 [Cs, Stat].
Neal. 1998. Annealed Importance Sampling.”
Noyer, Lanvin, and Benjelloun. 2004. Model-Based Tracking of 3D Objects Based on a Sequential Monte-Carlo Method.” In Conference Record of the Thirty-Eighth Asilomar Conference on Signals, Systems and Computers, 2004.
Reich. 2019. Data Assimilation: The Schrödinger Perspective.” Acta Numerica.
Reich, and Weissmann. 2019. Fokker-Planck Particle Systems for Bayesian Inference: Computational Approaches.”
Reshef, Reshef, Sabeti, et al. 2014. Theoretical Foundations of Equitability and the Maximal Information Coefficient.” arXiv:1408.4908 [Cs, Math, q-Bio, Stat].
Roberts, and Stramer. 2001. On Inference for Partially Observed Nonlinear Diffusion Models Using the Metropolis–Hastings Algorithm.” Biometrika.
Robinson. 1983. Nonparametric Estimators for Time Series.” Journal of Time Series Analysis.
Rubinstein, and Kroese. 2016. Simulation and the Monte Carlo Method. Wiley series in probability and statistics.
Rubinstein, Ridder, and Vaisman. 2014. Fast Sequential Monte Carlo Methods for Counting and Optimization. Wiley Series in Probability and Statistics.
Runge, Donner, and Kurths. 2015. Optimal Model-Free Prediction from Multivariate Time Series.” Physical Review E.
Salomone, South, Drovandi, et al. 2018. Unbiased and Consistent Nested Sampling via Sequential Monte Carlo.”
Shiga, and Tanaka. 1985. Central Limit Theorem for a System of Markovian Particles with Mean Field Interactions.” Zeitschrift Für Wahrscheinlichkeitstheorie Und Verwandte Gebiete.
Sisson, S. A., Fan, and Tanaka. 2009. Correction for Sisson Et Al., Sequential Monte Carlo Without Likelihoods.” Proceedings of the National Academy of Sciences.
Sisson, S. A., Fan, and Tanaka. 2007. Sequential Monte Carlo Without Likelihoods.” Proceedings of the National Academy of Sciences.
Taghvaei, and Mehta. 2021. An Optimal Transport Formulation of the Ensemble Kalman Filter.” IEEE Transactions on Automatic Control.
Vergé, Dubarry, Del Moral, et al. 2013. On Parallel Implementation of Sequential Monte Carlo Methods: The Island Particle Model.” Statistics and Computing.
Virrion. 2020. Deep Importance Sampling.” arXiv:2007.02692 [q-Fin].
Wang, Chen, and Li. 2021. Projected Wasserstein Gradient Descent for High-Dimensional Bayesian Inference.”
Zhao, Mair, Schön, et al. 2024. On Feynman-Kac Training of Partial Bayesian Neural Networks.” In Proceedings of The 27th International Conference on Artificial Intelligence and Statistics.