Variational state filtering

March 19, 2018 — December 8, 2021

Tags: Bayes, dynamical systems, linear algebra, optimization, probability, signal processing, state space models, statistics, time series

A placeholder to discuss state filtering and parameter estimation where uncertainty about the unobserved state is quantified by variationally learned distributions.

Campbell et al. (2021) introduce an elegant method which also performs system identification. I would like to go into this in more detail; for now I present the key insight, without adequate explanation, for my own benefit. The neat trick is that the variational approximation is in a sense global: everything telescopes into one big variational approximation over the whole state history, rather than a sequence of successive approximations, each of which compounds the error of the last inside the ELBO. Intuitively, this gives us more hope of avoiding accumulated bias at each filter step.

\[ \max_{\theta, \phi} \mathcal{L}_{t}(\theta, \phi)=\mathbb{E}_{q_{t}^{\phi}\left(x_{1:t}\right)}\left[\log \frac{p_{\theta}\left(x_{1:t}, y^{t}\right)}{q_{t}^{\phi}\left(x_{1:t}\right)}\right] \]
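(Here I read \(y^{t}\) as shorthand for \(y_{1:t}\).) The reason an online update is conceivable at all is the usual state-space Markov structure, under which the joint factorizes recursively:

\[ p_{\theta}\left(x_{1:t}, y^{t}\right) = p_{\theta}\left(x_{1:t-1}, y^{t-1}\right)\, p_{\theta}\left(x_{t} \mid x_{t-1}\right)\, p_{\theta}\left(y_{t} \mid x_{t}\right). \]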

The key factorization of the variational posterior: \(q_{t}^{\phi}\left(x_{1:t}\right)=q_{t}^{\phi}\left(x_{t}\right)\, q_{t}^{\phi}\left(x_{t-1} \mid x_{t}\right)\, q_{t-1}^{\phi}\left(x_{t-2} \mid x_{t-1}\right) \cdots q_{2}^{\phi}\left(x_{1} \mid x_{2}\right)\)

True factorization: \(p_{\theta}\left(x_{t} \mid y^{t}\right)\, p_{\theta}\left(x_{t-1} \mid x_{t}, y^{t-1}\right)\, p_{\theta}\left(x_{t-2} \mid x_{t-1}, y^{t-2}\right) \cdots p_{\theta}\left(x_{1} \mid x_{2}, y^{1}\right)\)
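If I have the bookkeeping right, the variational factorization recurses: the step-\(t\) posterior reuses the frozen backward kernels from step \(t-1\),

\[ q_{t}^{\phi}\left(x_{1:t}\right) = q_{t}^{\phi}\left(x_{t}\right)\, q_{t}^{\phi}\left(x_{t-1} \mid x_{t}\right)\, \frac{q_{t-1}^{\phi}\left(x_{1:t-1}\right)}{q_{t-1}^{\phi}\left(x_{t-1}\right)}, \]

and plugging this, together with the joint recursion above, into the ELBO telescopes it into the previous objective plus a local term,

\[ \mathcal{L}_{t}(\theta, \phi) \approx \mathcal{L}_{t-1}(\theta, \phi) + \mathbb{E}_{q_{t}^{\phi}\left(x_{t-1}, x_{t}\right)}\left[\log \frac{p_{\theta}\left(y_{t} \mid x_{t}\right)\, p_{\theta}\left(x_{t} \mid x_{t-1}\right)\, q_{t-1}^{\phi}\left(x_{t-1}\right)}{q_{t}^{\phi}\left(x_{t}\right)\, q_{t}^{\phi}\left(x_{t-1} \mid x_{t}\right)}\right], \]

with equality when the \(x_{t-1}\)-marginal of \(q_{t}^{\phi}\) matches \(q_{t-1}^{\phi}\left(x_{t-1}\right)\). The increment depends only on \((x_{t-1}, x_{t})\), so each filter step need only optimize the two newest factors.

To make that concrete, here is a minimal single-step sketch in JAX for a 1-d linear-Gaussian model. The names and the parameterization (Gaussian \(q_{t}(x_{t})\), linear-Gaussian backward kernel) are my own illustration, not the authors' code; the objective is a one-sample reparameterized estimate of the increment above, with \(q_{t-1}\) held frozen.

```python
# Toy single-step sketch of online variational filtering for a 1-d
# linear-Gaussian SSM. Names and parameterization are hypothetical
# illustrations, not the authors' code. We fit the two newest factors
# q_t(x_t) = N(m, s^2) and q_t(x_{t-1}|x_t) = N(c x_t + d, u^2) by
# stochastic gradient ascent on the telescoped ELBO increment,
# holding the previous marginal q_{t-1}(x_{t-1}) frozen.
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm

a, q_sd, r_sd = 0.9, 0.5, 1.0  # model parameters theta (assumed known here)

def log_f(x_t, x_prev):  # transition log-density log p_theta(x_t | x_{t-1})
    return norm.logpdf(x_t, a * x_prev, q_sd)

def log_g(y_t, x_t):     # emission log-density log p_theta(y_t | x_t)
    return norm.logpdf(y_t, x_t, r_sd)

def increment(phi, prev, y_t, eps):
    """One-sample reparameterized estimate of the increment L_t - L_{t-1}."""
    m, log_s, c, d, log_u = phi          # parameters of the new factors
    m_prev, log_s_prev = prev            # frozen q_{t-1}(x_{t-1})
    e1, e2 = eps
    x_t = m + jnp.exp(log_s) * e1                 # x_t ~ q_t(x_t)
    x_prev = c * x_t + d + jnp.exp(log_u) * e2    # x_{t-1} ~ q_t(x_{t-1} | x_t)
    return (log_f(x_t, x_prev) + log_g(y_t, x_t)
            + norm.logpdf(x_prev, m_prev, jnp.exp(log_s_prev))
            - norm.logpdf(x_t, m, jnp.exp(log_s))
            - norm.logpdf(x_prev, c * x_t + d, jnp.exp(log_u)))

# Gradient of the negative increment with respect to phi only.
grad_step = jax.jit(jax.grad(lambda phi, prev, y, eps: -increment(phi, prev, y, eps)))

def filter_step(phi, prev, y_t, key, n_iter=200, lr=1e-2):
    """One online filter step: refit the newest variational factors by SGD."""
    for k in jax.random.split(key, n_iter):
        eps = jax.random.normal(k, (2,))
        phi = phi - lr * grad_step(phi, prev, y_t, eps)
    return phi

phi0 = jnp.zeros(5)                       # init: m, log s, c, d, log u
prev = (jnp.array(0.0), jnp.array(0.0))   # q_{t-1}(x_{t-1}) = N(0, 1)
phi = filter_step(phi0, prev, jnp.array(1.3), jax.random.PRNGKey(0))
print("q_t(x_t) mean/sd:", phi[0], jnp.exp(phi[1]))
```

After each observation, the fitted marginal \(q_{t}(x_{t})\) becomes the frozen `prev` for the next step. In the paper the same machinery also carries gradients in \(\theta\), which is how the method performs system identification; this sketch omits that.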

References

Archer, Park, Buesing, et al. 2015. “Black Box Variational Inference for State Space Models.” arXiv:1511.07367 [Stat].
Bannister. 2017. “A Review of Operational Methods of Variational and Ensemble-variational Data Assimilation.” Quarterly Journal of the Royal Meteorological Society.
Bayer, and Osendorfer. 2014. “Learning Stochastic Recurrent Networks.” arXiv:1411.7610 [Cs, Stat].
Campbell, Shi, Rainforth, et al. 2021. “Online Variational Filtering and Parameter Learning.” In.
Chung, Kastner, Dinh, et al. 2015. “A Recurrent Latent Variable Model for Sequential Data.” In Advances in Neural Information Processing Systems 28.
Cox, van de Laar, and de Vries. 2019. “A Factor Graph Approach to Automated Design of Bayesian Signal Processing Algorithms.” International Journal of Approximate Reasoning.
Damianou, Titsias, and Lawrence. 2011. “Variational Gaussian Process Dynamical Systems.” In Advances in Neural Information Processing Systems 24.
de Freitas, Niranjan, Gee, et al. 1998. “Sequential Monte Carlo Methods for Optimisation of Neural Network Models.” Cambridge University Engineering Department, Cambridge, England, Technical Report TR-328.
Doerr, Daniel, Schiegg, et al. 2018. “Probabilistic Recurrent State-Space Models.” arXiv:1801.10395 [Stat].
Drovandi, Pettitt, and McCutchan. 2016. “Exact and Approximate Bayesian Inference for Low Integer-Valued Time Series Models with Intractable Likelihoods.” Bayesian Analysis.
Eleftheriadis, Nicholson, Deisenroth, et al. 2017. “Identification of Gaussian Process State Space Models.” In Advances in Neural Information Processing Systems 30.
Fabius, and van Amersfoort. 2014. “Variational Recurrent Auto-Encoders.” In Proceedings of ICLR.
Föll, Haasdonk, Hanselmann, et al. 2017. “Deep Recurrent Gaussian Process with Variational Sparse Spectrum Approximation.” arXiv:1711.00799 [Stat].
Fortunato, Blundell, and Vinyals. 2017. “Bayesian Recurrent Neural Networks.” arXiv:1704.02798 [Cs, Stat].
Fraccaro, Sønderby, Paquet, et al. 2016. “Sequential Neural Models with Stochastic Layers.” In Advances in Neural Information Processing Systems 29.
Frerix, Kochkov, Smith, et al. 2021. “Variational Data Assimilation with a Learned Inverse Observation Operator.” In.
Frigola, Chen, and Rasmussen. 2014. “Variational Gaussian Process State-Space Models.” In Advances in Neural Information Processing Systems 27.
Frigola, Lindsten, Schön, et al. 2013. “Bayesian Inference and Learning in Gaussian Process State-Space Models with Particle MCMC.” In Advances in Neural Information Processing Systems 26.
Friston. 2008. “Variational Filtering.” NeuroImage.
Gorad, Zhao, and Särkkä. 2020. “Parameter Estimation in Non-Linear State-Space Models by Automatic Differentiation of Non-Linear Kalman Filters.” In.
Gu, Ghahramani, and Turner. 2015. “Neural Adaptive Sequential Monte Carlo.” In Advances in Neural Information Processing Systems 28.
Hoffman, Blei, Wang, et al. 2013. “Stochastic Variational Inference.” arXiv:1206.7051 [Cs, Stat].
Hsu, Zhang, and Glass. 2017. “Unsupervised Learning of Disentangled and Interpretable Representations from Sequential Data.” arXiv:1709.07902 [Cs, Eess, Stat].
Karl, Soelch, Bayer, et al. 2016. “Deep Variational Bayes Filters: Unsupervised Learning of State Space Models from Raw Data.” In Proceedings of ICLR.
Kocijan, Girard, Banko, et al. 2005. “Dynamic Systems Identification with Gaussian Processes.” Mathematical and Computer Modelling of Dynamical Systems.
Ko, and Fox. 2009. “GP-BayesFilters: Bayesian Filtering Using Gaussian Process Prediction and Observation Models.” In Autonomous Robots.
Krishnan, Shalit, and Sontag. 2015. “Deep Kalman Filters.” arXiv Preprint arXiv:1511.05121.
———. 2017. “Structured Inference Networks for Nonlinear State Space Models.” In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence.
Kulhavý. 1990. “Recursive Nonlinear Estimation: A Geometric Approach.” Automatica.
Lai, Domke, and Sheldon. 2022. “Variational Marginal Particle Filters.” In Proceedings of The 25th International Conference on Artificial Intelligence and Statistics.
Le, Igl, Jin, et al. 2017. “Auto-Encoding Sequential Monte Carlo.” arXiv Preprint arXiv:1705.10306.
Ljung. 1998. “System Identification.” In Signal Analysis and Prediction. Applied and Numerical Harmonic Analysis.
Loeliger, Dauwels, Hu, et al. 2007. “The Factor Graph Approach to Model-Based Signal Processing.” Proceedings of the IEEE.
Louizos, and Welling. 2016. “Structured and Efficient Variational Deep Learning with Matrix Gaussian Posteriors.” arXiv Preprint arXiv:1603.04733.
Maddison, Lawson, Tucker, et al. 2017. “Filtering Variational Objectives.” arXiv Preprint arXiv:1705.09279.
Mattos, Dai, Damianou, et al. 2016. “Recurrent Gaussian Processes.” In Proceedings of ICLR.
Mattos, Dai, Damianou, et al. 2017. “Deep Recurrent Gaussian Processes for Outlier-Robust System Identification.” Journal of Process Control, DYCOPS-CAB 2016.
Naesseth, Linderman, Ranganath, et al. 2017. “Variational Sequential Monte Carlo.” arXiv Preprint arXiv:1705.11140.
Ranganath, Tran, Altosaar, et al. 2016. “Operator Variational Inference.” In Advances in Neural Information Processing Systems 29.
Ranganath, Tran, and Blei. 2016. “Hierarchical Variational Models.” In PMLR.
Reller. 2013. “State-Space Methods in Statistical Signal Processing: New Ideas and Applications.”
Rozet, and Louppe. 2023. “Score-Based Data Assimilation.”
Ryder, Golightly, McGough, et al. 2018. “Black-Box Variational Inference for Stochastic Differential Equations.” arXiv:1802.03335 [Stat].
Särkkä, S., and Hartikainen. 2013. “Non-Linear Noise Adaptive Kalman Filtering via Variational Bayes.” In 2013 IEEE International Workshop on Machine Learning for Signal Processing (MLSP).
Särkkä, Simo, and Nummenmaa. 2009. “Recursive Noise Adaptive Kalman Filtering by Variational Bayesian Approximations.” IEEE Transactions on Automatic Control.
Schmidt, Krämer, and Hennig. 2021. “A Probabilistic State Space Model for Joint Inference from Differential Equations and Data.” arXiv:2103.10153 [Cs, Stat].
Titsias, and Lawrence. 2010. “Bayesian Gaussian Process Latent Variable Model.” In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics.
Turner, Deisenroth, and Rasmussen. 2010. “State-Space Inference and Learning with Gaussian Processes.” In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics.