# State filtering parameters

## Tracking things that don’t move

a.k.a. state-space model calibration, or recursive identification; sometimes indistinguishable from online estimation.

State filters are cool for estimating time-varying hidden states when the fixed system parameters are known. But how about learning the parameters of the model generating your states? Classic ways to do this in dynamical systems include basic linear system identification and general system identification. But can you identify the fixed parameters (not just the hidden states) with a state filter?

Yes.

One summary of the area lists some landmark papers:

Augmenting the unobserved state vector is a well-known technique, used in the system identification community for decades; see e.g. Ljung (1979). Similar ideas using sequential Monte Carlo methods were suggested by, e.g., Kitagawa (1998) and Liu and West (2001). Combined state and parameter estimation is also the standard technique for data assimilation in high-dimensional systems; see Moradkhani et al. (2005).

However, introducing random-walk dynamics on the parameters with fixed variance yields a new stochastic dynamical system whose properties may differ from those of the original system. That implies the variance of the random walk should be decreased when the method is used for offline parameter estimation.
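As a concrete illustration, here is a minimal sketch of the augmentation trick in a bootstrap particle filter, for a toy scalar linear-Gaussian model. The model, the cooling schedule, and all names are mine, for illustration only, and are not taken from any of the papers above.

```python
# Joint state/parameter estimation by state augmentation: a bootstrap
# particle filter over the augmented state (x_t, theta_t), where theta_t
# follows an artificial random walk whose variance is annealed towards zero.
import numpy as np

rng = np.random.default_rng(42)

# Toy model: x_t = theta * x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 1).
theta_true, T, N = 0.8, 200, 2000
x = np.zeros(T)
for t in range(1, T):
    x[t] = theta_true * x[t - 1] + 0.5 * rng.normal()
y = x + rng.normal(size=T)

# Particles over the augmented state (x, theta).
xp = rng.normal(size=N)
thetap = rng.uniform(-1.0, 1.0, size=N)  # diffuse initial guess for theta

for t in range(1, T):
    sigma_rw = 0.1 * 0.99**t  # annealed std of the artificial random walk
    thetap = thetap + sigma_rw * rng.normal(size=N)  # parameter "dynamics"
    xp = thetap * xp + 0.5 * rng.normal(size=N)      # state transition
    logw = -0.5 * (y[t] - xp) ** 2                   # Gaussian obs density
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)                 # multinomial resampling
    xp, thetap = xp[idx], thetap[idx]

print(f"posterior mean of theta: {thetap.mean():.3f} (true {theta_true})")
```

Without the annealing of `sigma_rw`, the filter targets a genuinely different model in which $$\theta$$ really does wander; shrinking the artificial dynamics is what recovers a fixed-parameter estimate.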

🏗

## Iterated filtering

Related: indirect inference. Precise relation will have to wait, since I currently do not care enough about indirect inference.
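For concreteness, here is a compressed sketch of the iterated-filtering idea along the lines of IF2 (Ionides et al. 2015): run the filter repeatedly with each particle carrying its own perturbed copy of $$\theta$$, cooling the perturbations across iterations so that the parameter swarm collapses towards the maximum-likelihood estimate. The toy model and cooling schedule are again mine, purely illustrative, not the authors' settings.

```python
# Iterated filtering, IF2-style: an outer loop over filtering passes, each
# with parameter perturbations of decreasing scale; the parameter swarm
# surviving one pass seeds the next.
import numpy as np

rng = np.random.default_rng(0)

# Same toy model as above: x_t = theta * x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 1).
theta_true, T, N, M = 0.8, 200, 1000, 30
x = np.zeros(T)
for t in range(1, T):
    x[t] = theta_true * x[t - 1] + 0.5 * rng.normal()
y = x + rng.normal(size=T)

thetas = rng.uniform(-1.0, 1.0, size=N)  # initial parameter swarm
for m in range(M):                       # outer iterated-filtering loop
    sigma_m = 0.2 * 0.9**m               # perturbation scale, cooled per iteration
    xp = rng.normal(size=N)              # fresh state particles each pass
    for t in range(1, T):
        thetas = thetas + sigma_m * rng.normal(size=N)  # perturb parameters
        xp = thetas * xp + 0.5 * rng.normal(size=N)     # propagate states
        logw = -0.5 * (y[t] - xp) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(N, size=N, p=w)  # resample states AND parameters together
        xp, thetas = xp[idx], thetas[idx]

print(f"theta estimate: {thetas.mean():.3f} (true {theta_true})")
```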

## Questions

• Ionides and King dominate my citations, at least for the frequentist stuff. Surely other people use this method too? But what are the keywords? This research is suspiciously concentrated at the University of Michigan, but the idea is not so esoteric. I think I am caught in a citation bubble.

Update: the oceanographers (e.g. Evensen 2009) seem to do this with Bayes a lot.

• A lot of the variational filtering literature turns out to be about attempting this with, effectively, neural nets.

• Can I estimate regularisation parameters this way, leveraging the correspondence between regularisation penalties and Bayesian priors, despite the lack of a full probabilistic interpretation?

• How does this work with non-Markov systems? Do we need to bother, or can we just do the Hamiltonian trick and augment the state vector? Can we talk about mixing, or correlation decay? Should I then shoot for the new-wave mixing approaches of Kuznetsov, Mohri, and colleagues?

### Basic Construction

There are a few variations. We start with the basic continuous-time state-space model.

Here we have an unobserved Markov state process $$x(t)$$ on $$\mathcal{X}$$ and an observation process $$y(t)$$ on $$\mathcal{Y}$$. For now they will be assumed to be finite-dimensional vectors over $$\mathbb{R}.$$ They additionally depend upon a vector of parameters $$\theta.$$ We observe the process at discrete times $$t(1:T)=(t_1, t_2,\dots, t_T),$$ and we write the observations $$y(1:T)=(y(t_1), y(t_2),\dots, y(t_T)).$$

We presume our processes are completely specified by the following conditional densities (which might not have closed-form expressions):

The transition density

$f(x(t_i)|x(t_{i-1}), \theta)$

The observation density (writing $$g$$ for it, by analogy with $$f$$)

$g(y(t_i)|x(t_i), \theta)$
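Assuming additionally an initial density $$\pi(x(t_1)|\theta)$$ (my notation, not established above), the hidden-Markov structure gives the marginal likelihood that the parameter-estimation schemes on this page target:

$p(y(1:T)|\theta) = \int \pi(x(t_1)|\theta)\, g(y(t_1)|x(t_1), \theta) \prod_{i=2}^{T} f(x(t_i)|x(t_{i-1}), \theta)\, g(y(t_i)|x(t_i), \theta) \,\mathrm{d}x(t_{1:T})$

Both the state-augmentation trick and iterated filtering can be read as ways of maximising, or sampling from, this quantity without evaluating the integral directly.

TBC.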

## Implementations

pomp does state-filtering inference in R, including iterated filtering.

For an example of doing this in Stan, see Sinhrks’ stan-statespace.

## Incoming

Recently enjoyed: Sahani Pathiraja’s state filter does something cool in attempting to identify process-model noise: it estimates a conditional nonparametric density of process errors, which may then be used to devise some neat process models. I am not convinced by her use of kernel density estimation, since KDEs scale badly precisely when you need them most, in high dimension; but I assume any nonparametric density estimator would work, and that would be awesome. A cartoon of the residual-density idea follows.
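A cartoon, under heavy assumptions: fit some transition model, collect one-step residuals, and hand them to a nonparametric density estimator. `gaussian_kde` here is a stand-in for whatever estimator you prefer; this is emphatically not Pathiraja’s actual construction, which conditions the error density on the state.

```python
# Nonparametric estimation of process-noise density from one-step residuals.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Simulate a process whose noise is skewed, though our fitted model is linear.
T = 5000
noise = rng.gamma(2.0, 0.5, size=T) - 1.0  # non-Gaussian process noise
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] + noise[t]

residuals = x[1:] - 0.8 * x[:-1]           # one-step-ahead model errors
noise_density = gaussian_kde(residuals)    # nonparametric density of errors

# The fitted density can now stand in for the process-noise model.
grid = np.linspace(-2.0, 4.0, 7)
print(np.round(noise_density(grid), 3))
```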

## References

Andrieu, Christophe, Arnaud Doucet, and Roman Holenstein. 2010. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 72 (3): 269–342.
Archer, Evan, Il Memming Park, Lars Buesing, John Cunningham, and Liam Paninski. 2015. arXiv:1511.07367 [Stat], November.
Babtie, Ann C., Paul Kirk, and Michael P. H. Stumpf. 2014. Proceedings of the National Academy of Sciences 111 (52): 18507–12.
Bamler, Robert, and Stephan Mandt. 2017. arXiv:1707.01069 [Cs, Stat], July.
Becker, Philipp, Harit Pandya, Gregor Gebhardt, Cheng Zhao, C. James Taylor, and Gerhard Neumann. 2019. In International Conference on Machine Learning, 544–52.
Bretó, Carles, Daihai He, Edward L. Ionides, and Aaron A. King. 2009. The Annals of Applied Statistics 3 (1): 319–48.
Brunton, Steven L., Joshua L. Proctor, and J. Nathan Kutz. 2016. Proceedings of the National Academy of Sciences 113 (15): 3932–37.
Chung, Junyoung, Kyle Kastner, Laurent Dinh, Kratarth Goel, Aaron C Courville, and Yoshua Bengio. 2015. In Advances in Neural Information Processing Systems 28, edited by C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, and R. Garnett, 2980–88. Curran Associates, Inc.
Del Moral, Pierre, Arnaud Doucet, and Ajay Jasra. 2006. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 68 (3): 411–36.
———. 2011. Statistics and Computing 22 (5): 1009–20.
Doucet, Arnaud, Nando de Freitas, and Neil Gordon. 2001. Sequential Monte Carlo Methods in Practice. New York, NY: Springer New York.
Doucet, Arnaud, Pierre E. Jacob, and Sylvain Rubenthaler. 2013. arXiv:1304.5768 [Stat], April.
Drovandi, Christopher C., Anthony N. Pettitt, and Roy A. McCutchan. 2016. Bayesian Analysis 11 (2): 325–52.
Durbin, J., and S. J. Koopman. 2012. Time Series Analysis by State Space Methods. 2nd ed. Oxford Statistical Science Series 38. Oxford: Oxford University Press.
Evensen, G. 2009. IEEE Control Systems 29 (3): 83–104.
Evensen, Geir. 2003. Ocean Dynamics 53 (4): 343–67.
———. 2009. Data Assimilation - The Ensemble Kalman Filter. Berlin; Heidelberg: Springer.
Fearnhead, Paul, and Hans R. Künsch. 2018. Annual Review of Statistics and Its Application 5 (1): 421–49.
He, Daihai, Edward L. Ionides, and Aaron A. King. 2010. Journal of The Royal Society Interface 7 (43): 271–83.
Heinonen, Markus, and Florence d’Alché-Buc. 2014. arXiv:1411.5172 [Cs, Stat], November.
Hürzeler, Markus, and Hans R. Künsch. 2001. In Sequential Monte Carlo Methods in Practice, 159–75. Statistics for Engineering and Information Science. Springer, New York, NY.
Ingraham, John, and Debora Marks. 2017. In PMLR, 1607–16.
Ionides, E. L., C. Bretó, and A. A. King. 2006. Proceedings of the National Academy of Sciences 103 (49): 18438–43.
Ionides, Edward L., Anindya Bhadra, Yves Atchadé, and Aaron King. 2011. The Annals of Statistics 39 (3): 1776–1802.
Ionides, Edward L., Dao Nguyen, Yves Atchadé, Stilian Stoev, and Aaron A. King. 2015. Proceedings of the National Academy of Sciences 112 (3): 719–24.
Kantas, N., A. Doucet, S. S. Singh, and J. M. Maciejowski. 2009. IFAC Proceedings Volumes, 15th IFAC Symposium on System Identification, 42 (10): 774–85.
Kantas, Nikolas, Arnaud Doucet, Sumeetpal S. Singh, Jan Maciejowski, and Nicolas Chopin. 2015. Statistical Science 30 (3): 328–51.
Kitagawa, Genshiro. 1998. Journal of the American Statistical Association, 1203–15.
Krishnan, Rahul G., Uri Shalit, and David Sontag. 2015. arXiv Preprint arXiv:1511.05121.
———. 2017. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, 2101–9.
Le, Tuan Anh, Maximilian Igl, Tom Jin, Tom Rainforth, and Frank Wood. 2017. arXiv Preprint arXiv:1705.10306.
Lele, S. R., B. Dennis, and F. Lutscher. 2007. Ecology Letters 10 (7): 551.
Lele, Subhash R., Khurram Nadeem, and Byron Schmuland. 2010. Journal of the American Statistical Association 105 (492): 1617–25.
Lindström, Erik, Edward Ionides, Jan Frydendall, and Henrik Madsen. 2012. In IFAC-PapersOnLine (System Identification, Volume 16), 45:1785–90. 16th IFAC Symposium on System Identification. IFAC & Elsevier Ltd.
Lindström, Erik, Jonas Ströjby, Mats Brodén, Magnus Wiktorsson, and Jan Holst. 2008. Computational Statistics & Data Analysis 52 (6): 2877–91.
Liu, Jane, and Mike West. 2001. In Sequential Monte Carlo Methods in Practice, 197–223. Statistics for Engineering and Information Science. Springer, New York, NY.
Ljung, L. 1979. IEEE Transactions on Automatic Control 24 (1): 36–50.
Ljung, Lennart, Georg Ch Pflug, and Harro Walk. 2012. Stochastic Approximation and Optimization of Random Systems. Vol. 17. Birkhäuser.
Ljung, Lennart, and Torsten Söderström. 1983. Theory and Practice of Recursive Identification. The MIT Press Series in Signal Processing, Optimization, and Control 4. Cambridge, Mass: MIT Press.
Maddison, Chris J., Dieterich Lawson, George Tucker, Nicolas Heess, Mohammad Norouzi, Andriy Mnih, Arnaud Doucet, and Yee Whye Teh. 2017. arXiv Preprint arXiv:1705.09279.
Moradkhani, Hamid, Soroosh Sorooshian, Hoshin V. Gupta, and Paul R. Houser. 2005. Advances in Water Resources 28 (2): 135–47.
Naesseth, Christian A., Scott W. Linderman, Rajesh Ranganath, and David M. Blei. 2017. arXiv Preprint arXiv:1705.11140.
Oliva, Junier B., Barnabas Poczos, and Jeff Schneider. 2017. arXiv:1703.00381 [Cs, Stat], March.
Sjöberg, Jonas, Qinghua Zhang, Lennart Ljung, Albert Benveniste, Bernard Delyon, Pierre-Yves Glorennec, Håkan Hjalmarsson, and Anatoli Juditsky. 1995. Automatica, Trends in System Identification, 31 (12): 1691–1724.
Söderström, T., and P. Stoica, eds. 1988. System Identification. Upper Saddle River, NJ, USA: Prentice-Hall, Inc.
Tallec, Corentin, and Yann Ollivier. 2017. arXiv:1705.08209 [Cs], May.
Tippett, Michael K., Jeffrey L. Anderson, Craig H. Bishop, Thomas M. Hamill, and Jeffrey S. Whitaker. 2003. Monthly Weather Review 131 (7): 1485–90.
Werbos, Paul J. 1988. Neural Networks 1 (4): 339–56.
