State space reconstruction



Disclaimer: I know next to nothing about this.

But I think it’s something like: looking at data from a (possibly stochastic) dynamical system and hoping to infer cool things about the kinds of hidden states it has, in some general sense, such as some measure of statistical or computational complexity, or how complicated or “large” the underlying state space is, in some convenient representation.

TBH I don’t understand this framing, possibly because I don’t come from a dynamical systems group; I just dabble in special cases thereof. Surely you either do physics, and work out the dynamics of your system from experiment, or you do statistics, and select an appropriate model to minimise some estimated predictive loss, trading off data set size, model complexity and algorithmic complexity. Clearly I need to read more to understand the rationale here.

Anyway, tools seem to include inventing large spaces of hidden states (Takens embedding); does this get us some nice algebraic properties? Also, how does delay embedding relate? Is it the same thing? Sample complexity results seem to be scanty, possibly because these methods usually want their chaos to be deterministic, and admitting noise would be fiddly.
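For concreteness, the delay embedding just mentioned is mechanically simple: stack lagged copies of a scalar observable into vectors. A minimal sketch (the function name and parameters are mine, not from any particular library):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style delay embedding of a scalar series x.

    Returns an array of shape (len(x) - (dim - 1) * tau, dim) whose
    rows are the delay vectors (x[t], x[t + tau], ..., x[t + (dim - 1) * tau]).
    """
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Embed a scalar observable into a 3-dimensional reconstructed state space.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t)
emb = delay_embed(x, dim=3, tau=25)
print(emb.shape)  # (1950, 3)
```

Takens’ theorem says that, for a generic observable of a deterministic system on a $d$-dimensional attractor, an embedding dimension of $2d + 1$ suffices to reconstruct the state space up to diffeomorphism; choosing `dim` and `tau` in practice is the fiddly part.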

OTOH, from a statistical perspective there are lots of useful techniques to infer the state spaces of special classes of dynamical systems. It is especially interesting in grammatical inference of formal syntax, where there are many lovely and faintly depressing computational complexity results.

Stuff that I might actually use

Hirata’s reconstruction looks like good clean decorative fun — you can represent graphs by an equivalent dynamical system.
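Hirata et al. work from recurrence plots, and the recurrence plot itself is easy to compute: threshold the pairwise distance matrix of the (embedded) states. A toy sketch, with `eps` a threshold I have picked arbitrarily for illustration:

```python
import numpy as np

def recurrence_plot(x, eps):
    """Binary recurrence matrix: R[i, j] = 1 iff |x[i] - x[j]| <= eps."""
    d = np.abs(x[:, None] - x[None, :])  # pairwise distance matrix
    return (d <= eps).astype(int)

x = np.sin(np.linspace(0, 4 * np.pi, 200))
R = recurrence_plot(x, eps=0.1)
```

The resulting matrix is symmetric with an all-ones diagonal; Hirata, Horai, and Aihara (2008) show that, under conditions, the distance matrix and even the original time series can be recovered from this binary object, which is what makes the representation more than decorative.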

References

Badii, Remo, and Antonio Politi. 1999. Complexity: Hierarchical Structures and Scaling in Physics. Cambridge Nonlinear Science Series. Cambridge University Press.
Brunton, Steven L., Joshua L. Proctor, and J. Nathan Kutz. 2016. “Discovering Governing Equations from Data by Sparse Identification of Nonlinear Dynamical Systems.” Proceedings of the National Academy of Sciences 113 (15): 3932–37.
Charles, Adam, Aurele Balavoine, and Christopher Rozell. 2016. “Dynamic Filtering of Time-Varying Sparse Signals via L1 Minimization.” IEEE Transactions on Signal Processing 64 (21): 5644–56.
Chen, Boyuan, Kuang Huang, Sunand Raghupathi, Ishaan Chandratreya, Qiang Du, and Hod Lipson. 2022. “Automated Discovery of Fundamental Variables Hidden in Experimental Data.” Nature Computational Science 2 (7): 433–42.
Crutchfield, James P., and Karl Young. 1989. “Inferring Statistical Complexity.” Physical Review Letters 63 (2): 105–8.
Dupont, P, François Denis, and Y Esposito. 2005. “Links Between Probabilistic Automata and Hidden Markov Models: Probability Distributions, Learning Models and Induction Algorithms.” Pattern Recognition 38: 1349–71.
Foote, Jonathan. 1999. “Visualizing Music and Audio Using Self-Similarity.” In Proceedings of the Seventh ACM International Conference on Multimedia (Part 1), 77–80. MULTIMEDIA ’99. New York, NY, USA: ACM.
Grassberger, Peter, Thomas Schreiber, and C Schaffrath. 1991. “Nonlinear Time Sequence Analysis.” International Journal of Bifurcation and Chaos 1 (3): 521–47.
Hamilton, Franz, Tyrus Berry, and Timothy Sauer. 2016. “Kalman-Takens Filtering in the Presence of Dynamical Noise.” arXiv:1611.05414 [Physics, Stat], November.
Hirata, Yoshito, Shunsuke Horai, and Kazuyuki Aihara. 2008. “Reproduction of Distance Matrices and Original Time Series from Recurrence Plots and Their Applications.” The European Physical Journal Special Topics 164 (1): 13–22.
Hirata, Yoshito, Hideyuki Suzuki, and Kazuyuki Aihara. 2006. “Reconstructing State Spaces from Multivariate Data Using Variable Delays.” Physical Review E 74 (2): 026202.
Kantz, Holger, and Thomas Schreiber. 2004. Nonlinear Time Series Analysis. 2nd ed. Cambridge, UK ; New York: Cambridge University Press.
Levin, David N. 2017. “The Inner Structure of Time-Dependent Signals.” arXiv:1703.08596 [Cs, Math, Stat], March.
Marwan, N. 2008. “A Historical Review of Recurrence Plots.” The European Physical Journal Special Topics 164 (1): 3–12.
Shalizi, Cosma Rohilla, and James P Crutchfield. 2000. “Computational Mechanics: Pattern and Prediction, Structure and Simplicity.”
Shalizi, Cosma Rohilla, and James P. Crutchfield. 2002. “Information Bottlenecks, Causal States, and Statistical Relevance Bases: How to Represent Relevant Information in Memoryless Transduction.” Advances in Complex Systems 05 (01): 91–95.
Shalizi, Cosma Rohilla, and Kristina Lisa Shalizi. 2004. “Blind Construction of Optimal Nonlinear Recursive Predictors for Discrete Sequences.” In, 504–11.
Sugihara, George, Robert May, Hao Ye, Chih-hao Hsieh, Ethan Deyle, Michael Fogarty, and Stephan Munch. 2012. “Detecting Causality in Complex Ecosystems.” Science 338 (6106): 496–500.
