Stability in dynamical systems

Lyapunov exponents and ilk

May 21, 2019 — February 22, 2022

dynamical systems
functional analysis
probability
statistics

A placeholder.

Informally, I am admitting as “stable” any dynamical system that does not explode super-polynomially fast; we can think of these as systems where, even if the state is not stationary, the rate of change at least might be.
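
To pin that down just a little (this is my own gloss, not a standard definition): call a trajectory $x(t)$ stable in this loose sense if its norm admits a polynomial growth bound,

$$
\|x(t)\| \le C\,(1 + t)^{k} \quad \text{for some } C > 0,\; k \ge 0 \text{ and all } t \ge 0,
$$

which rules out exponential blow-up $\|x(t)\| \sim e^{\lambda t}$ with $\lambda > 0$, but still permits unbounded drift.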

Here I would like to think about how to parameterize stable systems and how to work out whether a given system is stable. In full generality this can be extremely hard for interesting systems. But stability questions are often simpler for linear systems, where they reduce to polynomial root-finding, i.e. checking whether the roots of the characteristic polynomial lie in the stable region (perversely, polynomial root-finding is itself a famously chaotic system).
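
As a toy illustration of the linear case (my own sketch, not drawn from the references): for a continuous-time linear system $\dot{x} = A x$, asymptotic stability is decided by the roots of the characteristic polynomial of $A$, i.e. its eigenvalues, which all need strictly negative real part.

```python
import numpy as np

def is_asymptotically_stable(A: np.ndarray) -> bool:
    """Check asymptotic stability of the linear system dx/dt = A x.

    Stable iff every eigenvalue of A (equivalently, every root of its
    characteristic polynomial) has strictly negative real part.
    """
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# A lightly damped oscillator: stable.
A_stable = np.array([[0.0, 1.0],
                     [-1.0, -0.1]])
# Flip the sign of the damping term: unstable.
A_unstable = np.array([[0.0, 1.0],
                       [-1.0, 0.1]])

print(is_asymptotically_stable(A_stable))    # True
print(is_asymptotically_stable(A_unstable))  # False
```

(For a discrete-time system $x_{t+1} = A x_t$ the same check uses the unit circle, `np.all(np.abs(eigvals) < 1)`, rather than the left half-plane.)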

In a general setting, we should probably look at quantities like Lyapunov exponents to quantify stability; roughly, these measure the average exponential rate at which nearby trajectories diverge, so a positive largest exponent signals sensitivity to initial conditions.
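
As a concrete sketch of how one might estimate such a thing numerically (a toy example of my own, not taken from the references): for a one-dimensional map the largest Lyapunov exponent is the long-run average of $\log|f'(x_t)|$ along an orbit, which we can compute directly for the logistic map.

```python
import numpy as np

def logistic_lyapunov(r: float, x0: float = 0.3,
                      n_burn: int = 1_000, n_iter: int = 100_000) -> float:
    """Estimate the Lyapunov exponent of the logistic map x -> r x (1 - x).

    Averages log|f'(x_t)| = log|r (1 - 2 x_t)| along an orbit.
    Positive => nearby orbits diverge exponentially (chaos);
    negative => they converge (stable in the informal sense above).
    """
    x = x0
    # Discard transients so we average over the attractor.
    for _ in range(n_burn):
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += np.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n_iter

print(logistic_lyapunov(3.2))  # negative: a stable period-2 cycle
print(logistic_lyapunov(4.0))  # roughly log(2) > 0: fully chaotic
```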


There are interesting connections here: we can also think about the relationships between stability, ergodicity, and criticality. Considering the stability of neural networks turns out to produce some nice ideas.
