Stability in dynamical systems

Lyapunov exponents and ilk



A placeholder.

Informally, I am admitting as "stable" any dynamical system that does not explode super-polynomially fast; we can think of these as systems where, even if the state is not stationary, at least the rate of change might be.

Here I would like to think about how to parameterize stable systems, and how to discover whether a given system is stable. In a general context this can be extremely hard for interesting systems. But stability questions are often simpler in the context of linear systems, where they reduce to polynomial root-finding (perversely, polynomial root-finding is itself a famously chaotic system).
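To make the linear case concrete, here is a minimal sketch (example matrices of my own devising): a discrete-time linear system \(x_{t+1} = A x_t\) is stable exactly when all roots of the characteristic polynomial of \(A\), i.e. its eigenvalues, lie inside the unit circle.

```python
import numpy as np

# For x_{t+1} = A x_t, stability is governed by the roots of
# det(A - lambda I), i.e. the eigenvalues of A. Spectral radius
# below 1 means trajectories decay; above 1 means exponential blow-up.

def spectral_radius(A):
    """Largest eigenvalue magnitude of A."""
    return max(abs(np.linalg.eigvals(A)))

A_stable = np.array([[0.5, 0.1],
                     [0.0, 0.9]])   # eigenvalues 0.5, 0.9 -> stable
A_unstable = np.array([[1.1, 0.0],
                       [0.2, 0.7]])  # eigenvalues 1.1, 0.7 -> unstable

print(spectral_radius(A_stable))    # 0.9, inside the unit circle
print(spectral_radius(A_unstable))  # 1.1, outside the unit circle
```

The same criterion with the unit circle replaced by the left half-plane handles the continuous-time system \(\dot{x} = A x\).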

In a general setting we should probably look at quantities like Lyapunov exponents to quantify stability.
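As a toy illustration (my own example, not from any of the cited works), the largest Lyapunov exponent of a one-dimensional map can be estimated as the long-run average of \(\log|f'(x_t)|\) along an orbit; a positive value indicates sensitive dependence on initial conditions.

```python
import math

def logistic_lyapunov(r, x0=0.3, n_burn=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r x (1 - x),
    averaging log|f'(x_t)| with f'(x) = r (1 - 2x) along a long orbit."""
    x = x0
    for _ in range(n_burn):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n_iter

print(logistic_lyapunov(3.2))  # negative: orbit settles on a stable 2-cycle
print(logistic_lyapunov(4.0))  # near log 2: the fully chaotic regime
```

For higher-dimensional or data-driven systems the analogous computation needs Jacobians and repeated QR re-orthonormalization, which is where things get interesting.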

There are interesting connections here: we can also think about the relationships between stability, ergodicity, and criticality. Considering the stability of neural networks turns out to produce some nice ideas.

