# Stability in linear dynamical systems

This Bodes well

July 19, 2019 — February 16, 2021

Notes at the intersection of linear dynamical systems and the stability theory of dynamic systems.

Related: detecting non-stationarity.

There is not much content here because I spent 2 years working on it and am too traumatised to revisit it.

Informally, I am admitting as “stable” any dynamical system which does not explode super-polynomially fast; we can think of these as systems where, even if the system is not stationary, at least the rate of change might be.

Energy-preserving systems are a special case of this.

There are many problems I am interested in that touch upon this.

## 1 Pole representations

In the univariate, discrete-time case, in linear systems terms, these are systems whose transfer functions have no poles outside the unit circle, but which might have poles *on* the unit circle. In continuous time, these are systems with no poles of positive real part. For finitely realisable systems this boils down to tracking trigonometric roots, e.g. Megretski (2003).
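A minimal numerical sketch of the discrete-time criterion (the function name and AR-coefficient convention here are my own, not from any of the cited papers): treat a univariate system as an AR(p) recursion, find its poles as polynomial roots, and admit it as “stable” in the loose sense above iff no pole lies outside the unit circle.

```python
import numpy as np

def is_stable(a, tol=1e-9):
    """Check (marginal) stability of a univariate discrete-time linear system.

    `a` holds the AR coefficients of x_t = a[0] x_{t-1} + ... + a[p-1] x_{t-p};
    the system's poles are the roots of z^p - a[0] z^{p-1} - ... - a[p-1].
    Stable in the loose sense used here iff no pole lies outside the unit circle
    (poles *on* the circle are admitted).
    """
    poles = np.roots(np.concatenate(([1.0], -np.asarray(a, dtype=float))))
    return bool(np.all(np.abs(poles) <= 1.0 + tol))

print(is_stable([0.5]))  # AR(1), pole at 0.5: True
print(is_stable([1.0]))  # random walk, pole *on* the unit circle: still admitted, True
print(is_stable([1.1]))  # explosive, pole at 1.1: False
```

Note the `tol` slack: poles exactly on the unit circle sit at the edge of floating-point root-finding accuracy, so a strict comparison would misclassify marginal systems.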

In a multivariate context we might consider eigenvalues of the transfer matrix in a similar light.

van Handel (2017), for example, mentions the standard result that the eigenvalues of a symmetric matrix \(X\) are the roots of the characteristic polynomial \(\chi(t)=\operatorname{det}(t I-X)\) and, equivalently, the poles of the Stieltjes transform \(s(t):=\operatorname{Tr}\left[(t I-X)^{-1}\right]=\frac{d}{d t} \log \chi(t)\).
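The eigenvalue/characteristic-polynomial half of that equivalence is easy to check numerically (this is just a sanity-check sketch, not anything from van Handel's paper):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
X = (A + A.T) / 2  # a symmetric matrix has real eigenvalues

# Eigenvalues computed directly...
eigs = np.sort(np.linalg.eigvalsh(X))

# ...agree with the roots of the characteristic polynomial det(tI - X),
# whose coefficients np.poly extracts from the matrix.
roots = np.sort(np.roots(np.poly(X)).real)

print(np.allclose(eigs, roots))  # True
```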

## 2 Reparameterisation

We can use cunning reparameterisation to keep systems stable. This Betancourt podcast episode on Sarah Heaps’ paper (Heaps 2020) on parameterising stationarity in vector autoregressions is deep and, IMO, points the way to some other neat tricks in neural nets. She constructs interesting priors for this case, using some reparametrisations by Ansley and Kohn (1986).
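To give the flavour of the trick in the simplest setting: here is a sketch of the scalar analogue of that reparameterisation (the vector-autoregression machinery in Heaps 2020 and Ansley and Kohn 1986 is considerably more involved; the function below is my own illustrative construction). Squash each unconstrained real into a partial autocorrelation in \((-1, 1)\), then run the Durbin–Levinson recursion to recover AR coefficients, which are stationary by construction, so any optimiser can roam freely in the unconstrained space.

```python
import numpy as np

def unconstrained_to_ar(z):
    """Map unconstrained reals to the coefficients of a stationary AR(p) model.

    tanh squashes each real into a partial autocorrelation in (-1, 1); the
    Durbin-Levinson recursion then converts partial autocorrelations into AR
    coefficients. Every point of R^p maps to a stationary model, so gradient
    steps in z-space can never leave the stationary region.
    """
    pacf = np.tanh(np.asarray(z, dtype=float))
    phi = np.empty(0)
    for r in pacf:
        # Durbin-Levinson update: phi_j <- phi_j - r * phi_{reversed}, append r.
        phi = np.concatenate((phi - r * phi[::-1], [r]))
    return phi

# Arbitrary unconstrained parameters still yield poles inside the unit circle.
phi = unconstrained_to_ar([2.0, -1.0, 0.5])
poles = np.roots(np.concatenate(([1.0], -phi)))
print(np.all(np.abs(poles) < 1))  # True
```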

Maybe related: Roy, McElroy, and Linton (2019).

## 3 Continuous time

TBC.

## 4 Stability and gradient descent

What if we are incrementally learning a system and wish the gradient descent steps not to push it away from stability? In such a case, we can possibly sidestep the problem by using a topology which maximises system stability (Laroche 2007).
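A cruder, generic safeguard (not Laroche's topology-based approach, just a projected-gradient sketch of my own): after each gradient step, shrink the state matrix whenever its spectral norm exceeds the target radius. Since the spectral radius is bounded by the spectral norm, the projected system cannot explode.

```python
import numpy as np

def project_stable(A, radius=1.0):
    """Rescale a state-transition matrix so its spectral norm is at most `radius`.

    Intended as a post-hoc projection after a gradient update: if the largest
    singular value of A exceeds `radius`, shrink A uniformly. Because the
    spectral radius never exceeds the spectral norm, the result is stable.
    """
    s = np.linalg.norm(A, ord=2)  # largest singular value
    return A if s <= radius else A * (radius / s)

# A matrix nudged outside the stable region by a gradient step...
A = np.array([[1.3, 0.2], [0.0, 0.9]])
A_proj = project_stable(A)
# ...is pulled back to spectral norm <= 1.
print(np.linalg.norm(A_proj, ord=2) <= 1.0 + 1e-12)  # True
```

The uniform rescaling is blunt (it also shrinks directions that were not the problem), which is part of why the parameterisation-based approaches of the previous section are more appealing.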

## 5 References

*Proceedings of the 29th International Conference on International Conference on Machine Learning*. ICML’12.

*Proceedings of the 39th International Conference on Machine Learning*.

*Journal of Statistical Computation and Simulation*.

*SIAM Journal on Control and Optimization*.

*IEEE Transactions on Signal Processing*.

*Proceedings of the 31st International Conference on Machine Learning*.

*SIAM Journal on Control and Optimization*.

*Positive trigonometric polynomials and signal processing applications*. Signals and communication technology.

*Annals of Mathematics*.

*Advances in Neural Information Processing Systems*.

*The Journal of Machine Learning Research*.

*arXiv:2004.09455 [Stat]*.

*Algorithmic Learning Theory*. Lecture Notes in Computer Science.

*Journal of the Audio Engineering Society*.

*JMLR*.

*IEEE Signal Processing Magazine*.

*Proceedings of the Royal Society of London*.

*42nd IEEE International Conference on Decision and Control (IEEE Cat. No.03CH37475)*.

*Perspectives in Robust Control*. Lecture Notes in Control and Information Sciences.

*SIAM Review*.

*Statistica Sinica*.

*Automatica*.

*arXiv:1802.08334 [Cs, Math, Stat]*.

*Journal of Physics A: Mathematical and General*.

*Convexity and Concentration*. The IMA Volumes in Mathematics and Its Applications.