Reservoir Computing

March 28, 2022

dynamical systems
feature construction
machine learning
networks
neural nets
probabilistic algorithms
statmech
stochastic processes

I am familiar with reservoir computing only from random NNs, but there is a whole weird literature on training reservoir models that lives in a mirror world to classic gradient-descent-trained predictive models. It seems to lean more heavily on the statistical mechanics of statistics than I am accustomed to.
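For concreteness, here is a minimal sketch of the classic echo-state-network recipe: a fixed random recurrent reservoir, with only a linear readout fitted by ridge regression. The sizes, the spectral-radius scaling, and the toy prediction task are all placeholder choices of mine, not taken from any of the papers below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder sizes, not from any particular paper.
n_in, n_res, n_steps = 1, 200, 2000

# Fixed random reservoir: these weights are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u, collect the states."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states[t] = x
    return states

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.1 * np.arange(n_steps))
X = run_reservoir(u[:-1])
y = u[1:]

# Ridge-regression readout: the only trained parameters.
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)
print("train RMSE:", np.sqrt(np.mean((X @ W_out - y) ** 2)))
```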

Gauthier et al. (2021) introduce modern training methods for these things; there is a popular write-up, Scientists develop the next generation of reservoir computing.
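The "next generation" trick, as I understand Gauthier et al. (2021), is to discard the random reservoir entirely in favour of time-delay coordinates plus polynomial features, keeping the same ridge-regression readout. A rough sketch in that flavour, with made-up sizes and a made-up task:

```python
import numpy as np
from itertools import combinations_with_replacement

k, reg = 4, 1e-6  # delay taps and ridge penalty: arbitrary choices

def features(u, k):
    """Delay coordinates plus their quadratic monomials, with a bias term."""
    T = len(u) - k
    lin = np.column_stack([u[i:i + T] for i in range(k)])
    quad = np.column_stack(
        [lin[:, i] * lin[:, j]
         for i, j in combinations_with_replacement(range(k), 2)])
    return np.hstack([np.ones((T, 1)), lin, quad])

# Toy signal: sum of two incommensurate sinusoids.
u = np.sin(0.1 * np.arange(2000)) + 0.5 * np.sin(0.23 * np.arange(2000))
X = features(u, k)
y = u[k:]  # one-step-ahead target

W_out = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ y)
print("train RMSE:", np.sqrt(np.mean((X @ W_out - y) ** 2)))
```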

Kim and Bassett (2022) show how to program them:

we turn the analogy between neural computation and silicon computation into a concrete reality by programming fundamental constructs from computer science into reservoir computers. First, we extend the idea of static memory in silicon computers to program chaotic dynamical systems as random access memories (dRAM). Second, because RCs can store dynamical systems as memories, and the RC itself is a dynamical system, we demonstrate that a host RC can virtualize the time-evolution of a guest RC, precisely as a host silicon computer can create a virtual machine of a guest computer. Third, we provide a concrete implementation of a fully neural logical calculus by programming RCs to evolve as the logic gates and, nand, or, nor, xor, and xnor, and construct neural implementations of logic circuits such as a binary adder, flip-flop latch, and multivibrator circuit. Finally, we define a simple scheme for software and game development on RC architectures by programming an RC to simulate a variant of the game “pong.” Through this language, we define a concrete, practical, and fully generalizable implementation of neural computation.
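Kim and Bassett derive their gates analytically, by programming the reservoir weights directly; I have not reproduced that here. As a crude trained stand-in for the idea, though, one can fit a readout so that a generic random reservoir tracks, say, XOR of two binary input channels:

```python
import numpy as np

rng = np.random.default_rng(1)
n_res = 100  # placeholder reservoir size

W_in = rng.uniform(-1, 1, (n_res, 2))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.8 / np.abs(np.linalg.eigvals(W)).max()

def run(U):
    """Drive the reservoir with a sequence of 2-d inputs."""
    x = np.zeros(n_res)
    states = np.empty((len(U), n_res))
    for t, u in enumerate(U):
        x = np.tanh(W_in @ u + W @ x)
        states[t] = x
    return states

# Random bit pairs, each held for several steps so the state can settle;
# accuracy suffers a little in the transient right after each flip.
hold = 10
bits = rng.integers(0, 2, (500, 2))
U = np.repeat(bits, hold, axis=0).astype(float)
y = np.repeat(bits[:, 0] ^ bits[:, 1], hold).astype(float)

X = run(U)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
print("XOR accuracy:", np.mean((X @ W_out > 0.5) == (y > 0.5)))
```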

References

Gauthier, Bollt, Griffith, et al. 2021. “Next Generation Reservoir Computing.” Nature Communications.
Goudarzi, Banda, Lakin, et al. 2014. “A Comparative Study of Reservoir Computing for Temporal Signal Processing.” arXiv:1401.2224 [cs].
Goudarzi, and Teuscher. 2016. “Reservoir Computing: Quo Vadis?” In Proceedings of the 3rd ACM International Conference on Nanoscale Computing and Communication. NANOCOM’16.
Kim, and Bassett. 2022. “A Neural Programming Language for the Reservoir Computer.” arXiv:2203.05032 [cond-mat, physics:nlin].
Lukoševičius, and Jaeger. 2009. “Reservoir Computing Approaches to Recurrent Neural Network Training.” Computer Science Review.
Pathak, Hunt, Girvan, et al. 2018. “Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach.” Physical Review Letters.
Pathak, Lu, Hunt, et al. 2017. “Using Machine Learning to Replicate Chaotic Attractors and Calculate Lyapunov Exponents from Data.” Chaos: An Interdisciplinary Journal of Nonlinear Science.
Triefenbach, Jalalvand, Demuynck, et al. 2013. “Acoustic Modeling With Hierarchical Reservoirs.” IEEE Transactions on Audio, Speech, and Language Processing.