Reservoir Computing



I am familiar with reservoir computing only via random NNs, but there is a whole literature on training reservoir models that lives in a strange mirror world to classic gradient-descent-trained predictive models, and which seems to lean more heavily on the statistical mechanics of statistics than I am accustomed to.
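
To fix ideas, here is a minimal sketch of the classic echo-state-network recipe as I understand it: a fixed random recurrent reservoir is driven by the input, and only a linear readout is fitted, typically by ridge regression. The specifics below (reservoir size, leak rate, spectral radius, the toy sine-prediction task) are illustrative choices of mine, not anything canonical.

```python
# Minimal echo-state-network sketch: fixed random reservoir, trained linear readout.
# All hyperparameters here are illustrative guesses, not canonical values.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0, 20 * np.pi, 2000))  # input signal
y = np.roll(u, -1)                            # target: next value

n_res, leak, rho = 200, 0.3, 0.9
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))
W = rng.normal(size=(n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))     # rescale to spectral radius rho

# Run the reservoir; its recurrent weights are never trained.
x = np.zeros(n_res)
states = np.empty((len(u), n_res))
for t, u_t in enumerate(u):
    x = (1 - leak) * x + leak * np.tanh(W_in[:, 0] * u_t + W @ x)
    states[t] = x

# Fit only the readout by ridge regression, discarding a washout period.
washout, ridge = 100, 1e-6
X, Y = states[washout:-1], y[washout:-1]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

pred = states[:-1] @ W_out
print("MSE:", np.mean((pred[washout:] - y[washout:-1]) ** 2))
```

The mirror-world flavour is visible in the last few lines: the only "learning" is a single linear solve.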

Gauthier et al. (2021) introduce modern training methods for these things; see also the accompanying press coverage, Scientists develop the next generation of reservoir computing.
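
As I read it, the "next generation" recipe drops the random reservoir entirely and replaces it with a nonlinear vector autoregression: the feature vector is a few time-delayed copies of the input plus low-order polynomial combinations of them, again with a ridge-regression readout. A hedged sketch of that idea follows; the delay count, polynomial order, and ridge strength are my own guesses rather than values from the paper.

```python
# Sketch of the NVAR / "next generation reservoir computing" idea:
# delayed copies of the input plus quadratic terms, linear readout.
# Hyperparameters below are guesses, not the paper's settings.
import numpy as np
from itertools import combinations_with_replacement

def nvar_features(u, k):
    """Stack k delayed copies of u and all quadratic monomials of them."""
    lin = np.stack([np.roll(u, d) for d in range(k)], axis=1)[k:]  # drop warm-up rows
    quad = np.stack([lin[:, i] * lin[:, j]
                     for i, j in combinations_with_replacement(range(k), 2)], axis=1)
    return np.hstack([np.ones((len(lin), 1)), lin, quad])

u = np.sin(np.linspace(0, 20 * np.pi, 2000))
k = 4
X = nvar_features(u, k)[:-1]
y = u[k + 1:]                  # target: next value of the signal

ridge = 1e-8
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
print("MSE:", np.mean((X @ W_out - y) ** 2))
```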

Kim and Bassett (2022) show how to program them:

we turn the analogy between neural computation and silicon computation into a concrete reality by programming fundamental constructs from computer science into reservoir computers. First, we extend the idea of static memory in silicon computers to program chaotic dynamical systems as random access memories (dRAM). Second, because RCs can store dynamical systems as memories, and the RC itself is a dynamical system, we demonstrate that a host RC can virtualize the time-evolution of a guest RC, precisely as a host silicon computer can create a virtual machine of a guest computer. Third, we provide a concrete implementation of a fully neural logical calculus by programming RCs to evolve as the logic gates and, nand, or, nor, xor, and xnor, and construct neural implementations of logic circuits such as a binary adder, flip-flop latch, and multivibrator circuit. Finally, we define a simple scheme for software and game development on RC architectures by programming an RC to simulate a variant of the game “pong.” Through this language, we define a concrete, practical, and fully generalizable implementation of neural computation.
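
I have not reproduced their constructions, which program the reservoir weights analytically rather than training them. As a much cruder stand-in, the sketch below merely trains an ordinary echo-state readout to behave as an XOR gate on piecewise-constant binary inputs, just to make the "logic gate as driven dynamical system" idea concrete; this is not Kim and Bassett's method, and every parameter is my own guess.

```python
# Crude stand-in: *train* a reservoir readout to act as an XOR gate on
# piecewise-constant binary inputs. NOT Kim & Bassett's analytic programming;
# just a sketch of a logic gate realized by a driven dynamical system.
import numpy as np

rng = np.random.default_rng(1)
n_res, hold, rho = 300, 20, 0.9

W_in = rng.uniform(-1, 1, size=(n_res, 2))
W = rng.normal(size=(n_res, n_res))
W *= rho / max(abs(np.linalg.eigvals(W)))

def run(bits_a, bits_b):
    """Drive the reservoir with each (a, b) pair held for `hold` steps;
    return the reservoir state at the end of each hold period."""
    x = np.zeros(n_res)
    out = []
    for a, b in zip(bits_a, bits_b):
        for _ in range(hold):
            x = np.tanh(W_in @ np.array([a, b]) + W @ x)
        out.append(x.copy())
    return np.array(out)

# Training data: random bit pairs; target is XOR of each pair.
a = rng.integers(0, 2, 500)
b = rng.integers(0, 2, 500)
X = run(a, b)
y = (a ^ b).astype(float)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

# Check a sequence covering all four input combinations.
ta, tb = np.array([0, 0, 1, 1]), np.array([0, 1, 0, 1])
pred = run(ta, tb) @ W_out
for i in range(4):
    print((ta[i], tb[i]), "->", int(pred[i] > 0.5))
```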

References

Gauthier, Daniel J., Erik Bollt, Aaron Griffith, and Wendson A. S. Barbosa. 2021. “Next Generation Reservoir Computing.” Nature Communications 12 (1): 5564.
Goudarzi, Alireza, Peter Banda, Matthew R. Lakin, Christof Teuscher, and Darko Stefanovic. 2014. “A Comparative Study of Reservoir Computing for Temporal Signal Processing.” arXiv:1401.2224 [cs], January.
Goudarzi, Alireza, and Christof Teuscher. 2016. “Reservoir Computing: Quo Vadis?” In Proceedings of the 3rd ACM International Conference on Nanoscale Computing and Communication, 13:1–6. NANOCOM’16. New York, NY, USA: ACM.
Kim, Jason Z., and Dani S. Bassett. 2022. “A Neural Programming Language for the Reservoir Computer.” arXiv:2203.05032 [cond-mat, physics:nlin], March.
Lukoševičius, Mantas, and Herbert Jaeger. 2009. “Reservoir Computing Approaches to Recurrent Neural Network Training.” Computer Science Review 3 (3): 127–49.
Pathak, Jaideep, Brian Hunt, Michelle Girvan, Zhixin Lu, and Edward Ott. 2018. “Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach.” Physical Review Letters 120 (2): 024102.
Pathak, Jaideep, Zhixin Lu, Brian R. Hunt, Michelle Girvan, and Edward Ott. 2017. “Using Machine Learning to Replicate Chaotic Attractors and Calculate Lyapunov Exponents from Data.” Chaos: An Interdisciplinary Journal of Nonlinear Science 27 (12): 121102.
Triefenbach, F., A. Jalalvand, K. Demuynck, and J. P. Martens. 2013. “Acoustic Modeling With Hierarchical Reservoirs.” IEEE Transactions on Audio, Speech, and Language Processing 21 (11): 2439–50.
