Memory in machine learning

March 3, 2021

language
machine learning
neural nets
NLP

How best should learning mechanisms store and retrieve memories? The question is important in reinforcement learning, implicit in recurrent networks, one of the chief advantages of neural Turing machines, and a great apparent success of transformers.
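A common thread in several of these mechanisms (neural Turing machines, memory networks, transformer attention) is content-based addressing: a read is a similarity-weighted mixture over stored memory rows rather than a lookup at a fixed address. A minimal sketch of such a read, with the function name and the sharpness parameter `beta` chosen here for illustration:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax.
    e = np.exp(x - x.max())
    return e / e.sum()

def content_read(memory, key, beta=1.0):
    """Content-based read: weight each memory row by its cosine
    similarity to the query key, sharpened by beta, then return
    the convex combination of rows (as in NTM-style addressing)."""
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    w = softmax(beta * sims)
    return w @ memory

# Toy memory with three stored vectors; the key is closest to row 0,
# so the read vector leans heavily toward [1, 0].
M = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
r = content_read(M, np.array([1.0, 0.0]), beta=5.0)
```

Differentiability is the point: because the read is a soft mixture, gradients flow back through the addressing weights, so what to store and retrieve can be learned end to end.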

But, as my colleague Tom Blau points out, it is perhaps best considered as a topic in its own right.

1 References

Charles, Yin, and Rozell. 2016. “Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks.” arXiv:1605.08346 [Cs, Math, Stat].
Gordo, Almazan, Revaud, et al. 2016. “End-to-End Learning of Deep Visual Representations for Image Retrieval.” arXiv:1610.07940 [Cs].
Graves, Wayne, Reynolds, et al. 2016. “Hybrid Computing Using a Neural Network with Dynamic External Memory.” Nature.
Grefenstette, Hermann, Suleyman, et al. 2015. “Learning to Transduce with Unbounded Memory.” arXiv:1506.02516 [Cs].
Hochreiter, and Schmidhuber. 1997. “Long Short-Term Memory.” Neural Computation.
Munkhdalai, Sordoni, Wang, et al. 2019. “Metalearned Neural Memory.” In Advances in Neural Information Processing Systems.
Nagathan, Mungara, and Manimozhi. 2014. “Content-Based Image Retrieval System Using Feed-Forward Backpropagation Neural Network.” International Journal of Computer Science and Network Security (IJCSNS).
Patraucean, Handa, and Cipolla. 2015. “Spatio-Temporal Video Autoencoder with Differentiable Memory.” arXiv:1511.06309 [Cs].
Perez, and Liu. 2016. “Gated End-to-End Memory Networks.” arXiv:1610.04211 [Cs, Stat].
Voelker, Kajic, and Eliasmith. n.d. “Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks.”
Weiss, Goldberg, and Yahav. 2018. “Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples.” In International Conference on Machine Learning.
Weston, Chopra, and Bordes. 2014. “Memory Networks.” arXiv:1410.3916 [Cs, Stat].
Zhan, Xie, Mao, et al. 2022. “Evaluating Interpolation and Extrapolation Performance of Neural Retrieval Models.” In Proceedings of the 31st ACM International Conference on Information & Knowledge Management. CIKM ’22.