How should learning machines best store and retrieve memories? The question is important in reinforcement learning, implicit in recurrent networks, one of the chief advantages of neural Turing machines, and an apparent great success of transformers.
But, as my colleague Tom Blau points out, it is perhaps best considered as a topic in its own right.
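A common thread running through several of these mechanisms (memory networks, neural Turing machine read heads, transformer attention) is differentiable content-based addressing: compare a query key against every stored slot, turn the similarities into a soft attention distribution, and read out a weighted combination. Below is a minimal numpy sketch of that pattern; the function names and parameters are my own illustration, not the reference implementation of any of the cited papers.

```python
# Toy content-based memory read, in the spirit of memory networks and
# NTM-style addressing. Illustrative only; not from any cited paper.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def content_read(memory, key, beta=1.0):
    """Read from memory by cosine similarity to a query key.

    memory: (n_slots, d) array of stored vectors
    key:    (d,) query vector
    beta:   sharpness of the attention distribution
    """
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    similarity = memory @ key / norms      # cosine similarity per slot
    weights = softmax(beta * similarity)   # differentiable "address"
    return weights @ memory, weights       # soft recall + attention weights

# Usage: store three vectors, query with a noisy copy of the second.
rng = np.random.default_rng(0)
M = rng.normal(size=(3, 4))
recalled, w = content_read(M, M[1] + 0.05 * rng.normal(size=4), beta=5.0)
print(w)  # attention should concentrate on slot 1
```

Because every step is differentiable, gradients flow through the read operation, which is what lets these memory schemes be trained end to end rather than managed by hand-written lookup logic.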
References
Charles, Adam, Dong Yin, and Christopher Rozell. 2016. “Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks.” arXiv:1605.08346 [Cs, Math, Stat], May.
Gordo, Albert, Jon Almazan, Jerome Revaud, and Diane Larlus. 2016. “End-to-End Learning of Deep Visual Representations for Image Retrieval.” arXiv:1610.07940 [Cs], October.
Graves, Alex, Greg Wayne, Malcolm Reynolds, Tim Harley, Ivo Danihelka, Agnieszka Grabska-Barwińska, Sergio Gómez Colmenarejo, et al. 2016. “Hybrid Computing Using a Neural Network with Dynamic External Memory.” Nature 538 (7626): 471–76.
Grefenstette, Edward, Karl Moritz Hermann, Mustafa Suleyman, and Phil Blunsom. 2015. “Learning to Transduce with Unbounded Memory.” arXiv:1506.02516 [Cs], June.
Hochreiter, Sepp, and Jürgen Schmidhuber. 1997. “Long Short-Term Memory.” Neural Computation 9 (8): 1735–80.
Munkhdalai, Tsendsuren, Alessandro Sordoni, Tong Wang, and Adam Trischler. 2019. “Metalearned Neural Memory.” In Advances in Neural Information Processing Systems.
Nagathan, Arvind, Jitendranath Mungara, and Manimozhi. 2014. “Content-Based Image Retrieval System Using Feed-Forward Backpropagation Neural Network.” International Journal of Computer Science and Network Security (IJCSNS) 14 (6): 70.
Patraucean, Viorica, Ankur Handa, and Roberto Cipolla. 2015. “Spatio-Temporal Video Autoencoder with Differentiable Memory.” arXiv:1511.06309 [Cs], November.
Perez, Julien, and Fei Liu. 2016. “Gated End-to-End Memory Networks.” arXiv:1610.04211 [Cs, Stat], October.
Voelker, Aaron R., Ivana Kajić, and Chris Eliasmith. 2019. “Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks.” In Advances in Neural Information Processing Systems.
Weiss, Gail, Yoav Goldberg, and Eran Yahav. 2018. “Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples.” In International Conference on Machine Learning, 5247–56. PMLR.
Weston, Jason, Sumit Chopra, and Antoine Bordes. 2014. “Memory Networks.” arXiv:1410.3916 [Cs, Stat], October.
Zhan, Jingtao, Xiaohui Xie, Jiaxin Mao, Yiqun Liu, Jiafeng Guo, Min Zhang, and Shaoping Ma. 2022. “Evaluating Interpolation and Extrapolation Performance of Neural Retrieval Models.” In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, 2486–96. CIKM ’22. New York, NY, USA: Association for Computing Machinery.