Memory in machine learning


How best should learning mechanisms store and retrieve memories? Important in reinforcement learning. Implicit in recurrent networks. One of the chief advantages of neural Turing machines. A great apparent success of transformers.

But, as my colleague Tom Blau points out, it is perhaps best considered as a topic in its own right.
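A common thread running through several of the mechanisms above (neural Turing machine addressing, memory networks, attention in transformers) is differentiable content-based lookup: compare a query key against every memory slot and read back a similarity-weighted blend, so the whole read is trainable by gradient descent. A minimal NumPy sketch of that idea — the function name, toy memory, and sharpness parameter `beta` are my own illustration, not any particular paper's formulation:

```python
import numpy as np

def content_read(memory, key, beta=1.0):
    """Read from a memory matrix by content:
    softmax over cosine similarity between key and each slot."""
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    sims = memory @ key / norms    # cosine similarity per memory slot
    w = np.exp(beta * sims)        # beta sharpens the addressing
    w /= w.sum()                   # addressing weights sum to 1
    return w @ memory              # similarity-weighted read vector

# Three memory slots; a key near the first slot retrieves mostly its content.
M = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
r = content_read(M, np.array([1.0, 0.1]), beta=5.0)
```

As `beta` grows, the softmax approaches a hard nearest-neighbour lookup; at small `beta` the read blurs across slots, which is what keeps the operation differentiable.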

References

Charles, Adam, Dong Yin, and Christopher Rozell. 2016. “Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks.” May 26, 2016. http://arxiv.org/abs/1605.08346.
Gordo, Albert, Jon Almazan, Jerome Revaud, and Diane Larlus. 2016. “End-to-End Learning of Deep Visual Representations for Image Retrieval.” October 25, 2016. http://arxiv.org/abs/1610.07940.
Graves, Alex, Greg Wayne, Malcolm Reynolds, Tim Harley, Ivo Danihelka, Agnieszka Grabska-Barwińska, Sergio Gómez Colmenarejo, et al. 2016. “Hybrid Computing Using a Neural Network with Dynamic External Memory.” Nature 538 (7626): 471–76. https://doi.org/10.1038/nature20101.
Grefenstette, Edward, Karl Moritz Hermann, Mustafa Suleyman, and Phil Blunsom. 2015. “Learning to Transduce with Unbounded Memory.” June 8, 2015. http://arxiv.org/abs/1506.02516.
Hochreiter, Sepp, and Jürgen Schmidhuber. 1997. “Long Short-Term Memory.” Neural Computation 9 (8): 1735–80. https://doi.org/10.1162/neco.1997.9.8.1735.
Munkhdalai, Tsendsuren, Alessandro Sordoni, Tong Wang, and Adam Trischler. 2019. “Metalearned Neural Memory.” In Advances In Neural Information Processing Systems. http://arxiv.org/abs/1907.09720.
Nagathan, Arvind, Jitendranath Mungara, and Manimozhi. 2014. “Content-Based Image Retrieval System Using Feed-Forward Backpropagation Neural Network.” International Journal of Computer Science and Network Security (IJCSNS) 14 (6): 70. http://paper.ijcsns.org/07_book/html/201406/201406013.html.
Patraucean, Viorica, Ankur Handa, and Roberto Cipolla. 2015. “Spatio-Temporal Video Autoencoder with Differentiable Memory.” November 19, 2015. http://arxiv.org/abs/1511.06309.
Perez, Julien, and Fei Liu. 2016. “Gated End-to-End Memory Networks.” October 13, 2016. http://arxiv.org/abs/1610.04211.
Voelker, Aaron R., Ivana Kajić, and Chris Eliasmith. 2019. “Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks.” In Advances in Neural Information Processing Systems.
Weiss, Gail, Yoav Goldberg, and Eran Yahav. 2018. “Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples.” In International Conference on Machine Learning, 5247–56. PMLR. http://proceedings.mlr.press/v80/weiss18a.html.
Weston, Jason, Sumit Chopra, and Antoine Bordes. 2014. “Memory Networks.” October 14, 2014. http://arxiv.org/abs/1410.3916.
