Manara - Qatar Research Repository

Rotational Unit of Memory: A Novel Representation Unit for RNNs with Scalable Applications

Journal contribution, submitted on 2024-05-28, 07:08 and posted on 2024-05-28, 07:08. Authored by Rumen Dangovski, Li Jing, Preslav Nakov, Mićo Tatalović, Marin Soljačić.

Stacking long short-term memory (LSTM) cells or gated recurrent units (GRUs) as part of a recurrent neural network (RNN) has become a standard approach to solving a number of tasks ranging from language modeling to text summarization. Although LSTMs and GRUs were designed to model long-range dependencies more accurately than conventional RNNs, they nevertheless have problems copying or recalling information from the distant past. Here, we derive a phase-coded representation of the memory state, Rotational Unit of Memory (RUM), that unifies the concepts of unitary learning and associative memory. We show experimentally that RNNs based on RUMs can solve basic sequential tasks such as memory copying and memory recall much better than LSTMs/GRUs. We further demonstrate that by replacing LSTM/GRU with RUM units we can apply neural networks to real-world problems such as language modeling and text summarization, yielding results comparable to the state of the art.
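
As an illustration of the rotational idea described in the abstract, the following Python/NumPy sketch constructs an orthogonal rotation in the plane spanned by two vectors and applies it inside a toy recurrent step. The function names (rotation_matrix, rum_like_step) and the exact update rule are assumptions made for illustration only; they are not the authors' published equations or released code.

    import numpy as np

    def rotation_matrix(a, b, eps=1e-8):
        """Orthogonal matrix rotating within span{a, b} by the angle between a and b."""
        n = a.shape[0]
        norm_a, norm_b = np.linalg.norm(a), np.linalg.norm(b)
        if norm_a < eps or norm_b < eps:
            return np.eye(n)                  # degenerate input: no rotation
        u = a / norm_a
        v = b - (u @ b) * u                   # component of b orthogonal to u (Gram-Schmidt)
        v_norm = np.linalg.norm(v)
        if v_norm < eps:
            return np.eye(n)                  # a and b (anti)parallel: no rotation plane
        v = v / v_norm
        cos_t = np.clip((a @ b) / (norm_a * norm_b), -1.0, 1.0)
        sin_t = np.sqrt(1.0 - cos_t ** 2)
        plane = np.stack([u, v], axis=1)      # (n, 2) basis of the rotation plane
        givens = np.array([[cos_t, -sin_t],
                           [sin_t,  cos_t]])
        # Identity outside the plane, 2-D rotation inside it.
        return np.eye(n) - np.outer(u, u) - np.outer(v, v) + plane @ givens @ plane.T

    def rum_like_step(x_t, h_prev, W_x, W_h, b):
        """One illustrative recurrent step: rotate the projected memory, then add the input."""
        eps_t = W_x @ x_t + b                 # embedded input (assumed target direction)
        R_t = rotation_matrix(h_prev, eps_t)  # norm-preserving (orthogonal) update
        return np.maximum(R_t @ (W_h @ h_prev) + eps_t, 0.0)   # ReLU nonlinearity

    # Toy usage: a 4-dimensional hidden state driven by 3-dimensional inputs.
    rng = np.random.default_rng(0)
    d_in, d_h = 3, 4
    W_x = rng.normal(size=(d_h, d_in))
    W_h = rng.normal(size=(d_h, d_h)) * 0.1
    b = rng.normal(size=d_h) * 0.1
    h = rng.normal(size=d_h)
    for _ in range(5):
        h = rum_like_step(rng.normal(size=d_in), h, W_x, W_h, b)
    print(h)

Because the rotation matrix is orthogonal, it preserves the norm of the rotated memory; this norm-preserving behavior is the property the abstract associates with unitary learning.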


Other Information

Published in: Transactions of the Association for Computational Linguistics
License: https://creativecommons.org/licenses/by/4.0/
See article on publisher's website: https://dx.doi.org/10.1162/tacl_a_00258

Funding

Open Access funding provided by the Qatar National Library.

Language

  • English

Publisher

MIT Press

Publication Year

  • 2019

License statement

This item is licensed under the Creative Commons Attribution 4.0 International License.

Institution affiliated with

  • Hamad Bin Khalifa University
  • Qatar Computing Research Institute - HBKU