Associative Recurrent Memory Transformer

Abstract

This paper addresses the challenge of creating a neural architecture for very long sequences that requires constant time for processing new information at each time step. Our approach, the Associative Recurrent Memory Transformer (ARMT), is based on transformer self-attention for local context and segment-level recurrence for storage of task-specific information distributed over a long context. We demonstrate that ARMT outperforms existing alternatives on associative retrieval tasks and sets a new performance record on the recent BABILong multi-task long-context benchmark, answering single-fact questions over 50 million tokens with an accuracy of 79.9%. The source code for training and evaluation is available on GitHub.
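The segment-level recurrence described in the abstract can be illustrated with a small PyTorch sketch. Everything below (the SegmentRecurrentWrapper module, the number of memory tokens, the gated memory update) is a hypothetical simplification for illustration, not the authors' implementation: a fixed set of memory tokens is prepended to each segment, the segment is processed with standard self-attention, and the updated memory is carried to the next segment.

import torch
import torch.nn as nn

class SegmentRecurrentWrapper(nn.Module):
    """Sketch: transformer over each segment plus a recurrent memory state."""
    def __init__(self, d_model=256, n_mem_tokens=16, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Learnable initial memory tokens prepended to every segment.
        self.init_memory = nn.Parameter(torch.randn(1, n_mem_tokens, d_model))
        # Gated blend of the new memory readout with the previous state
        # (stand-in for an associative memory update).
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, segments):
        """segments: list of (batch, seg_len, d_model) pre-embedded tensors."""
        memory = self.init_memory.expand(segments[0].size(0), -1, -1)
        outputs = []
        for seg in segments:
            n_mem = memory.size(1)
            # Self-attention over [memory tokens | segment tokens]: local
            # attention inside the segment plus read/write to memory.
            hidden = self.encoder(torch.cat([memory, seg], dim=1))
            new_mem, seg_out = hidden[:, :n_mem], hidden[:, n_mem:]
            # Carry the gated memory update to the next segment.
            g = torch.sigmoid(self.gate(torch.cat([memory, new_mem], dim=-1)))
            memory = g * new_mem + (1 - g) * memory
            outputs.append(seg_out)
        return torch.cat(outputs, dim=1), memory

# Example: split a long (pre-embedded) sequence into fixed-size segments.
model = SegmentRecurrentWrapper()
long_seq = torch.randn(2, 8 * 128, 256)
out, final_memory = model(list(long_seq.split(128, dim=1)))

Because the memory has a fixed size, processing each new segment costs the same regardless of how many segments preceded it, which corresponds to the constant-time-per-step property the abstract claims.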

@article{rodkin2025_2407.04841,
  title={Associative Recurrent Memory Transformer},
  author={Ivan Rodkin and Yuri Kuratov and Aydar Bulatov and Mikhail Burtsev},
  journal={arXiv preprint arXiv:2407.04841},
  year={2025}
}