
Resona: Improving Context Copying in Linear Recurrence Models with Retrieval

Abstract

Recent shifts in large language model (LLM) research have shown an increasing focus on novel architectures that compete with the prototypical Transformer-based models that have long dominated the field. Linear recurrent models have proven to be viable competitors due to their computational efficiency. However, such models still show a sizable gap relative to Transformers on in-context learning and other tasks that require recalling information from a context. In this work, we introduce __Resona__, a simple and scalable framework for augmenting linear recurrent models with retrieval. __Resona__ equips models with the ability to integrate information retrieved from the provided input context, enabling behavior tailored to diverse task requirements. Experiments on a variety of linear recurrent models demonstrate that __Resona__-augmented models achieve significant performance gains on both synthetic and real-world natural language tasks, highlighting its ability to act as a general-purpose method for improving the in-context learning and language modeling abilities of linear recurrent LLMs.
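To make the idea concrete, below is a minimal sketch of what "augmenting a linear recurrent layer with retrieval over its own input context" could look like. This is not the paper's implementation; the module, parameter names, and the chunk-based retrieval scheme are hypothetical illustrations assumed here only to show how a recurrent path and a retrieved path might be fused.

```python
# Hypothetical sketch (not the Resona implementation): a diagonal linear recurrent
# layer whose output is fused with information retrieved from the same input context.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RetrievalAugmentedLinearRNN(nn.Module):
    def __init__(self, d_model: int, n_chunks: int = 8):
        super().__init__()
        self.n_chunks = n_chunks
        # Diagonal linear recurrence: h_t = a * h_{t-1} + u_t (element-wise decay a).
        self.log_a = nn.Parameter(torch.zeros(d_model))
        self.in_proj = nn.Linear(d_model, d_model)
        # Query/key projections used to score chunks of the context for retrieval.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        B, T, D = x.shape
        a = torch.sigmoid(self.log_a)                  # keep decay in (0, 1)
        u = self.in_proj(x)
        h = torch.zeros(B, D, device=x.device, dtype=x.dtype)
        states = []
        for t in range(T):                             # naive scan; real models use a parallel scan
            h = a * h + u[:, t]
            states.append(h)
        recur = torch.stack(states, dim=1)             # (B, T, D) recurrent path

        # Retrieval path: split the context into chunks, summarize each chunk,
        # and let every position read back a soft mixture of chunk summaries.
        chunk = max(1, T // self.n_chunks)
        keys = self.k_proj(x)
        pad = (-T) % chunk                             # pad so T divides evenly into chunks
        if pad:
            keys = F.pad(keys, (0, 0, 0, pad))
        summaries = keys.view(B, -1, chunk, D).mean(dim=2)   # (B, num_chunks, D)
        q = self.q_proj(recur)                               # queries come from recurrent states
        scores = torch.einsum("btd,bcd->btc", q, summaries) / D ** 0.5
        retrieved = torch.einsum("btc,bcd->btd", scores.softmax(dim=-1), summaries)

        # Fuse the recurrent and retrieved paths.
        return self.fuse(torch.cat([recur, retrieved], dim=-1))
```

The point of the sketch is the separation of concerns: the recurrence compresses the sequence into a fixed-size state, while the retrieval path lets the model look back at specific regions of the context, which is exactly the capability (context copying and recall) that the abstract identifies as the gap between linear recurrent models and Transformers.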

@article{wang2025_2503.22913,
  title={Resona: Improving Context Copying in Linear Recurrence Models with Retrieval},
  author={Xinyu Wang and Linrui Ma and Jerry Huang and Peng Lu and Prasanna Parthasarathi and Xiao-Wen Chang and Boxing Chen and Yufei Cui},
  journal={arXiv preprint arXiv:2503.22913},
  year={2025}
}