OWL: Probing Cross-Lingual Recall of Memorized Texts via World Literature

28 May 2025
Alisha Srivastava
Emir Korukluoglu
Minh Nhat Le
Duyen Tran
Chau Minh Pham
Marzena Karpinska
Mohit Iyyer
Abstract

Large language models (LLMs) are known to memorize and recall English text from their pretraining data. However, the extent to which this ability generalizes to non-English languages or transfers across languages remains unclear. This paper investigates multilingual and cross-lingual memorization in LLMs, probing whether content memorized in one language (e.g., English) can be recalled when presented in translation. To do so, we introduce OWL, a dataset of 31.5K aligned excerpts from 20 books in ten languages, including English originals, official translations (Vietnamese, Spanish, Turkish), and new translations in six low-resource languages (Sesotho, Yoruba, Maithili, Malagasy, Setswana, Tahitian). We evaluate memorization across model families and sizes through three tasks: (1) direct probing, which asks the model to identify a book's title and author; (2) name cloze, which requires predicting masked character names; and (3) prefix probing, which involves generating continuations. We find that LLMs consistently recall content across languages, even for texts without direct translations in pretraining data. GPT-4o, for example, identifies authors and titles 69% of the time and masked entities 6% of the time in newly translated excerpts. Perturbations (e.g., masking characters, shuffling words) modestly reduce direct probing accuracy (a 7% drop for shuffled official translations). Our results highlight the extent of cross-lingual memorization and offer insight into the differences between models.
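The name-cloze task described above can be sketched as a simple prompt-construction and scoring step. The function names, the example excerpt, and the exact-match scoring rule below are illustrative assumptions, not the authors' released code:

```python
# Minimal sketch of a name-cloze probe: mask a character name in an
# excerpt, then compare a model's fill-in against the gold name.

def make_cloze_prompt(excerpt: str, name: str, mask: str = "[MASK]") -> str:
    """Replace every occurrence of one character name with a mask token."""
    return excerpt.replace(name, mask)


def score_prediction(predicted: str, gold: str) -> bool:
    """Exact-match scoring after light normalization (whitespace, case)."""
    return predicted.strip().lower() == gold.strip().lower()


excerpt = "Elizabeth walked to Pemberley at dawn."
prompt = make_cloze_prompt(excerpt, "Elizabeth")
# prompt == "[MASK] walked to Pemberley at dawn."
# A model's completion for the mask would then be checked with
# score_prediction(model_output, "Elizabeth").
```

In the translated settings, the same masking would be applied to the aligned excerpt in each target language, so accuracy can be compared across languages for the same underlying passage.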

@article{srivastava2025_2505.22945,
  title={OWL: Probing Cross-Lingual Recall of Memorized Texts via World Literature},
  author={Alisha Srivastava and Emir Korukluoglu and Minh Nhat Le and Duyen Tran and Chau Minh Pham and Marzena Karpinska and Mohit Iyyer},
  journal={arXiv preprint arXiv:2505.22945},
  year={2025}
}