On the Acquisition of Shared Grammatical Representations in Bilingual Language Models

5 March 2025
Catherine Arnett
Tyler A. Chang
James A. Michaelov
Benjamin K. Bergen
ArXiv | PDF | HTML
Abstract

While crosslingual transfer is crucial to contemporary language models' multilingual capabilities, how it occurs is not well understood. In this paper, we ask what happens to a monolingual language model when it begins to be trained on a second language. Specifically, we train small bilingual models for which we control the amount of data for each language and the order of language exposure. To find evidence of shared multilingual representations, we turn to structural priming, a method used to study grammatical representations in humans. We first replicate previous crosslingual structural priming results and find that after controlling for training data quantity and language exposure, there are asymmetrical effects across language pairs and directions. We argue that this asymmetry may shape hypotheses about human structural priming effects. We also find that structural priming effects are less robust for less similar language pairs, highlighting potential limitations of crosslingual transfer learning and shared representations for typologically diverse languages.
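The priming measure itself is straightforward to operationalize for a language model: score the same target sentence after a structurally congruent prime and after an incongruent one, and take the difference in log-probability. The sketch below is illustrative rather than the authors' actual pipeline; the model name ("gpt2") and the sentences are placeholders, since the paper trains its own small bilingual models, and in the crosslingual setting the prime and target would appear in different languages.

# Minimal sketch of a structural priming measurement for a causal LM.
# Illustrative only, not the paper's pipeline: the model name and the
# sentences below are placeholders, and the crosslingual version would
# put the prime and the target in different languages.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; the paper trains its own small bilingual models
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

def target_logprob(prime: str, target: str) -> float:
    """Total log-probability of the target's tokens, conditioned on the prime."""
    prime_ids = tokenizer(prime, return_tensors="pt").input_ids
    full_ids = tokenizer(prime + " " + target, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits          # (1, seq_len, vocab_size)
    log_probs = torch.log_softmax(logits, dim=-1)
    # Assumes the prime's tokenization is a prefix of the full tokenization,
    # which holds for space-separated sentences under BPE tokenizers.
    total = 0.0
    for i in range(prime_ids.shape[1], full_ids.shape[1]):
        total += log_probs[0, i - 1, full_ids[0, i]].item()
    return total

# Double-object (DO) prime vs. prepositional-object (PO) prime,
# scored against the same DO target sentence.
congruent = target_logprob("The girl gave the boy a book.",
                           "The chef handed the waiter a plate.")
incongruent = target_logprob("The girl gave a book to the boy.",
                             "The chef handed the waiter a plate.")
print(f"Priming effect (log-prob difference): {congruent - incongruent:.3f}")

A positive difference means the congruent (double-object) prime raised the probability of the double-object target. An adaptation effect of this kind, computed within and across languages, is the evidence for shared grammatical representations that the paper builds on.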

View on arXiv: https://arxiv.org/abs/2503.03962
@article{arnett2025_2503.03962,
  title={On the Acquisition of Shared Grammatical Representations in Bilingual Language Models},
  author={Catherine Arnett and Tyler A. Chang and James A. Michaelov and Benjamin K. Bergen},
  journal={arXiv preprint arXiv:2503.03962},
  year={2025}
}