Syntactic Learnability of Echo State Neural Language Models at Scale

3 March 2025
Ryo Ueda, Tatsuki Kuribayashi, Shunsuke Kando, Kentaro Inui
Abstract

What is a neural model with minimal architectural complexity that exhibits reasonable language learning capability? To explore such a simple yet sufficient neural language model, we revisit a basic reservoir computing (RC) model, the Echo State Network (ESN), a restricted class of simple recurrent neural networks. Our experiments show that an ESN with a large hidden state is comparable or superior to a Transformer on grammaticality judgment tasks when trained on about 100M words, suggesting that an architecture as complex as the Transformer's may not always be necessary for syntactic learning.
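The core idea of an Echo State Network, for readers unfamiliar with reservoir computing, is that the input and recurrent weights are fixed at random initialization and only a linear readout over the hidden state is trained. The sketch below illustrates this in NumPy; all names, dimensions, and the softmax-style readout are illustrative assumptions rather than details taken from the paper, which scales the hidden state far larger than shown here.

import numpy as np

# Minimal Echo State Network language-model sketch (illustrative assumptions only).
# The input (W_in) and recurrent (W) weights are random and frozen; only the
# readout (W_out) would be trained, e.g. by regression or gradient descent
# on next-token prediction.
rng = np.random.default_rng(0)
vocab_size, emb_dim, hidden_dim = 1000, 64, 512   # toy sizes; the paper uses much larger hidden states

E = rng.normal(0.0, 0.1, (vocab_size, emb_dim))      # fixed token embeddings
W_in = rng.normal(0.0, 0.1, (hidden_dim, emb_dim))   # fixed input weights
W = rng.normal(0.0, 1.0, (hidden_dim, hidden_dim))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # spectral radius < 1 (echo state property)
W_out = np.zeros((vocab_size, hidden_dim))           # the only trainable parameters

def run_reservoir(token_ids):
    """Return the reservoir hidden state after each input token."""
    h = np.zeros(hidden_dim)
    states = []
    for t in token_ids:
        h = np.tanh(W_in @ E[t] + W @ h)  # fixed, untrained recurrence
        states.append(h)
    return np.stack(states)

def next_token_logits(h):
    return W_out @ h  # linear readout; a softmax over these gives P(next token)

# Example: run a toy token sequence and score the next token.
states = run_reservoir([3, 17, 256])
logits = next_token_logits(states[-1])

Because only W_out is trained, the learning cost is concentrated in the readout; the paper's question is whether this minimal recurrent architecture, scaled to a large hidden state, can learn syntax as well as a Transformer.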

@article{ueda2025_2503.01724,
  title={Syntactic Learnability of Echo State Neural Language Models at Scale},
  author={Ryo Ueda and Tatsuki Kuribayashi and Shunsuke Kando and Kentaro Inui},
  journal={arXiv preprint arXiv:2503.01724},
  year={2025}
}