The advantages of context specific language models: the case of the Erasmian Language Model

13 August 2024
João Gonçalves
Nick Jelicic
Michele Murgia
Evert Stamhuis
Abstract

The current trend in improving language model performance is to scale up the number of parameters (e.g. the state-of-the-art GPT-4 model has approximately 1.7 trillion parameters) or the amount of training data fed into the model. However, this comes at significant computational and energy costs that compromise the sustainability of AI solutions, as well as risks relating to privacy and misuse. In this paper we present the Erasmian Language Model (ELM), a small, context-specific, 900-million-parameter model, pre-trained and fine-tuned by and for Erasmus University Rotterdam. We show that the model performs adequately in a classroom context for essay writing, and that it achieves superior performance on subjects that are part of its context. This has implications for a wide range of institutions and organizations, showing that context-specific language models may be a viable alternative for resource-constrained, privacy-sensitive use cases.
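To get a feel for the sustainability argument, one can compare rough training-compute budgets with the common C ≈ 6·N·D approximation (total training FLOPs ≈ 6 × parameters × training tokens). This is a sketch, not a figure from the paper: the token counts below are illustrative assumptions, and only the parameter counts (0.9B for ELM, ~1.7T for GPT-4) come from the abstract.

```python
def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs for a dense transformer
    using the C ~= 6 * N * D rule of thumb."""
    return 6.0 * params * tokens

# Parameter counts from the abstract; token counts are assumptions.
elm_flops = training_flops(params=0.9e9, tokens=20e9)     # assume 20B tokens
gpt4_flops = training_flops(params=1.7e12, tokens=13e12)  # assume 13T tokens

print(f"ELM   : {elm_flops:.2e} FLOPs")
print(f"GPT-4 : {gpt4_flops:.2e} FLOPs")
print(f"ratio : {gpt4_flops / elm_flops:.0f}x")
```

Even under generous assumptions for the small model, the estimated training compute differs by roughly six orders of magnitude, which is the gap the paper's "resource-constrained" framing points at.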

@article{goncalves2025_2408.06931,
  title={The advantages of context specific language models: the case of the Erasmian Language Model},
  author={João Gonçalves and Nick Jelicic and Michele Murgia and Evert Stamhuis},
  journal={arXiv preprint arXiv:2408.06931},
  year={2025}
}