Capacity Matters: a Proof-of-Concept for Transformer Memorization on Real-World Data

17 June 2025
Anton Changalidis, Aki Härmä
arXiv (abs) · PDF · HTML
Main: 8 pages · 7 figures · 4 tables · Bibliography: 2 pages · Appendix: 2 pages
Abstract

This paper studies how model architecture and data configuration influence the empirical memorization capacity of generative transformers. The models are trained on synthetic text datasets derived from the Systematized Nomenclature of Medicine (SNOMED) knowledge graph: triplets, representing static connections, and sequences, simulating complex relation patterns. The results show that embedding size is the primary determinant of learning speed and capacity, while additional layers provide limited benefit and may hinder performance on simpler datasets. Activation functions play a crucial role: Softmax demonstrates greater stability and higher capacity. Furthermore, increasing dataset complexity appears to improve final memorization. These insights deepen our understanding of transformer memory mechanisms and provide a framework for optimizing model design with structured, real-world data.
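The abstract summarizes the memorization study without spelling out the procedure, so the following is a minimal, hypothetical PyTorch sketch of what such a measurement could look like. It is not the authors' code: the toy entities, relations, model sizes, and the exact-reproduction criterion are all illustrative assumptions, with random triplets standing in for the SNOMED-derived data.

```python
# Hypothetical sketch (not the paper's implementation): estimate how many fixed
# "subject relation object" triplets a tiny causal transformer can memorize,
# as a function of embedding size.
import random
import torch
import torch.nn as nn

random.seed(0)
torch.manual_seed(0)

# Toy "knowledge graph" triplets standing in for the SNOMED-derived data.
entities = [f"ent{i}" for i in range(200)]
relations = ["is_a", "part_of", "finding_site"]
triplets = [
    (random.choice(entities), random.choice(relations), random.choice(entities))
    for _ in range(300)
]

# One token per entity/relation symbol, plus BOS/EOS markers.
vocab = {tok: i for i, tok in enumerate(["<bos>", "<eos>"] + entities + relations)}

def encode(t):
    return torch.tensor(
        [vocab["<bos>"], vocab[t[0]], vocab[t[1]], vocab[t[2]], vocab["<eos>"]]
    )

data = torch.stack([encode(t) for t in triplets])  # shape: (num_triplets, 5)

class TinyDecoder(nn.Module):
    """Minimal causal transformer used only to probe memorization."""

    def __init__(self, vocab_size, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, 8, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, x):
        T = x.size(1)
        h = self.emb(x) + self.pos[:, :T]
        causal = nn.Transformer.generate_square_subsequent_mask(T)
        return self.head(self.blocks(h, mask=causal))

def memorized_fraction(model, seqs):
    # A triplet counts as memorized when every next-token prediction is exact.
    with torch.no_grad():
        pred = model(seqs[:, :-1]).argmax(-1)
        return (pred == seqs[:, 1:]).all(dim=1).float().mean().item()

model = TinyDecoder(len(vocab), d_model=64)   # vary d_model to probe capacity
opt = torch.optim.Adam(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

for step in range(1000):
    logits = model(data[:, :-1])
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), data[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

model.eval()
print(f"memorized: {memorized_fraction(model, data):.1%}")
```

Sweeping `d_model` (and, separately, the number of layers) in a setup like this is the kind of comparison the abstract describes, with the memorized fraction serving as the empirical capacity proxy.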

@article{changalidis2025_2506.14704,
  title={Capacity Matters: a Proof-of-Concept for Transformer Memorization on Real-World Data},
  author={Anton Changalidis and Aki Härmä},
  journal={arXiv preprint arXiv:2506.14704},
  year={2025}
}