AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource languages

25 February 2025
Joshua Sakthivel Raju
Shri Kiran Srinivasan
Jaskaran Singh Walia
Srinivas Raghav
Vukosi Marivate
Abstract

Language model compression through knowledge distillation has emerged as a promising approach for deploying large language models in resource-constrained environments. However, existing methods often struggle to maintain performance when distilling multilingual models, especially for low-resource languages. In this paper, we present a novel hybrid distillation approach that combines traditional knowledge distillation with a simplified attention matching mechanism, specifically designed for multilingual contexts. Our method introduces an extremely compact student model architecture, significantly smaller than conventional multilingual models. We evaluate our approach on five African languages: Kinyarwanda, Swahili, Hausa, Igbo, and Yoruba. The distilled student model, AfroXLMR-Comet, successfully captures both the output distribution and internal attention patterns of a larger teacher model (AfroXLMR-Large) while reducing the model size by over 85%. Experimental results demonstrate that our hybrid approach achieves competitive performance compared to the teacher model, maintaining accuracy within 85% of the original model's performance while requiring substantially fewer computational resources. Our work provides a practical framework for deploying efficient multilingual models in resource-constrained environments, particularly benefiting applications involving African languages.
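The abstract describes a hybrid objective that pairs soft-label knowledge distillation with attention matching between teacher and student. The sketch below is a minimal illustration of that general idea in PyTorch; the temperature, loss weights, and the assumption that teacher and student attention maps have already been aligned to the same shape are illustrative choices, not the configuration reported in the paper.

```python
# Minimal sketch of a hybrid distillation loss: soft-label KD on
# temperature-scaled logits plus MSE attention matching.
# Hyperparameters and the layer/head alignment are assumptions for
# illustration, not the paper's reported setup.
import torch
import torch.nn.functional as F

def hybrid_distillation_loss(student_logits, teacher_logits,
                             student_attn, teacher_attn,
                             temperature=2.0, alpha=0.5, beta=0.5):
    """Weighted sum of a KD term on output distributions and an
    attention-matching term on (pre-aligned) attention maps."""
    # Soft-label KD: KL divergence between temperature-softened
    # teacher and student output distributions.
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    # Attention matching: mean-squared error between teacher and
    # student attention maps, assumed to share the same shape after
    # mapping student layers/heads to teacher layers/heads.
    attn = F.mse_loss(student_attn, teacher_attn)

    return alpha * kd + beta * attn
```

In practice such a loss would be computed per batch from the teacher's (frozen) and student's forward passes with attention outputs enabled, with the two terms weighted against any task-specific loss.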

@article{raju2025_2502.18020,
  title={AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource languages},
  author={Joshua Sakthivel Raju and Sanjay S and Jaskaran Singh Walia and Srinivas Raghav and Vukosi Marivate},
  journal={arXiv preprint arXiv:2502.18020},
  year={2025}
}