
Cross-Linguistic Transfer in Multilingual NLP: The Role of Language Families and Morphology

Main: 10 pages, 7 figures
Abstract

Cross-lingual transfer has become a crucial aspect of multilingual NLP, as it allows models trained on resource-rich languages to be applied more effectively to low-resource languages. Recently, massively multilingual pre-trained language models (e.g., mBERT, XLM-R) have demonstrated strong zero-shot transfer capabilities [14][13]. This paper investigates cross-linguistic transfer through the lens of language families and morphology, examining how language family proximity and morphological similarity affect performance across NLP tasks. We further discuss our results and how they relate to findings from recent literature. Overall, we compare multilingual model performance and review how linguistic distance metrics correlate with transfer outcomes. We also look into emerging approaches that integrate typological and morphological information into model pre-training to improve transfer to diverse languages [18][19].
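To make the correlational analysis mentioned above concrete, below is a minimal sketch (not the authors' code) of how one might test whether transfer performance degrades with linguistic distance. The distance values and zero-shot scores are illustrative placeholders, not results from the paper; the rank correlation is computed with scipy.stats.spearmanr.

# Sketch: correlate a linguistic distance metric with zero-shot transfer scores.
# All numbers below are hypothetical placeholders for illustration only.
from scipy.stats import spearmanr

# Hypothetical typological distances from English (e.g., a URIEL/lang2vec-style
# "featural" distance) for a handful of target languages (ISO 639-3 codes).
distance_from_english = {
    "deu": 0.35, "fra": 0.40, "rus": 0.55, "tur": 0.70, "fin": 0.68,
}

# Hypothetical zero-shot task scores for a model fine-tuned only on English.
zero_shot_score = {
    "deu": 78.2, "fra": 76.9, "rus": 70.1, "tur": 62.4, "fin": 64.0,
}

langs = sorted(distance_from_english)
dists = [distance_from_english[lang] for lang in langs]
scores = [zero_shot_score[lang] for lang in langs]

# A strongly negative rank correlation would indicate that transfer quality
# drops as typological distance from the source language grows.
rho, p_value = spearmanr(dists, scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")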

@article{bankula2025_2505.13908,
  title={Cross-Linguistic Transfer in Multilingual NLP: The Role of Language Families and Morphology},
  author={Ajitesh Bankula and Praney Bankula},
  journal={arXiv preprint arXiv:2505.13908},
  year={2025}
}