Editing Across Languages: A Survey of Multilingual Knowledge Editing

While Knowledge Editing has been extensively studied in monolingual settings, it remains underexplored in multilingual contexts. This survey systematizes recent research on Multilingual Knowledge Editing (MKE), a growing subdomain of model editing focused on ensuring that factual edits generalize reliably across languages. We present a comprehensive taxonomy of MKE methods, covering parameter-based, memory-based, fine-tuning, and hypernetwork approaches. We survey available benchmarks, summarize key findings on method effectiveness and transfer patterns, identify challenges in cross-lingual propagation, and highlight open problems related to language anisotropy, evaluation coverage, and edit scalability. Our analysis consolidates a rapidly evolving area and lays the groundwork for future progress in editable, language-aware LLMs.
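To make the notion of an edit "generalizing across languages" concrete, the following toy Python sketch shows how cross-lingual transfer of a single factual edit is typically checked: the same fact is probed in several languages and counted as transferred if the edited model returns the new object. All names here (the edit record, probe sentences, the stand-in answer table, and the transfer metric) are illustrative assumptions, not an API from the survey or from any specific MKE method or benchmark.

```python
# Toy sketch: checking whether one factual edit propagates across languages.
# Everything below is illustrative; real MKE evaluations query an edited LLM
# rather than a hard-coded answer table.

# The edit, specified once in the source language.
edit = {"subject": "Eiffel Tower", "relation": "located_in", "new_object": "Rome"}

# Parallel probes and the expected post-edit answer in each target language.
probes = {
    "en": ("Where is the Eiffel Tower located?", "Rome"),
    "fr": ("Où se trouve la tour Eiffel ?", "Rome"),
    "de": ("Wo befindet sich der Eiffelturm?", "Rom"),
}

# Stand-in for the edited model's answers; here German illustrates a failed transfer.
edited_model_answers = {
    "en": "Rome",
    "fr": "Rome",
    "de": "Paris",
}

def transfer_success(probes, answers):
    """An edit transfers to a language if the model's answer in that language
    matches the expected post-edit object."""
    return {
        lang: answers.get(lang, "").strip().lower() == expected.lower()
        for lang, (question, expected) in probes.items()
    }

if __name__ == "__main__":
    results = transfer_success(probes, edited_model_answers)
    rate = sum(results.values()) / len(results)
    print(results)                                  # {'en': True, 'fr': True, 'de': False}
    print(f"cross-lingual transfer rate: {rate:.2f}")  # 0.67
```

In practice, the surveyed benchmarks aggregate this kind of per-language success over many edits and languages, which is where the transfer patterns and coverage gaps discussed above become visible.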
@article{durrani2025_2505.14393,
  title   = {Editing Across Languages: A Survey of Multilingual Knowledge Editing},
  author  = {Nadir Durrani and Basel Mousi and Fahim Dalvi},
  journal = {arXiv preprint arXiv:2505.14393},
  year    = {2025}
}