
XToM: Exploring the Multilingual Theory of Mind for Large Language Models

Main: 9 pages · Bibliography: 8 pages · Appendix: 19 pages · 22 figures · 20 tables
Abstract

Theory of Mind (ToM), the ability to infer the mental states of others, is pivotal for human social cognition. Existing evaluations of ToM in LLMs are largely limited to English, neglecting the linguistic diversity that shapes human cognition. This limitation raises a critical question: can LLMs exhibit Multilingual Theory of Mind, i.e., the capacity to reason about mental states across diverse linguistic contexts? To address this gap, we present XToM, a rigorously validated multilingual benchmark that evaluates ToM across five languages and incorporates diverse, contextually rich task scenarios. Using XToM, we systematically evaluate LLMs (e.g., DeepSeek R1), revealing a pronounced dissonance: while models excel in multilingual language understanding, their ToM performance varies across languages. Our findings expose limitations in LLMs' ability to replicate human-like mentalizing across linguistic contexts.

@article{chan2025_2506.02461,
  title={XToM: Exploring the Multilingual Theory of Mind for Large Language Models},
  author={Chunkit Chan and Yauwai Yim and Hongchuan Zeng and Zhiying Zou and Xinyuan Cheng and Zhifan Sun and Zheye Deng and Kawai Chung and Yuzhuo Ao and Yixiang Fan and Cheng Jiayang and Ercong Nie and Ginny Y. Wong and Helmut Schmid and Hinrich Schütze and Simon See and Yangqiu Song},
  journal={arXiv preprint arXiv:2506.02461},
  year={2025}
}