Comprehensive Evaluation on Lexical Normalization: Boundary-Aware Approaches for Unsegmented Languages

Main: 11 pages · 6 figures · 19 tables · Bibliography: 2 pages · Appendix: 10 pages
Abstract
Lexical normalization research has sought to tackle the challenge of processing informal expressions in user-generated text, yet the lack of comprehensive evaluations leaves it unclear which methods perform best across multiple perspectives. Focusing on unsegmented languages, we make three key contributions: (1) creating a large-scale, multi-domain Japanese normalization dataset, (2) developing normalization methods based on state-of-the-art pretrained models, and (3) conducting experiments across multiple evaluation perspectives. Our experiments show that both encoder-only and decoder-only approaches achieve promising results in terms of both accuracy and efficiency.
@article{higashiyama2025_2505.22273,
  title   = {Comprehensive Evaluation on Lexical Normalization: Boundary-Aware Approaches for Unsegmented Languages},
  author  = {Shohei Higashiyama and Masao Utiyama},
  journal = {arXiv preprint arXiv:2505.22273},
  year    = {2025}
}