
Relaxed syntax modeling in Transformers for future-proof license plate recognition

Main: 15 pages
Bibliography: 4 pages
4 figures
5 tables
Abstract

Effective license plate recognition systems must be resilient to constant change, as new license plates are released into traffic daily. While Transformer-based networks excel at recognition when first deployed, we observe a significant performance drop over time, which makes them unsuitable for demanding production environments. Indeed, such systems obtain state-of-the-art results on plates whose syntax is seen during training. Yet we show that they perform no better than random guessing on future plates, where legible characters are misrecognized due to a shift in syntax. After highlighting the flows of positional and contextual information in Transformer encoder-decoders, we identify several causes of their over-reliance on past syntax. We then devise architectural cut-offs and replacements, which we integrate into SaLT, an attempt at a Syntax-Less Transformer for syntax-agnostic modeling of license plate representations. Experiments on both real and synthetic datasets show that our approach reaches top accuracy on past syntax and, most importantly, nearly maintains performance on future license plates. We further demonstrate the robustness of our architectural enhancements through various ablation studies.
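
The abstract does not spell out SaLT's exact cut-offs and replacements, but the general idea of syntax-agnostic character decoding can be illustrated with a minimal PyTorch sketch: each output position is a learned position query that cross-attends to the visual features only, and self-attention between character slots is masked out so no inter-character (syntactic) context can be learned. All module names, sizes, and the diagonal-only mask below are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only, not SaLT itself: one generic way to make
# character decoding syntax-agnostic in a Transformer decoder.
import torch
import torch.nn as nn

class SyntaxAgnosticDecoder(nn.Module):
    def __init__(self, d_model=256, nhead=8, num_layers=2,
                 vocab_size=37, max_chars=10):
        super().__init__()
        # Learned position queries replace character embeddings: the decoder
        # input encodes only "which slot am I", never "which character came before".
        self.pos_queries = nn.Parameter(torch.randn(max_chars, d_model))
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers)
        self.classifier = nn.Linear(d_model, vocab_size)

    def forward(self, visual_features):
        # visual_features: (batch, num_patches, d_model) from any image encoder.
        batch = visual_features.size(0)
        queries = self.pos_queries.unsqueeze(0).expand(batch, -1, -1)
        # Block self-attention among position queries so no inter-character
        # context is modeled; only cross-attention to the image remains.
        n = queries.size(1)
        self_mask = torch.full((n, n), float("-inf"), device=queries.device)
        self_mask.fill_diagonal_(0.0)
        decoded = self.decoder(queries, visual_features, tgt_mask=self_mask)
        return self.classifier(decoded)  # (batch, max_chars, vocab_size)

if __name__ == "__main__":
    model = SyntaxAgnosticDecoder()
    feats = torch.randn(2, 64, 256)   # dummy encoder output
    print(model(feats).shape)         # torch.Size([2, 10, 37])

Because each slot is decoded independently of the others, a model built this way cannot memorize which character combinations were frequent in past plates, which is the failure mode the abstract attributes to standard encoder-decoders.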

@article{meyer2025_2506.17051,
  title={Relaxed syntax modeling in Transformers for future-proof license plate recognition},
  author={Florent Meyer and Laurent Guichard and Denis Coquenet and Guillaume Gravier and Yann Soullard and Bertrand Coüasnon},
  journal={arXiv preprint arXiv:2506.17051},
  year={2025}
}