
RTHDet: Rotate Table Area and Head Detection in images

Abstract

Traditional models focus on horizontal table detection but struggle in rotated contexts, limiting progress in table recognition. This paper introduces a new task: detecting table regions and localizing their head and tail parts in rotation scenarios. We propose corresponding datasets, evaluation metrics, and methods. Our novel method, 'Adaptively Bounded Rotation,' addresses dataset scarcity in detecting rotated tables and their head-tail parts. We produced 'TRR360D,' a dataset incorporating the semantic information of table head and tail, based on 'ICDAR2019MTD.' A new metric, 'R360 AP,' measures precision in detecting rotated regions and localizing head-tail parts. Our baseline, the fast and accurate 'RTMDet-S,' was chosen after extensive review and testing. We introduce 'RTHDet,' which enhances the baseline with an 'r360' rotated-rectangle angle representation and an 'Angle Loss' branch, improving head-tail localization. By applying transfer learning and adaptive boundary rotation augmentation, RTHDet's AP50 (T<90) improved from 23.7% to 88.7% over the baseline, demonstrating its effectiveness in detecting rotated table regions and accurately localizing their head and tail parts. RTHDet is integrated into the widely used open-source MMRotate toolkit: https://github.com/open-mmlab/mmrotate/tree/dev-1.x/projects/RR360.
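The abstract describes the 'r360' angle representation and 'Angle Loss' branch only at a high level. Below is a minimal sketch of the underlying idea, assuming the r360 representation simply keeps the full [0, 2π) angle range so that head-tail orientation is preserved (unlike the usual 90° or 180° rotated-box conventions, which cannot distinguish a box from its head-tail flip). The helper names `normalize_r360` and `angle_error` are hypothetical illustrations, not part of RTHDet or the MMRotate API.

```python
import math


def normalize_r360(angle_rad: float) -> float:
    """Wrap an angle into [0, 2*pi).

    Conventional rotated-box representations restrict the angle to a 90- or
    180-degree range, so a table and its head-tail flip look identical. An
    r360-style representation keeps the full turn, so the angle also encodes
    which short edge is the table head.
    """
    return angle_rad % (2 * math.pi)


def angle_error(pred_rad: float, target_rad: float) -> float:
    """Circular angle error in [0, pi], usable as an angle-loss-style penalty
    between a predicted and a ground-truth heading."""
    diff = normalize_r360(pred_rad) - normalize_r360(target_rad)
    # Wrap the difference into (-pi, pi] before taking the magnitude.
    diff = (diff + math.pi) % (2 * math.pi) - math.pi
    return abs(diff)


if __name__ == "__main__":
    # A ground-truth heading of 350 degrees vs. a prediction of 5 degrees:
    # the circular error is 15 degrees, not 345.
    print(math.degrees(angle_error(math.radians(5), math.radians(350))))
```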
