
Annotating Scientific Uncertainty: A comprehensive model using linguistic patterns and comparison with existing approaches

Abstract

We present UnScientify, a system designed to detect scientific uncertainty in scholarly full text. The system uses a weakly supervised technique to identify verbally expressed uncertainty in scientific texts, along with its authorial references. The core of UnScientify is a multi-faceted pipeline that integrates span pattern matching, complex sentence analysis, and author reference checking. This approach streamlines the labeling and annotation processes essential for identifying scientific uncertainty, covering a variety of uncertainty expression types to support diverse applications, including information retrieval, text mining, and scientific document processing. The evaluation results highlight the trade-offs between modern large language models (LLMs) and UnScientify. Despite employing more traditional techniques, UnScientify achieved superior performance on the scientific uncertainty detection task, attaining an accuracy of 0.808. This finding underscores the continued relevance and efficiency of UnScientify's simple rule-based and pattern-matching strategy for this specific application. The results demonstrate that in scenarios where resource efficiency, interpretability, and domain-specific adaptability are critical, traditional methods can still offer significant advantages.
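To make the pipeline concrete, the sketch below illustrates what rule-based span pattern matching combined with author-reference checking could look like. The cue list, regular expressions, and function name are illustrative assumptions, not the actual patterns or implementation used by UnScientify.

```python
import re

# Hypothetical hedge cues; the real system's pattern inventory is
# far richer and covers multiple uncertainty expression types.
UNCERTAINTY_CUES = [
    r"\bmay\b", r"\bmight\b", r"\bcould\b", r"\bpossibly\b",
    r"\bsuggests?\b", r"\bappears? to\b", r"\blikely\b",
]
CUE_RE = re.compile("|".join(UNCERTAINTY_CUES), re.IGNORECASE)

# Crude first-person markers to approximate author-reference checking.
AUTHOR_RE = re.compile(r"\b(we|our|i)\b", re.IGNORECASE)

def detect_uncertainty(sentence: str) -> dict:
    """Return matched uncertainty cues and whether the author is the source."""
    cues = CUE_RE.findall(sentence)
    return {
        "uncertain": bool(cues),
        "cues": [c.lower() for c in cues],
        "author_reference": bool(AUTHOR_RE.search(sentence)),
    }
```

A sentence such as "We suggest that this effect may arise from noise" would be flagged as author-attributed uncertainty, while a plain factual statement would not match; this kind of transparent, inspectable rule set is what makes pattern-based approaches interpretable and cheap to adapt to a domain.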

@article{ningrum2025_2503.11376,
  title={Annotating Scientific Uncertainty: A comprehensive model using linguistic patterns and comparison with existing approaches},
  author={Panggih Kusuma Ningrum and Philipp Mayr and Nina Smirnova and Iana Atanassova},
  journal={arXiv preprint arXiv:2503.11376},
  year={2025}
}