Computational Fact-Checking of Online Discourse: Scoring scientific accuracy in climate change related news articles

Abstract

Democratic societies need reliable information. Misinformation in popular media such as news articles and videos threatens to impair civic discourse. Unfortunately, citizens are not equipped to verify the flood of content they consume daily at increasing rates. This work aims to semi-automatically quantify the scientific accuracy of online media. By semantifying media of unknown veracity, their statements can be compared against equally processed trusted sources. We implemented a workflow using LLM-based statement extraction and knowledge graph analysis. Our neurosymbolic system demonstrably streamlines state-of-the-art veracity quantification. Evaluated via expert interviews and a user survey, the tool provides a useful veracity indication. This indicator, however, cannot yet annotate public media at the required granularity and scale. Further work towards a FAIR (Findable, Accessible, Interoperable, Reusable) ground truth and complementary metrics is required to scientifically support civic discourse.
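The comparison step described above — matching semantified statements against equally processed trusted sources — can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the triple representation, the toy trusted knowledge graph, and the exact-match scoring rule are all assumptions made for clarity; the paper's system uses LLM-based statement extraction and richer knowledge graph analysis.

```python
# Hypothetical sketch: score an article's extracted statements against a
# trusted knowledge graph of (subject, predicate, object) triples.
# The triples and the simple overlap metric below are illustrative only.

TRUSTED_KG = {
    ("CO2 emissions", "cause", "global warming"),
    ("global mean sea level", "is", "rising"),
}

def veracity_score(statements):
    """Fraction of extracted statements supported by the trusted graph."""
    if not statements:
        return 0.0
    supported = sum(1 for s in statements if s in TRUSTED_KG)
    return supported / len(statements)

# Statements as they might come out of an LLM-based extraction step:
article_statements = [
    ("CO2 emissions", "cause", "global warming"),  # supported
    ("climate change", "is", "a hoax"),            # unsupported
]
print(veracity_score(article_statements))  # 0.5
```

In practice, exact triple matching would be far too brittle; a real system would need entity linking and semantic similarity to align extracted statements with the trusted graph, which is one reason the paper calls for a FAIR ground truth.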

@article{wittenborg2025_2505.07409,
  title={Computational Fact-Checking of Online Discourse: Scoring scientific accuracy in climate change related news articles},
  author={Tim Wittenborg and Constantin Sebastian Tremel and Markus Stocker and Sören Auer},
  journal={arXiv preprint arXiv:2505.07409},
  year={2025}
}