Iterative Auto-Annotation for Scientific Named Entity Recognition Using BERT-Based Models

22 February 2025
Kartik Gupta
Abstract

This paper presents an iterative approach to Scientific Named Entity Recognition (SciNER) using BERT-based models. We leverage transfer learning to fine-tune pretrained models on a small but high-quality set of manually annotated data. The process is refined iteratively: the fine-tuned model auto-annotates a larger dataset, which is then used for additional rounds of fine-tuning. We evaluated two models, dslim/bert-large-NER and bert-large-cased, and found that bert-large-cased consistently outperformed the former. Our approach yielded significant improvements in prediction accuracy and F1 scores, especially for less common entity classes. Future work could include pretraining with unlabeled data, exploring more powerful encoders such as RoBERTa, and expanding the scope of manual annotations. This methodology has broader applications in NLP tasks where access to labeled data is limited.
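The iterative auto-annotation loop the abstract describes can be sketched as follows. This is a minimal, dependency-free illustration of the self-training pattern (train on a labeled seed set, auto-annotate confident predictions from an unlabeled pool, fold them back into training); the `train` and `predict` functions here are toy stand-ins, and the confidence threshold is an assumption — the paper itself fine-tunes BERT-based models (e.g. bert-large-cased) at each round.

```python
from collections import Counter, defaultdict

def train(labeled):
    """Toy 'model': remember the most frequent tag seen for each token.
    (Stand-in for fine-tuning a BERT-based token classifier.)"""
    counts = defaultdict(Counter)
    for token, tag in labeled:
        counts[token][tag] += 1
    return {tok: c.most_common(1)[0][0] for tok, c in counts.items()}

def predict(model, token):
    """Return (tag, confidence); unseen tokens get the 'O' tag, confidence 0."""
    if token in model:
        return model[token], 1.0
    return "O", 0.0

def iterative_auto_annotate(seed, unlabeled, rounds=2, threshold=0.5):
    """Self-training loop: each round, auto-annotate confident examples
    from the pool and add them to the training set for the next round."""
    labeled = list(seed)
    pool = list(unlabeled)
    for _ in range(rounds):
        model = train(labeled)
        still_unlabeled = []
        for token in pool:
            tag, conf = predict(model, token)
            if conf >= threshold:
                labeled.append((token, tag))  # auto-annotated example
            else:
                still_unlabeled.append(token)
        pool = still_unlabeled
    return train(labeled)

# Tiny usage example with hypothetical entity classes:
seed = [("BERT", "MODEL"), ("ImageNet", "DATASET")]
model = iterative_auto_annotate(seed, ["BERT", "GPT"])
```

In the paper's setting, each round replaces the toy tagger with a freshly fine-tuned BERT model, so the auto-annotated labels improve as the training set grows.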

@article{gupta2025_2502.16312,
  title={Iterative Auto-Annotation for Scientific Named Entity Recognition Using BERT-Based Models},
  author={Kartik Gupta},
  journal={arXiv preprint arXiv:2502.16312},
  year={2025}
}