ResearchTrend.AI
arXiv:2108.11809
Fine-Tuning Pretrained Language Models With Label Attention for Biomedical Text Classification

26 August 2021
Bruce Nguyen
Shaoxiong Ji
Abstract

The massive scale and growth of textual biomedical data have made its indexing and classification increasingly important. However, existing research on this topic has mainly utilized convolutional and recurrent neural networks, which generally underperform more recent transformer architectures. Meanwhile, systems that do apply transformers focus only on the target documents, overlooking the rich semantic information that label descriptions contain. To address this gap, we develop a transformer-based biomedical text classifier that considers label information. The system achieves this with a label attention module incorporated into the fine-tuning process of pretrained language models (PTMs). Our results on two public medical datasets show that the proposed fine-tuning scheme outperforms both the vanilla PTMs and state-of-the-art models.
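The label attention idea the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes learned label embeddings standing in for encoded label descriptions, and token representations coming from some pretrained encoder; the class and parameter names are hypothetical.

```python
import torch
import torch.nn as nn

class LabelAttention(nn.Module):
    """Sketch of a label attention head for multi-label classification:
    document token states attend to label embeddings, producing one
    label-aware document vector (and one logit) per class."""

    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        # Learned label embeddings; in the paper's setting these could be
        # initialized from encoded label descriptions (an assumption here).
        self.label_embeddings = nn.Parameter(torch.randn(num_labels, hidden_dim))
        self.classifier = nn.Linear(hidden_dim, 1)

    def forward(self, token_states: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden_dim) from a pretrained encoder.
        # Similarity between every label and every token: (batch, labels, seq).
        scores = torch.einsum("lh,bsh->bls", self.label_embeddings, token_states)
        weights = torch.softmax(scores, dim=-1)
        # Label-specific document representations: (batch, labels, hidden_dim).
        label_docs = torch.einsum("bls,bsh->blh", weights, token_states)
        # One logit per label: (batch, labels).
        return self.classifier(label_docs).squeeze(-1)
```

During fine-tuning, such a module would sit on top of the PTM's token outputs and be trained jointly with the encoder, so each label attends to the tokens most relevant to it rather than sharing a single pooled document vector.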
