
arXiv:2211.02519

BERT for Long Documents: A Case Study of Automated ICD Coding

4 November 2022

Arash Afkanpour, Shabir Adeel, H. Bassani, Arkady Epshteyn, Hongbo Fan, Isaac Jones, Mahan Malihi, Adrian Nauth, Raj Sinha, Sanjana Woonna, S. Zamani, Elli Kanal, M. Fomitchev, Donny Cheung
Abstract

Transformer models have achieved great success across many NLP problems. However, previous studies of automated ICD coding concluded that these models fail to outperform earlier solutions such as CNN-based models. In this paper, we challenge this conclusion. We present a simple and scalable method for processing long text with existing transformer models such as BERT. We show that this method significantly improves on previously reported transformer results in ICD coding and outperforms one of the prominent CNN-based methods.
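The abstract does not spell out the long-document procedure itself. A common way to feed text longer than BERT's input limit to a fixed-length encoder is to split the token sequence into overlapping windows and encode each window separately; the sketch below shows only that chunking step, with illustrative `max_len` and `stride` values (these are assumptions, not taken from the paper).

```python
def chunk_tokens(token_ids, max_len=512, stride=256):
    """Split a long token sequence into overlapping fixed-size windows
    that fit a bounded-length encoder such as BERT.

    max_len and stride are illustrative defaults, not values from the paper.
    """
    chunks = []
    start = 0
    while start < len(token_ids):
        chunks.append(token_ids[start:start + max_len])
        if start + max_len >= len(token_ids):
            break  # final window reaches the end of the document
        start += stride  # overlap consecutive windows by max_len - stride tokens
    return chunks

# Example: a 1200-token document yields four overlapping 512-token windows.
doc = list(range(1200))
chunks = chunk_tokens(doc)
print(len(chunks))  # → 4
```

Each window can then be encoded independently and the per-window representations pooled for classification, which keeps memory cost linear in document length.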
