xLSTM-ECG: Multi-label ECG Classification via Feature Fusion with xLSTM

14 April 2025
Lei Kang
Xuanshuo Fu
Javier Vazquez-Corral
Ernest Valveny
Dimosthenis Karatzas
Abstract

Cardiovascular diseases (CVDs) remain the leading cause of mortality worldwide, highlighting the critical need for efficient and accurate diagnostic tools. Electrocardiograms (ECGs) are indispensable in diagnosing various heart conditions; however, their manual interpretation is time-consuming and error-prone. In this paper, we propose xLSTM-ECG, a novel approach that leverages an extended Long Short-Term Memory (xLSTM) network for multi-label classification of ECG signals, using the PTB-XL dataset. To the best of our knowledge, this work represents the first design and application of xLSTM modules specifically adapted for multi-label ECG classification. Our method employs a Short-Time Fourier Transform (STFT) to convert time-series ECG waveforms into the frequency domain, thereby enhancing feature extraction. The xLSTM architecture is specifically tailored to address the complexities of 12-lead ECG recordings by capturing both local and global signal features. Comprehensive experiments on the PTB-XL dataset reveal that our model achieves strong multi-label classification performance, while additional tests on the Georgia 12-Lead dataset underscore its robustness and efficiency. This approach significantly improves ECG classification accuracy, thereby advancing clinical diagnostics and patient care. The code will be publicly available upon acceptance.
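The pipeline described in the abstract (per-lead STFT features fused across leads, then a recurrent multi-label classifier) can be sketched as follows. This is not the authors' implementation: the xLSTM block is stood in for by a plain nn.LSTM placeholder, and the layer sizes, n_fft/hop values, and the 71-label PTB-XL output dimension are illustrative assumptions.

# Minimal sketch (not the authors' code): STFT front-end + recurrent
# multi-label classifier for 12-lead ECG. The xLSTM module is approximated
# with nn.LSTM; hyperparameters and the 71-label head are assumptions.
import torch
import torch.nn as nn

class SpectroRNNClassifier(nn.Module):
    def __init__(self, n_leads=12, n_fft=64, hop=16, hidden=128, n_labels=71):
        super().__init__()
        self.n_fft, self.hop = n_fft, hop
        freq_bins = n_fft // 2 + 1
        # Recurrent encoder over time frames; each frame = leads x frequency bins.
        self.rnn = nn.LSTM(n_leads * freq_bins, hidden, num_layers=2,
                           batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_labels)  # multi-label logits

    def forward(self, ecg):                  # ecg: (batch, leads, samples)
        b, c, t = ecg.shape
        # STFT per lead -> magnitude spectrogram (batch*leads, freq, frames)
        spec = torch.stft(ecg.reshape(b * c, t), n_fft=self.n_fft,
                          hop_length=self.hop,
                          window=torch.hann_window(self.n_fft, device=ecg.device),
                          return_complex=True).abs()
        frames = spec.shape[-1]
        # Fuse leads and frequency bins per time frame: (batch, frames, leads*freq)
        x = spec.reshape(b, c, -1, frames).permute(0, 3, 1, 2).reshape(b, frames, -1)
        out, _ = self.rnn(x)
        return self.head(out.mean(dim=1))    # pool over frames -> label logits

# Usage: 10-second, 100 Hz PTB-XL records -> per-label probabilities.
model = SpectroRNNClassifier()
logits = model(torch.randn(4, 12, 1000))
probs = torch.sigmoid(logits)                            # multi-label predictions
loss = nn.BCEWithLogitsLoss()(logits, torch.zeros_like(probs))

Sigmoid outputs with a binary cross-entropy loss reflect the multi-label setting, where a single recording may carry several diagnostic statements at once.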

@article{kang2025_2504.16101,
  title={xLSTM-ECG: Multi-label ECG Classification via Feature Fusion with xLSTM},
  author={Lei Kang and Xuanshuo Fu and Javier Vazquez-Corral and Ernest Valveny and Dimosthenis Karatzas},
  journal={arXiv preprint arXiv:2504.16101},
  year={2025}
}