ECG-FM: An Open Electrocardiogram Foundation Model

9 August 2024
Kaden McKeen
Laura Oliva
Augustin Toma
Barry Rubin
Abstract

Conventional task-specific electrocardiogram (ECG) analysis models require large annotated datasets to train. Foundation models mitigate this burden by leveraging self-supervised pretraining; however, the scarcity of open-weight ECG foundation models hinders adoption and cross-study comparability. We present ECG-FM, an open foundation model for ECG analysis, and conduct a study using a dataset of 1.5 million ECGs. ECG-FM is a transformer-based model pretrained using a hybrid contrastive and generative self-supervised learning approach. Our downstream tasks include predicting reduced left ventricular ejection fraction (LVEF) and ECG interpretation labels, where we release a benchmark task on the MIMIC-IV-ECG dataset. We affirm that ECG-FM is robust, label-efficient, and functionally discriminative by showcasing data scaling experiments, performing a latent space analysis, and generating saliency maps. ECG-FM markedly outperforms task-specific models in the small-to-medium-scale data regime and demonstrates cross-dataset generalizability, achieving high AUROC on many clinically salient labels such as atrial fibrillation (0.996) and LVEF ≤ 40% (0.929). We release our code, model weights, and benchmark task at this https URL.
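The abstract describes a transformer encoder pretrained with a hybrid contrastive and generative self-supervised objective. The sketch below illustrates one way such a hybrid loss can be formed over patched 12-lead ECG signals: a masked-reconstruction (generative) term plus an InfoNCE-style (contrastive) term between two views. The architecture sizes, patching scheme, masking strategy, augmentation, and loss weighting are illustrative assumptions, not the authors' implementation; the released code at the linked repository contains the actual method.

```python
# Minimal sketch of a hybrid contrastive + generative self-supervised
# objective for an ECG transformer encoder. All module names, sizes,
# and hyperparameters here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ECGEncoder(nn.Module):
    def __init__(self, n_leads=12, patch_len=50, d_model=256, n_layers=4, n_heads=8):
        super().__init__()
        self.patch_len = patch_len
        self.patch_embed = nn.Linear(n_leads * patch_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.recon_head = nn.Linear(d_model, n_leads * patch_len)  # generative head
        self.proj_head = nn.Linear(d_model, 128)                   # contrastive head

    def patchify(self, ecg):
        # ecg: (batch, n_leads, time) -> (batch, n_patches, n_leads * patch_len)
        b, c, _ = ecg.shape
        patches = ecg.unfold(-1, self.patch_len, self.patch_len)
        return patches.permute(0, 2, 1, 3).reshape(b, -1, c * self.patch_len)

    def encode(self, patches):
        return self.encoder(self.patch_embed(patches))

def hybrid_ssl_loss(model, ecg, mask_ratio=0.3, temperature=0.1):
    patches = model.patchify(ecg)
    # Generative term: zero out a random subset of patches, then reconstruct them.
    mask = torch.rand(patches.shape[:2], device=ecg.device) < mask_ratio
    corrupted = patches.masked_fill(mask.unsqueeze(-1), 0.0)
    recon = model.recon_head(model.encode(corrupted))
    gen_loss = F.mse_loss(recon[mask], patches[mask])
    # Contrastive term: InfoNCE between pooled embeddings of two views
    # (here, the clean signal and a Gaussian-noised copy).
    z1 = F.normalize(model.proj_head(model.encode(patches).mean(1)), dim=-1)
    noisy = patches + 0.01 * torch.randn_like(patches)
    z2 = F.normalize(model.proj_head(model.encode(noisy).mean(1)), dim=-1)
    logits = z1 @ z2.T / temperature
    labels = torch.arange(z1.shape[0], device=ecg.device)
    con_loss = F.cross_entropy(logits, labels)
    return gen_loss + con_loss
```

As a usage illustration, `hybrid_ssl_loss(ECGEncoder(), torch.randn(8, 12, 2500))` would run one pretraining step's loss on a batch of eight 12-lead recordings; the downstream tasks reported in the abstract (LVEF, interpretation labels) would instead attach supervised heads to the pretrained encoder.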

@article{mckeen2025_2408.05178,
  title={ECG-FM: An Open Electrocardiogram Foundation Model},
  author={Kaden McKeen and Sameer Masood and Augustin Toma and Barry Rubin and Bo Wang},
  journal={arXiv preprint arXiv:2408.05178},
  year={2025}
}