arXiv:2408.05178

ECG-FM: An Open Electrocardiogram Foundation Model

Published on Aug 9, 2024

AI-generated summary

ECG-FM, a transformer-based foundation model pretrained with a hybrid contrastive and generative approach, outperforms task-specific models in ECG analysis on small-to-medium datasets and demonstrates cross-dataset generalizability.

Abstract

Conventional task-specific electrocardiogram (ECG) analysis models require large annotated datasets to train. Foundation models mitigate this burden by leveraging self-supervised pretraining; however, the scarcity of open-weight ECG foundation models hinders adoption and cross-study comparability. We present ECG-FM, an open foundation model for ECG analysis, and conduct a study using a dataset of 1.5 million ECGs. ECG-FM is a transformer-based model pretrained using a hybrid contrastive and generative self-supervised learning approach. Our downstream tasks include predicting reduced left ventricular ejection fraction (LVEF) and ECG interpretation labels, for which we release a benchmark task on the MIMIC-IV-ECG dataset. We affirm that ECG-FM is robust, label-efficient, and functionally discriminative by showcasing data scaling experiments, performing a latent space analysis, and generating saliency maps. ECG-FM markedly outperforms task-specific models in the small-to-medium-scale data regime and demonstrates cross-dataset generalizability, achieving high AUROC on many clinically salient labels such as atrial fibrillation (0.996) and LVEF ≤ 40% (0.929). We release our code, model weights, and benchmark task at https://github.com/bowang-lab/ECG-FM/.
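To make the "hybrid contrastive and generative" pretraining named in the abstract concrete, here is a minimal, hypothetical PyTorch sketch of one way such an objective can be assembled: a masked-reconstruction (generative) loss combined with an InfoNCE-style contrastive loss over masked positions. The encoder, patching, masking scheme, dimensions, and the weighting coefficient `alpha` are all illustrative assumptions, not the authors' implementation; the actual code and weights are at the GitHub link above.

```python
# Illustrative sketch of a hybrid contrastive + generative self-supervised
# objective for ECG. All module names, shapes, and hyperparameters are
# assumptions for demonstration, not ECG-FM's actual architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyECGEncoder(nn.Module):
    """Toy transformer encoder over patched 12-lead ECG signals (hypothetical)."""
    def __init__(self, n_leads=12, patch_len=50, d_model=128, n_layers=2):
        super().__init__()
        self.patch_embed = nn.Linear(n_leads * patch_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.mask_token = nn.Parameter(torch.zeros(d_model))
        self.decoder = nn.Linear(d_model, n_leads * patch_len)  # generative head

    def forward(self, patches, mask):
        # patches: (B, T, n_leads * patch_len); mask: (B, T) bool, True = masked
        x = self.patch_embed(patches)
        x = torch.where(mask.unsqueeze(-1), self.mask_token, x)
        z = self.encoder(x)      # contextual embeddings
        recon = self.decoder(z)  # reconstruction of raw patch values
        return z, recon

def hybrid_ssl_loss(model, patches, mask_ratio=0.3, temperature=0.1, alpha=0.5):
    B, T, _ = patches.shape
    mask = torch.rand(B, T, device=patches.device) < mask_ratio
    z, recon = model(patches, mask)

    # Generative term: reconstruct the masked patches (MSE).
    gen_loss = F.mse_loss(recon[mask], patches[mask])

    # Contrastive term (InfoNCE): for each masked position, the positive is
    # the embedding of the unmasked view of the same patch; all other masked
    # positions in the batch serve as negatives.
    with torch.no_grad():
        targets, _ = model(patches, torch.zeros_like(mask))
    q = F.normalize(z[mask], dim=-1)
    k = F.normalize(targets[mask], dim=-1)
    logits = q @ k.t() / temperature
    labels = torch.arange(q.size(0), device=q.device)
    con_loss = F.cross_entropy(logits, labels)

    return alpha * con_loss + (1 - alpha) * gen_loss

# Usage on random data standing in for batched, patched 12-lead ECGs.
model = TinyECGEncoder()
fake_patches = torch.randn(4, 20, 12 * 50)
loss = hybrid_ssl_loss(model, fake_patches)
loss.backward()
```

A single coefficient trading off the two terms is the simplest possible combination; consult the released repository for the exact architecture and pretraining objective used to train ECG-FM.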
