# HuBERT-ECG: A Self-Supervised Foundation Model for Broad and Scalable Cardiac Application

[![Code](https://img.shields.io/badge/Code-green)](https://github.com/Edoar-do/HuBERT-ECG)
![GitHub License](https://img.shields.io/github/license/Edoar-do/HuBERT-ECG)

## Abstract

Deep learning models have shown remarkable performance in electrocardiogram (ECG) analysis, but their success has been constrained by the limited availability and size of ECG datasets, resulting in systems that are more task specialists than versatile generalists. In this work, we introduce HuBERT-ECG, a foundation ECG model pre-trained in a self-supervised manner on a large and diverse dataset of 9.1 million 12-lead ECGs encompassing 164 cardiovascular conditions. By simply adding an output layer, HuBERT-ECG can be fine-tuned for a wide array of downstream tasks, from diagnosing diseases to predicting future cardiovascular events. Across diverse real-world scenarios, HuBERT-ECG achieves AUROCs from 84.3% in low-data settings to 99% in large-scale setups. When trained to detect 164 overlapping conditions simultaneously, our model delivers AUROCs above 90% and 95% for 140 and 94 diseases, respectively. HuBERT-ECG also predicts death events within a 2-year follow-up with an AUROC of 93.4%. We release models and code.
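The abstract notes that fine-tuning amounts to adding a single output layer on top of the pre-trained encoder. A minimal PyTorch sketch of that pattern follows; the class and variable names here are illustrative assumptions, not the repository's actual API, and a stand-in linear encoder replaces the real pre-trained checkpoint so the snippet runs offline:

```python
import torch
import torch.nn as nn

class HuBERTECGClassifier(nn.Module):
    """Hypothetical wrapper: a pre-trained encoder plus one added linear output layer."""

    def __init__(self, encoder: nn.Module, hidden_size: int, num_labels: int):
        super().__init__()
        self.encoder = encoder                           # pre-trained backbone (frozen or fine-tuned)
        self.head = nn.Linear(hidden_size, num_labels)   # the single added output layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(x)      # (batch, time, hidden_size) feature sequence
        pooled = feats.mean(dim=1)   # average-pool over the time dimension
        return self.head(pooled)     # (batch, num_labels) logits

# Stand-in encoder so the sketch runs without downloading weights;
# in practice this would be the released HuBERT-ECG checkpoint.
hidden = 768
stub_encoder = nn.Linear(12, hidden)   # maps 12-lead samples to feature vectors

model = HuBERTECGClassifier(stub_encoder, hidden_size=hidden, num_labels=164)
ecg = torch.randn(2, 500, 12)          # 2 ECGs, 500 time steps, 12 leads
logits = model(ecg)
print(logits.shape)                    # torch.Size([2, 164])
```

For the multi-label setting described in the abstract (164 overlapping conditions), these logits would typically be trained with a sigmoid-based loss such as `nn.BCEWithLogitsLoss` rather than softmax cross-entropy.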