# ECG-Mamba2: MambaLRP Explainability and Noise Handling Comparison
A comprehensive implementation comparing three noise handling methods with MambaLRP explainability for ECG classification using Mamba-2 architecture on the PTB-XL dataset.
## Overview
This project provides three main components for ECG signal classification research:
### Part 1: Architecture Comparison
Compares two Mamba-2 based architectures for ECG classification:
- Baseline ECG-Mamba (Mamba-2): Standard Mamba-2 architecture with unidirectional SSM
- Improved ECG-Mamba (Mamba-2): Enhanced version with bidirectional SSM, multi-branch processing, and attention mechanisms
### Part 2: Noise Handling Comparison
Evaluates three noise handling strategies using the Baseline ECG-Mamba architecture:
- Non-Uniform-Mix: Conservative data augmentation approach
- Contrastive Learning + Masking: Self-supervised learning with masked signal reconstruction
- Adversarial Training + Frequency Masking: Robust training with frequency-domain augmentation
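As a rough illustration of the frequency-masking idea used in the third strategy, the sketch below zeroes a random band of frequency bins in one signal and transforms back to the time domain. This is a minimal stand-in written against numpy, not the notebook's implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def frequency_mask(signal, mask_width=10, rng=None):
    """Zero a random contiguous band of frequency bins in a 1-D signal.

    A simplified sketch of frequency-domain masking augmentation:
    transform to the frequency domain, blank a band, transform back.
    """
    rng = np.random.default_rng() if rng is None else rng
    spectrum = np.fft.rfft(signal)
    start = int(rng.integers(0, max(1, len(spectrum) - mask_width)))
    spectrum[start:start + mask_width] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

# Stand-in for one ECG lead (1000 samples of noise for demonstration)
sig = np.random.default_rng(1).normal(size=1000)
aug = frequency_mask(sig, mask_width=25, rng=np.random.default_rng(0))
```

The augmented signal keeps the original length, so it can be fed to the model unchanged.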
### Part 3: Explainability with MambaLRP
Implements Layer-wise Relevance Propagation (LRP) adapted for Mamba architecture to visualize which parts of ECG signals contribute most to model predictions.
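MambaLRP defines Mamba-specific relevance propagation rules, which are beyond a short snippet. As a rough illustration of the underlying input-attribution idea only, the sketch below computes a gradient × input saliency map over a hypothetical stand-in model; this is explicitly not the MambaLRP algorithm.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in classifier: (batch, 12 leads, samples) -> 5 logits.
# MambaLRP would instead propagate relevance backward through Mamba blocks.
model = nn.Sequential(
    nn.Conv1d(12, 16, kernel_size=7, padding=3), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(16, 5),
)

ecg = torch.randn(1, 12, 1000, requires_grad=True)   # one 12-lead record
logits = model(ecg)
logits[0, logits.argmax()].backward()                # top-class score

# Gradient x input, summed over leads: one relevance value per timestep
relevance = (ecg.grad * ecg).sum(dim=1).abs()
```

Plotting `relevance` against the signal gives a heatmap-style view of which timesteps drive the prediction, analogous to the LRP visualizations in the notebook.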
## Dataset

This project uses the PTB-XL dataset from PhysioNet:
- 12-lead ECG recordings
- Multi-label diagnostic classification
- 5 diagnostic superclasses: NORM (normal), MI (myocardial infarction), STTC (ST/T change), CD (conduction disturbance), HYP (hypertrophy)
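PTB-XL stores per-record diagnostic statements as a dictionary string in the `scp_codes` column, which is then aggregated to the five superclasses via `scp_statements.csv`. The sketch below shows the aggregation step on an in-memory example; the tiny code-to-superclass mapping here is a hypothetical excerpt, not the full table shipped with the dataset.

```python
import ast

# Hypothetical excerpt of the scp_statements.csv diagnostic-class mapping
scp_to_superclass = {'NORM': 'NORM', 'IMI': 'MI', 'NDT': 'STTC',
                     'LAFB': 'CD', 'LVH': 'HYP'}

def superclasses(scp_codes_str):
    """Aggregate a PTB-XL scp_codes string to its diagnostic superclasses."""
    codes = ast.literal_eval(scp_codes_str)     # e.g. "{'IMI': 100.0}"
    return sorted({scp_to_superclass[c] for c in codes
                   if c in scp_to_superclass})

labels = superclasses("{'IMI': 100.0, 'LVH': 50.0}")  # -> ['HYP', 'MI']
```

In the real pipeline, this function would be applied to the `scp_codes` column of `ptbxl_database.csv` to build the multi-label targets.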
## Requirements

```text
torch
mamba-ssm>=2.0.0
causal-conv1d>=1.2.0
wfdb
pandas
numpy
scikit-learn
matplotlib
tqdm
```
Note: mamba-ssm requires CUDA. An LSTM fallback is provided for CPU-only environments.
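The CPU fallback mentioned above can be structured as a conditional import, as sketched below. The class name and `d_model` default are illustrative; the real notebook's fallback may differ, and the `Mamba2(d_model=...)` constructor is the one exposed by `mamba_ssm`.

```python
import torch
import torch.nn as nn

try:
    from mamba_ssm import Mamba2      # CUDA-only package
    HAS_MAMBA = True
except ImportError:
    HAS_MAMBA = False                 # CPU-only environment

class SequenceBlock(nn.Module):
    """Uses Mamba-2 when available, otherwise an LSTM stand-in."""
    def __init__(self, d_model=128):
        super().__init__()
        if HAS_MAMBA:
            self.mixer = Mamba2(d_model=d_model)
        else:
            self.mixer_lstm = nn.LSTM(d_model, d_model, batch_first=True)

    def forward(self, x):             # x: (batch, seq_len, d_model)
        if HAS_MAMBA:
            return self.mixer(x)
        return self.mixer_lstm(x)[0]  # LSTM returns (output, state)

y = SequenceBlock()(torch.randn(2, 16, 128))
```

Note that the LSTM path approximates only the interface, not the speed or accuracy of the CUDA Mamba-2 kernels.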
## Installation

```bash
pip install torch
pip install "causal-conv1d>=1.2.0"
pip install "mamba-ssm>=2.0.0"
pip install wfdb pandas numpy scikit-learn matplotlib tqdm
```

The version specifiers are quoted because an unquoted `>=` is interpreted by the shell as output redirection.
## Model Architectures
### Baseline ECG-Mamba (Mamba-2)
- Patch embedding for ECG signals
- Stacked Mamba-2 blocks with State Space Duality (SSD)
- Unidirectional sequence modeling
- Global average pooling + classification head
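The baseline pipeline (patch embedding, stacked sequence blocks, global average pooling, classification head) can be sketched as below. This is a minimal shape-level illustration: the LSTM is a CPU-friendly placeholder where the stacked Mamba-2 (SSD) blocks would sit, and all dimensions are hypothetical defaults.

```python
import torch
import torch.nn as nn

class BaselineECGMambaSketch(nn.Module):
    """Shape-level sketch of the baseline: patch-embed -> sequence
    backbone (placeholder for stacked Mamba-2 blocks) -> pool -> head."""
    def __init__(self, n_leads=12, patch_len=20, d_model=128, n_classes=5):
        super().__init__()
        # Patch embedding: non-overlapping windows -> one token each
        self.embed = nn.Conv1d(n_leads, d_model,
                               kernel_size=patch_len, stride=patch_len)
        self.backbone = nn.LSTM(d_model, d_model, num_layers=2,
                                batch_first=True)   # unidirectional
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                        # x: (batch, leads, samples)
        tokens = self.embed(x).transpose(1, 2)   # (batch, n_patches, d_model)
        out, _ = self.backbone(tokens)
        return self.head(out.mean(dim=1))        # global average pooling

logits = BaselineECGMambaSketch()(torch.randn(2, 12, 1000))
```

With 1000-sample inputs and `patch_len=20`, the backbone sees 50 tokens per record, and the head emits one logit per diagnostic superclass.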
### Improved ECG-Mamba (Mamba-2)
- Multi-scale patch embedding
- Bidirectional Mamba-2 processing (forward + backward)
- Multi-branch feature extraction
- Cross-attention fusion mechanism
- Enhanced classification head with dropout
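The bidirectional processing in the improved model follows a standard pattern: run a causal sequence module on the input and on its time reversal, re-align the reversed output, and fuse the two directions. The sketch below shows that pattern with GRUs as stand-in mixers; the real model would use Mamba-2 blocks, and the fusion layer here is a hypothetical simplification of the cross-attention mechanism.

```python
import torch
import torch.nn as nn

class BidirectionalWrapper(nn.Module):
    """Forward + time-reversed passes through a causal mixer, then fusion --
    the pattern behind bidirectional Mamba-2 processing."""
    def __init__(self, d_model=128):
        super().__init__()
        self.fwd = nn.GRU(d_model, d_model, batch_first=True)  # stand-in mixer
        self.bwd = nn.GRU(d_model, d_model, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)  # simplified fusion

    def forward(self, x):                        # x: (batch, seq, d_model)
        f, _ = self.fwd(x)
        b, _ = self.bwd(torch.flip(x, dims=[1])) # process reversed sequence
        b = torch.flip(b, dims=[1])              # re-align the time axis
        return self.fuse(torch.cat([f, b], dim=-1))

y = BidirectionalWrapper()(torch.randn(2, 50, 128))
```

Flipping the backward output before fusion is essential so that both directions describe the same timestep at each position.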
## Usage

The main notebook `ECG_Mamba2_Architecture_Comparison.ipynb` contains:
- Data downloading and preprocessing
- Model definitions for all architectures
- Training loops with evaluation metrics
- Noise handling implementations
- MambaLRP explainability visualizations
## Results
The notebook compares:
- Classification accuracy across different architectures
- Noise robustness with various augmentation strategies
- Interpretability through LRP heatmaps
## Citation

If you use this code, please cite:

```bibtex
@misc{ecg-mamba2-mambalrp-noise,
  title={ECG-Mamba2: MambaLRP Explainability and Noise Handling Comparison},
  year={2024},
  url={https://github.com/skkuhg/ecg-mamba2-mambalrp-noise-comparison}
}
```
## References

- Gu, A. and Dao, T. "Mamba: Linear-Time Sequence Modeling with Selective State Spaces" (2023)
- Dao, T. and Gu, A. "Transformers are SSMs: Generalized Models and Efficient Algorithms Through Structured State Space Duality" (Mamba-2, 2024)
- Wagner, P. et al. "PTB-XL, a large publicly available electrocardiography dataset" (PhysioNet, 2020)
- Jafari, F. R. et al. "MambaLRP: Explaining Selective State Space Sequence Models" (2024)
## License
This project is licensed under the MIT License - see the LICENSE file for details.