Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


A Deep Knowledge Distillation framework for EEG assisted enhancement of single-lead ECG based sleep staging

Vaibhav Joshi, Sricharan Vijayarangan, Preejith SP, Mohanasankar Sivaprakasam

Published: 2021-12-14
Tasks: Sleep Staging · W-R-N Sleep Staging · Sleep Stage Detection · Electroencephalogram (EEG) · Knowledge Distillation · EEG · W-R-L-D Sleep Staging

Paper · PDF · Code (official)

Abstract

Automatic sleep staging is presently performed using Electroencephalogram (EEG) signals. Recently, Deep Learning (DL) based approaches have enabled significant progress in this area, allowing for near-human accuracy in automated sleep staging. However, EEG-based sleep staging requires an extensive and expensive clinical setup. Moreover, the need for an expert during setup and the added inconvenience to the subject under study render it unfavourable in a point-of-care context. Electrocardiogram (ECG), an unobtrusive alternative to EEG, is more suitable, but its performance, unsurprisingly, remains sub-par compared to EEG-based sleep staging. Naturally, it would be helpful to transfer knowledge from EEG to ECG, ultimately enhancing the model's performance on ECG-based inputs. Knowledge Distillation (KD) is a renowned concept in DL that seeks to transfer knowledge from a better but potentially more cumbersome teacher model to a compact student model. Building on this concept, we propose a cross-modal KD framework to improve ECG-based sleep staging performance with assistance from features learned through models trained on EEG. Additionally, we conducted multiple experiments on the individual components of the proposed model to gain better insight into the distillation approach. Data of 200 subjects from the Montreal Archive of Sleep Studies (MASS) was utilized for our study. The proposed model showed a 14.3% and 13.4% increase in weighted F1-score in 4-class and 3-class sleep staging, respectively. This demonstrates the viability of KD for performance improvement of single-channel ECG-based sleep staging in 4-class (W-L-D-R) and 3-class (W-N-R) classification.
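The page does not spell out the distillation objective the authors use, but the teacher-student transfer the abstract describes is commonly implemented with a Hinton-style KD loss: a weighted sum of hard-label cross-entropy and a temperature-softened KL divergence between teacher and student outputs. The sketch below is an illustrative assumption, not the paper's actual method; the function names, the temperature, and the mixing weight `alpha` are all hypothetical choices.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Hinton-style KD loss (assumed, not from the paper): hard-label
    cross-entropy plus KL(teacher || student) on temperature-softened
    outputs, scaled by T^2 to keep gradient magnitudes comparable."""
    # Hard-label cross-entropy against the ground-truth sleep stage.
    p_student = softmax(student_logits)
    ce = -math.log(p_student[true_label] + 1e-12)
    # Soft-label KL term using the teacher's softened distribution.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = sum(t * math.log((t + 1e-12) / (s + 1e-12))
             for t, s in zip(p_t, p_s))
    return alpha * ce + (1 - alpha) * temperature ** 2 * kl

# Toy 4-class example (W, L, D, R): an EEG teacher is confident in
# class W; the ECG student is less so and is pulled toward the teacher.
teacher_logits = [4.0, 1.0, 0.5, 0.2]
student_logits = [2.0, 1.5, 1.0, 0.8]
loss = distillation_loss(student_logits, teacher_logits, true_label=0)
```

When the student's logits match the teacher's, the KL term vanishes and only the weighted cross-entropy remains, which is what makes the soft-label term a pure alignment pressure toward the teacher.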

Results

Task                  | Dataset                           | Metric      | Value | Model
----------------------|-----------------------------------|-------------|-------|------------------
Sleep Quality         | Montreal Archive of Sleep Studies | Weighted F1 | 0.85  | EEG 1-ch
Sleep Quality         | Montreal Archive of Sleep Studies | Weighted F1 | 0.51  | ECG 1-ch after KD
Sleep Quality         | Montreal Archive of Sleep Studies | Weighted F1 | 0.9   | EEG 1-ch
Sleep Quality         | Montreal Archive of Sleep Studies | Weighted F1 | 0.66  | ECG 1-ch after KD
Sleep Staging         | Montreal Archive of Sleep Studies | Weighted F1 | 0.85  | EEG 1-ch
Sleep Staging         | Montreal Archive of Sleep Studies | Weighted F1 | 0.51  | ECG 1-ch after KD
Sleep Staging         | Montreal Archive of Sleep Studies | Weighted F1 | 0.9   | EEG 1-ch
Sleep Staging         | Montreal Archive of Sleep Studies | Weighted F1 | 0.66  | ECG 1-ch after KD
Sleep Stage Detection | Montreal Archive of Sleep Studies | Weighted F1 | 0.85  | EEG 1-ch
Sleep Stage Detection | Montreal Archive of Sleep Studies | Weighted F1 | 0.51  | ECG 1-ch after KD
Sleep Stage Detection | Montreal Archive of Sleep Studies | Weighted F1 | 0.9   | EEG 1-ch
Sleep Stage Detection | Montreal Archive of Sleep Studies | Weighted F1 | 0.66  | ECG 1-ch after KD

Related Papers

NeuroXAI: Adaptive, robust, explainable surrogate framework for determination of channel importance in EEG application (2025-09-12)
Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (2025-07-21)
Uncertainty-Aware Cross-Modal Knowledge Distillation with Prototype Learning for Multimodal Brain-Computer Interfaces (2025-07-17)
Aligning Humans and Robots via Reinforcement Learning from Implicit Human Feedback (2025-07-17)
AFPM: Alignment-based Frame Patch Modeling for Cross-Dataset EEG Decoding (2025-07-16)
DVFL-Net: A Lightweight Distilled Video Focal Modulation Network for Spatio-Temporal Action Recognition (2025-07-16)
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (2025-07-15)
CATVis: Context-Aware Thought Visualization (2025-07-15)