Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Virtual Fusion with Contrastive Learning for Single Sensor-based Activity Recognition

Duc-Anh Nguyen, Cuong Pham, Nhien-An Le-Khac

2023-12-01 · Sensor Fusion · Human Activity Recognition · Contrastive Learning · Time Series Classification · Activity Recognition

Paper · PDF

Abstract

Various types of sensors can be used for Human Activity Recognition (HAR), and each has different strengths and weaknesses. Sometimes a single sensor cannot fully observe the user's motions from its perspective, leading to incorrect predictions. While sensor fusion provides more information for HAR, it comes with inherent drawbacks such as concerns over user privacy and acceptance, and costly set-up, operation, and maintenance. To address this problem, we propose Virtual Fusion - a new method that takes advantage of unlabeled data from multiple time-synchronized sensors during training, but needs only one sensor for inference. Contrastive learning is adopted to exploit the correlation among sensors. Virtual Fusion gives significantly better accuracy than training with the same single sensor alone, and in some cases it even surpasses actual fusion using multiple sensors at test time. We also extend this method to a more general version called Actual Fusion within Virtual Fusion (AFVF), which uses a subset of the training sensors during inference. Our method achieves state-of-the-art accuracy and F1-score on the UCI-HAR and PAMAP2 benchmark datasets. Implementation is available upon request.
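The cross-sensor contrastive idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes an InfoNCE-style loss with in-batch negatives, where embeddings of time-synchronized windows from two sensors form positive pairs on the diagonal of a similarity matrix. The function name `info_nce` and the NumPy implementation are illustrative assumptions.

```python
import numpy as np

def info_nce(z_a, z_b, temperature=0.1):
    """Cross-sensor InfoNCE loss (sketch).

    z_a, z_b: (N, D) embeddings of N time-synchronized windows from
    sensor A and sensor B. Row i of z_a and row i of z_b come from the
    same moment in time, so they form the positive pair; all other rows
    in the batch serve as negatives.
    """
    # L2-normalize so the dot product is cosine similarity
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)

    logits = z_a @ z_b.T / temperature            # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Positives sit on the diagonal (same time window, different sensor)
    return -np.mean(np.diag(log_probs))
```

During training, this term would be combined with the usual classification loss on each sensor's encoder; at inference, only one encoder is kept, so no extra sensor is required. A quick sanity check: perfectly aligned embeddings yield a much lower loss than misaligned ones.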

Results

Task                         Dataset   Metric     Value    Model
Activity Recognition         PAMAP2    Accuracy   0.9672   AFVF
Activity Recognition         PAMAP2    Macro F1   0.9665   AFVF
Activity Recognition         HAR       Accuracy   0.9861   AFVF
Activity Recognition         HAR       Macro F1   0.9865   AFVF
Action Detection             PAMAP2    Accuracy   0.9672   AFVF
Action Detection             PAMAP2    Macro F1   0.9665   AFVF
Action Detection             HAR       Accuracy   0.9861   AFVF
Action Detection             HAR       Macro F1   0.9865   AFVF
Human Activity Recognition   PAMAP2    Accuracy   0.9672   AFVF
Human Activity Recognition   PAMAP2    Macro F1   0.9665   AFVF
Human Activity Recognition   HAR       Accuracy   0.9861   AFVF
Human Activity Recognition   HAR       Macro F1   0.9865   AFVF

Related Papers

- SemCSE: Semantic Contrastive Sentence Embeddings Using LLM-Generated Summaries For Scientific Abstracts (2025-07-17)
- HapticCap: A Multimodal Dataset and Task for Understanding User Experience of Vibration Haptic Signals (2025-07-17)
- Overview of the TalentCLEF 2025: Skill and Job Title Intelligence for Human Capital Management (2025-07-17)
- SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
- Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)
- Data-Driven Meta-Analysis and Public-Dataset Evaluation for Sensor-Based Gait Age Estimation (2025-07-15)
- ZKP-FedEval: Verifiable and Privacy-Preserving Federated Evaluation using Zero-Knowledge Proofs (2025-07-15)
- LLM-Driven Dual-Level Multi-Interest Modeling for Recommendation (2025-07-15)