Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Action Recognition in Real-World Ambient Assisted Living Environment

Vincent Gbouna Zakka, Zhuangzhuang Dai, Luis J. Manso

2025-03-29 · Skeleton Based Action Recognition · Data Augmentation · Hand Gesture Recognition · Action Recognition

Paper · PDF · Code (official)

Abstract

The growing ageing population and their preference to maintain independence by living in their own homes require proactive strategies to ensure safety and support. Ambient Assisted Living (AAL) technologies have emerged to facilitate ageing in place by offering continuous monitoring and assistance within the home. Within AAL technologies, action recognition plays a crucial role in interpreting human activities and detecting incidents like falls, mobility decline, or unusual behaviours that may signal worsening health conditions. However, action recognition in practical AAL applications presents challenges, including occlusions, noisy data, and the need for real-time performance. While advancements have been made in accuracy, robustness to noise, and computation efficiency, achieving a balance among them all remains a challenge. To address this challenge, this paper introduces the Robust and Efficient Temporal Convolution network (RE-TCN), which comprises three main elements: Adaptive Temporal Weighting (ATW), Depthwise Separable Convolutions (DSC), and data augmentation techniques. These elements aim to enhance the model's accuracy, robustness against noise and occlusion, and computational efficiency within real-world AAL contexts. RE-TCN outperforms existing models in terms of accuracy, noise and occlusion robustness, and has been validated on four benchmark datasets: NTU RGB+D 60, Northwestern-UCLA, SHREC'17, and DHG-14/28. The code is publicly available at: https://github.com/Gbouna/RE-TCN
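Part of RE-TCN's computational efficiency comes from Depthwise Separable Convolutions (DSC), which factor a standard convolution into a per-channel (depthwise) filter followed by a 1×1 (pointwise) channel mixer. A minimal sketch of the resulting parameter saving for a temporal (1-D) convolution; the channel and kernel sizes are hypothetical and not taken from the paper:

```python
# Parameter-count comparison between a standard 1-D convolution and a
# depthwise separable one, illustrating why DSC shrinks model size.

def standard_conv1d_params(c_in: int, c_out: int, k: int) -> int:
    """Weights in a standard 1-D convolution with kernel length k (bias omitted)."""
    return c_in * c_out * k

def depthwise_separable_conv1d_params(c_in: int, c_out: int, k: int) -> int:
    """Depthwise conv (one k-tap filter per input channel) followed by a
    1x1 pointwise conv that mixes channels (bias omitted)."""
    depthwise = c_in * k          # one filter per channel, no cross-channel mixing
    pointwise = c_in * c_out      # 1x1 conv recombines channels
    return depthwise + pointwise

if __name__ == "__main__":
    c_in, c_out, k = 64, 64, 9    # illustrative layer sizes
    std = standard_conv1d_params(c_in, c_out, k)
    dsc = depthwise_separable_conv1d_params(c_in, c_out, k)
    print(f"standard: {std}, separable: {dsc}, reduction: {std / dsc:.1f}x")
```

For these example sizes the factorization cuts the weight count from 36,864 to 4,672, roughly an 8× reduction, and the saving grows with kernel length.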

Results

The source page repeats the same SHREC'17 entries under eight task leaderboards (Video, Temporal Action Localization, Zero-Shot Learning, Activity Recognition, Action Localization, Action Detection, 3D Action Recognition, and Action Recognition) and the DHG entries under two (Hand Gesture Recognition and Gesture Recognition); they are consolidated below. The parameter count is presumably in millions.

| Dataset | Metric | Value | Model |
|---|---|---|---|
| SHREC 2017 track on 3D Hand Gesture Recognition | 14 gestures accuracy | 99.85 | RE-TCN |
| SHREC 2017 track on 3D Hand Gesture Recognition | 28 gestures accuracy | 99.95 | RE-TCN |
| SHREC 2017 track on 3D Hand Gesture Recognition | No. parameters (M) | 1.25 | RE-TCN |
| DHG-14 | Accuracy | 91.31 | RE-TCN |
| DHG-28 | Accuracy | 88.21 | RE-TCN |
