Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


ProxEmo: Gait-based Emotion Learning and Multi-view Proxemic Fusion for Socially-Aware Robot Navigation

Venkatraman Narayanan, Bala Murali Manoghar, Vishnu Sashank Dorbala, Dinesh Manocha, Aniket Bera

Published: 2020-03-02
Tasks: Emotion Classification · Pose Estimation · Gesture Recognition · Robot Navigation · Emotion Recognition
Paper · PDF · Code (official)

Abstract

We present ProxEmo, a novel end-to-end emotion prediction algorithm for socially aware robot navigation among pedestrians. Our approach predicts the perceived emotions of a pedestrian from walking gaits, which are then used for emotion-guided navigation that takes social and proxemic constraints into account. To classify emotions, we propose a multi-view skeleton graph convolution-based model that works with a commodity camera mounted on a moving robot. Our emotion recognition is integrated into a mapless navigation scheme and makes no assumptions about the environment of pedestrian motion. It achieves a mean average emotion prediction precision of 82.47% on the Emotion-Gait benchmark dataset. We outperform current state-of-the-art algorithms for emotion recognition from 3D gaits. We highlight its benefits in terms of navigation in indoor scenes using a Clearpath Jackal robot.
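The core idea of a skeleton graph convolution, as described in the abstract, can be sketched as follows. This is an illustrative toy, not the authors' ProxEmo implementation: the 5-joint skeleton, layer sizes, and 4 emotion classes are all assumptions made for brevity, and the multi-view fusion and temporal modeling of the real system are omitted.

```python
import numpy as np

# Toy skeleton: 5 joints, edges = bones. Hypothetical structure for
# illustration only; this is NOT the authors' ProxEmo code.
NUM_JOINTS = 5
EDGES = [(0, 1), (1, 2), (0, 3), (0, 4)]  # root-neck, neck-head, root-hips

def normalized_adjacency(num_joints, edges):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    a = np.eye(num_joints)
    for i, j in edges:
        a[i, j] = a[j, i] = 1.0
    d = np.diag(1.0 / np.sqrt(a.sum(axis=1)))
    return d @ a @ d

def gcn_layer(x, adj, w):
    """One graph-convolution layer: aggregate over neighboring joints,
    project features, apply ReLU. x: (frames, joints, in_dim)."""
    return np.maximum(adj @ x @ w, 0.0)

def classify_gait(x, adj, w1, w2):
    """Pool a gait sequence over time and joints, then softmax over emotions."""
    h = gcn_layer(x, adj, w1)       # (frames, joints, hidden)
    pooled = h.mean(axis=(0, 1))    # global average pool -> (hidden,)
    logits = pooled @ w2            # (num_emotions,)
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
frames, in_dim, hidden, emotions = 30, 3, 8, 4  # 3-D joint coords; 4 classes assumed
x = rng.standard_normal((frames, NUM_JOINTS, in_dim))   # stand-in for a gait clip
adj = normalized_adjacency(NUM_JOINTS, EDGES)
w1 = rng.standard_normal((in_dim, hidden)) * 0.1
w2 = rng.standard_normal((hidden, emotions)) * 0.1
probs = classify_gait(x, adj, w1, w2)
print(probs.shape)  # one probability per emotion class
```

The normalized adjacency lets each joint's features mix with its skeletal neighbors, which is what makes the representation sensitive to gait structure rather than raw joint positions alone.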

Results

Task                  | Dataset | Metric   | Value | Model
----------------------|---------|----------|-------|-------------------------------
Text Classification   | EWALK   | Accuracy | 82.4  | ProxEmo (ours)
Text Classification   | EWALK   | Accuracy | 78.24 | STEP [bhattacharya2019step]
Text Classification   | EWALK   | Accuracy | 55.47 | Baseline (Vanilla LSTM) [Ewalk]
Emotion Classification| EWALK   | Accuracy | 82.4  | ProxEmo (ours)
Emotion Classification| EWALK   | Accuracy | 78.24 | STEP [bhattacharya2019step]
Emotion Classification| EWALK   | Accuracy | 55.47 | Baseline (Vanilla LSTM) [Ewalk]
Classification        | EWALK   | Accuracy | 82.4  | ProxEmo (ours)
Classification        | EWALK   | Accuracy | 78.24 | STEP [bhattacharya2019step]
Classification        | EWALK   | Accuracy | 55.47 | Baseline (Vanilla LSTM) [Ewalk]

Related Papers

- Efficient Deployment of Spiking Neural Networks on SpiNNaker2 for DVS Gesture Recognition Using Neuromorphic Intermediate Representation (2025-09-04)
- Long-Short Distance Graph Neural Networks and Improved Curriculum Learning for Emotion Recognition in Conversation (2025-07-21)
- NonverbalTTS: A Public English Corpus of Text-Aligned Nonverbal Vocalizations with Emotion Annotations for Text-to-Speech (2025-07-17)
- $π^3$: Scalable Permutation-Equivariant Visual Geometry Learning (2025-07-17)
- Revisiting Reliability in the Reasoning-based Pose Estimation Benchmark (2025-07-17)
- DINO-VO: A Feature-based Visual Odometry Leveraging a Visual Foundation Model (2025-07-17)
- From Neck to Head: Bio-Impedance Sensing for Head Pose Estimation (2025-07-17)
- AthleticsPose: Authentic Sports Motion Dataset on Athletic Field and Evaluation of Monocular 3D Pose Estimation Ability (2025-07-17)