Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Exploiting Emotional Dependencies with Graph Convolutional Networks for Facial Expression Recognition

Panagiotis Antoniadis, Panagiotis P. Filntisis, Petros Maragos

2021-06-07 | Facial Expression Recognition | Facial Expression Recognition (FER)
Paper | PDF | Code (official)

Abstract

Over the past few years, deep learning methods have shown remarkable results in many face-related tasks, including automatic facial expression recognition (FER) in the wild. Meanwhile, numerous models describing human emotional states have been proposed by the psychology community. However, there is no clear evidence as to which representation is more appropriate, and the majority of FER systems use either the categorical or the dimensional model of affect. Inspired by recent work in multi-label classification, this paper proposes a novel multi-task learning (MTL) framework that exploits the dependencies between these two models using a Graph Convolutional Network (GCN) to recognize facial expressions in the wild. Specifically, a shared feature representation is learned for both discrete and continuous recognition in an MTL setting. Moreover, the facial expression classifiers and the valence-arousal regressors are learned through a GCN that explicitly captures the dependencies between them. To evaluate the performance of our method under real-world conditions, we perform extensive experiments on the AffectNet and Aff-Wild2 datasets. The results show that our method improves performance across different datasets and backbone architectures. Finally, we also surpass the previous state-of-the-art methods on the categorical model of AffectNet.
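To make the architecture described in the abstract concrete, here is a minimal PyTorch sketch of the core idea: label embeddings (one node per discrete emotion plus one per valence-arousal dimension) are refined by a graph convolution over a label-dependency adjacency matrix, and the refined embeddings serve as the classifier and regressor weights applied to a shared backbone feature. This is a hypothetical illustration, not the authors' released code: the class name `EmotionGCNSketch`, the embedding dimension, and the identity adjacency stand-in are all assumptions; the paper builds the adjacency from label co-occurrence statistics.

```python
import torch
import torch.nn as nn

class EmotionGCNSketch(nn.Module):
    """Hypothetical sketch of the Emotion-GCN idea (not the official code).

    A shared backbone feature is scored against label embeddings that a
    one-layer GCN has propagated over a label-dependency graph, so the
    categorical (7-emotion) and dimensional (valence/arousal) outputs
    share information through the graph.
    """

    def __init__(self, feat_dim=512, emb_dim=300, num_emotions=7, num_va=2):
        super().__init__()
        n = num_emotions + num_va                     # one graph node per output head
        self.label_emb = nn.Parameter(torch.randn(n, emb_dim))
        # Stand-in adjacency: identity. The paper derives this matrix from
        # label co-occurrence; using eye() keeps the sketch self-contained.
        self.register_buffer("adj", torch.eye(n))
        self.gc1 = nn.Linear(emb_dim, feat_dim)       # GCN layer: A @ X @ W
        self.num_emotions = num_emotions

    def forward(self, feats):
        # feats: (B, feat_dim) from a shared CNN backbone (assumed given)
        w = torch.relu(self.adj @ self.gc1(self.label_emb))  # (n, feat_dim)
        out = feats @ w.t()                                  # (B, n)
        logits = out[:, :self.num_emotions]                  # 7-way emotion scores
        va = torch.tanh(out[:, self.num_emotions:])          # valence/arousal in [-1, 1]
        return logits, va

model = EmotionGCNSketch()
logits, va = model(torch.randn(4, 512))
print(logits.shape, va.shape)
```

Because both heads read from the same GCN-propagated embeddings, gradients from the valence-arousal regression also shape the emotion classifiers, which is the dependency-sharing mechanism the abstract attributes to the GCN.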

Results

Task | Dataset | Metric | Value | Model
Facial Recognition and Modelling | AffectNet | Accuracy (7 emotion) | 66.46 | Emotion-GCN
Face Reconstruction | AffectNet | Accuracy (7 emotion) | 66.46 | Emotion-GCN
Facial Expression Recognition (FER) | AffectNet | Accuracy (7 emotion) | 66.46 | Emotion-GCN
3D | AffectNet | Accuracy (7 emotion) | 66.46 | Emotion-GCN
3D Face Modelling | AffectNet | Accuracy (7 emotion) | 66.46 | Emotion-GCN
3D Face Reconstruction | AffectNet | Accuracy (7 emotion) | 66.46 | Emotion-GCN

Related Papers

Multimodal Prompt Alignment for Facial Expression Recognition (2025-06-26)
Enhancing Ambiguous Dynamic Facial Expression Recognition with Soft Label-based Data Augmentation (2025-06-25)
Using Vision Language Models to Detect Students' Academic Emotion through Facial Expressions (2025-06-12)
EfficientFER: EfficientNetv2 Based Deep Learning Approach for Facial Expression Recognition (2025-06-02)
TKFNet: Learning Texture Key Factor Driven Feature for Facial Expression Recognition (2025-05-15)
Unsupervised Multiview Contrastive Language-Image Joint Learning with Pseudo-Labeled Prompts Via Vision-Language Model for 3D/4D Facial Expression Recognition (2025-05-14)
Achieving 3D Attention via Triplet Squeeze and Excitation Block (2025-05-09)
Some Optimizers are More Equal: Understanding the Role of Optimizers in Group Fairness (2025-04-21)