Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


MM-DFN: Multimodal Dynamic Fusion Network for Emotion Recognition in Conversations

Dou Hu, Xiaolong Hou, Lingwei Wei, Lianxin Jiang, Yang Mo

Published: 2022-03-04 · Tasks: Emotion Recognition in Conversation, Emotion Recognition
Links: Paper · PDF · Code (official)

Abstract

Emotion Recognition in Conversations (ERC) has considerable prospects for developing empathetic machines. For multimodal ERC, it is vital to understand context and fuse modality information in conversations. Recent graph-based fusion methods generally aggregate multimodal information by exploring unimodal and cross-modal interactions in a graph. However, they accumulate redundant information at each layer, limiting the context understanding between modalities. In this paper, we propose a novel Multimodal Dynamic Fusion Network (MM-DFN) to recognize emotions by fully understanding multimodal conversational context. Specifically, we design a new graph-based dynamic fusion module to fuse multimodal contextual features in a conversation. The module reduces redundancy and enhances complementarity between modalities by capturing the dynamics of contextual information in different semantic spaces. Extensive experiments on two public benchmark datasets demonstrate the effectiveness and superiority of MM-DFN.
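To make the graph-based fusion idea concrete, here is a minimal illustrative sketch — not the authors' implementation — of one GCN-style fusion layer over a conversation graph. Utterances from three modalities (text, audio, video) become nodes; temporal edges connect consecutive utterances within a modality (unimodal context), and cross-modal edges connect the three views of the same utterance. All sizes, the adjacency construction, and the single-layer update are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

n_utt, dim = 4, 8       # 4 utterances, 8-dim features (illustrative sizes)
modalities = 3          # text, audio, video
n_nodes = n_utt * modalities

# Node features: one row per (modality, utterance) pair.
X = rng.standard_normal((n_nodes, dim))

# Adjacency with self-loops: temporal edges within each modality,
# plus cross-modal edges linking the same utterance across modalities.
A = np.eye(n_nodes)
for m in range(modalities):
    base = m * n_utt
    for t in range(n_utt - 1):          # unimodal temporal context
        A[base + t, base + t + 1] = A[base + t + 1, base + t] = 1.0
for t in range(n_utt):                  # cross-modal interactions
    idx = [m * n_utt + t for m in range(modalities)]
    for i in idx:
        for j in idx:
            A[i, j] = 1.0

# Symmetric normalization D^{-1/2} A D^{-1/2}, then one fusion layer:
# H = ReLU(A_norm X W), aggregating unimodal and cross-modal neighbors.
d = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(d, d))
W = rng.standard_normal((dim, dim)) * 0.1
H = np.maximum(A_norm @ X @ W, 0.0)

print(H.shape)  # (12, 8): one fused representation per modality-utterance node
```

MM-DFN's actual dynamic fusion module goes beyond this static layer by capturing contextual dynamics in different semantic spaces to reduce the redundancy that accumulates when such layers are stacked; the sketch only shows the baseline graph aggregation being improved upon.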

Results

| Task                | Dataset             | Metric      | Value | Model  |
|---------------------|---------------------|-------------|-------|--------|
| Emotion Recognition | CMU-MOSEI-Sentiment | Accuracy    | 45.29 | MM-DFN |
| Emotion Recognition | CMU-MOSEI-Sentiment | Weighted F1 | 42.98 | MM-DFN |
| Emotion Recognition | IEMOCAP-4           | Accuracy    | 80.91 | MM-DFN |
| Emotion Recognition | IEMOCAP-4           | Weighted F1 | 80.83 | MM-DFN |
| Emotion Recognition | MELD                | Accuracy    | 62.49 | MM-DFN |
| Emotion Recognition | MELD                | Weighted F1 | 59.46 | MM-DFN |
| Emotion Recognition | IEMOCAP             | Accuracy    | 68.21 | MM-DFN |
| Emotion Recognition | IEMOCAP             | Weighted F1 | 68.18 | MM-DFN |

Related Papers

- Long-Short Distance Graph Neural Networks and Improved Curriculum Learning for Emotion Recognition in Conversation (2025-07-21)
- Camera-based implicit mind reading by capturing higher-order semantic dynamics of human gaze within environmental context (2025-07-17)
- A Robust Incomplete Multimodal Low-Rank Adaptation Approach for Emotion Recognition (2025-07-15)
- Dynamic Parameter Memory: Temporary LoRA-Enhanced LLM for Long-Sequence Emotion Recognition in Conversation (2025-07-11)
- CAST-Phys: Contactless Affective States Through Physiological signals Database (2025-07-08)
- Exploring Remote Physiological Signal Measurement under Dynamic Lighting Conditions at Night: Dataset, Experiment, and Analysis (2025-07-06)
- How to Retrieve Examples in In-context Learning to Improve Conversational Emotion Recognition using Large Language Models? (2025-06-25)
- MATER: Multi-level Acoustic and Textual Emotion Representation for Interpretable Speech Emotion Recognition (2025-06-24)