Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


UniMSE: Towards Unified Multimodal Sentiment Analysis and Emotion Recognition

Guimin Hu, Ting-En Lin, Yi Zhao, Guangming Lu, Yuchuan Wu, Yongbin Li

2022-11-21

Tasks: Emotion Recognition in Conversation · Sentiment Analysis · Contrastive Learning · Multimodal Sentiment Analysis · Emotion Recognition

Links: Paper · PDF · Code (official)

Abstract

Multimodal sentiment analysis (MSA) and emotion recognition in conversation (ERC) are key research topics for computers to understand human behaviors. From a psychological perspective, emotions are the expression of affect or feelings during a short period, while sentiments are formed and held for a longer period. However, most existing works study sentiment and emotion separately and do not fully exploit the complementary knowledge behind the two. In this paper, we propose a multimodal sentiment knowledge-sharing framework (UniMSE) that unifies MSA and ERC tasks from features, labels, and models. We perform modality fusion at the syntactic and semantic levels and introduce contrastive learning between modalities and samples to better capture the difference and consistency between sentiments and emotions. Experiments on four public benchmark datasets, MOSI, MOSEI, MELD, and IEMOCAP, demonstrate the effectiveness of the proposed method and achieve consistent improvements compared with state-of-the-art methods.
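The contrastive objective mentioned in the abstract can be illustrated with a standard InfoNCE-style loss between paired embeddings from two modalities (e.g. text and audio for the same utterance). This is a minimal sketch of the general technique, not the paper's exact formulation; the function name and pure-Python style are illustrative assumptions.

```python
import math

def normalize(v):
    """L2-normalize a vector (list of floats)."""
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def info_nce(batch_a, batch_b, temperature=0.07):
    """InfoNCE contrastive loss between two modalities.

    batch_a[i] and batch_b[i] are embeddings of the same sample
    (a positive pair); every other row of batch_b is a negative.
    Returns the mean loss over the batch (symmetric terms omitted
    for brevity).
    """
    a = [normalize(v) for v in batch_a]
    b = [normalize(v) for v in batch_b]
    loss = 0.0
    for i, za in enumerate(a):
        # Cosine similarities against every candidate, scaled by temperature.
        sims = [sum(x * y for x, y in zip(za, zb)) / temperature for zb in b]
        # Numerically stable log-sum-exp over the candidates.
        m = max(sims)
        log_denom = m + math.log(sum(math.exp(s - m) for s in sims))
        # Negative log-softmax probability of the true (diagonal) pair.
        loss += log_denom - sims[i]
    return loss / len(a)
```

Pulling matched pairs together and pushing mismatched pairs apart in this way is one common mechanism for capturing the consistency and difference between modalities that the abstract refers to.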

Results

Task                | Dataset   | Metric      | Value | Model
Emotion Recognition | MELD      | Accuracy    | 65.09 | UniMSE
Emotion Recognition | MELD      | Weighted-F1 | 65.51 | UniMSE
Emotion Recognition | IEMOCAP   | Accuracy    | 70.56 | UniMSE
Emotion Recognition | IEMOCAP   | Weighted-F1 | 70.66 | UniMSE
Sentiment Analysis  | MOSI      | Accuracy    | 86.9  | UniMSE
Sentiment Analysis  | MOSI      | F1 score    | 86.42 | UniMSE
Sentiment Analysis  | CMU-MOSEI | Accuracy    | 87.5  | UniMSE
Sentiment Analysis  | CMU-MOSEI | F1          | 87.46 | UniMSE
Sentiment Analysis  | CMU-MOSEI | MAE         | 0.523 | UniMSE
Sentiment Analysis  | CMU-MOSI  | Acc-2       | 86.9  | UniMSE
Sentiment Analysis  | CMU-MOSI  | Acc-7       | 48.68 | UniMSE
Sentiment Analysis  | CMU-MOSI  | Corr        | 0.809 | UniMSE
Sentiment Analysis  | CMU-MOSI  | F1          | 86.42 | UniMSE
Sentiment Analysis  | CMU-MOSI  | MAE         | 0.691 | UniMSE

Related Papers

Long-Short Distance Graph Neural Networks and Improved Curriculum Learning for Emotion Recognition in Conversation (2025-07-21)
AdaptiSent: Context-Aware Adaptive Attention for Multimodal Aspect-Based Sentiment Analysis (2025-07-17)
SemCSE: Semantic Contrastive Sentence Embeddings Using LLM-Generated Summaries For Scientific Abstracts (2025-07-17)
HapticCap: A Multimodal Dataset and Task for Understanding User Experience of Vibration Haptic Signals (2025-07-17)
Overview of the TalentCLEF 2025: Skill and Job Title Intelligence for Human Capital Management (2025-07-17)
SGCL: Unifying Self-Supervised and Supervised Learning for Graph Recommendation (2025-07-17)
Camera-based implicit mind reading by capturing higher-order semantic dynamics of human gaze within environmental context (2025-07-17)
Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)