Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


EmoVerse: Exploring Multimodal Large Language Models for Sentiment and Emotion Understanding

Ao Li, Longwei Xu, Chen Ling, Jinghui Zhang, Pengwei Wang

2024-12-11 · Depression Detection · Sentiment Analysis · Emotion-Cause Pair Extraction · Multimodal Emotion Recognition · Facial Expression Recognition · Multimodal Sentiment Analysis · Emotion Recognition
Paper · PDF · Code (official)

Abstract

Sentiment and emotion understanding are essential to applications such as human-computer interaction and depression detection. While Multimodal Large Language Models (MLLMs) demonstrate robust general capabilities, they face considerable challenges in affective computing, particularly in detecting subtle facial expressions and in handling complex emotion-related tasks such as emotion reason inference and understanding emotions in long-context scenarios. Furthermore, no unified MLLM exists that can effectively handle both sentiment- and emotion-related tasks. To address these challenges, we explore multi-task training strategies for MLLMs in affective computing and introduce Emotion Universe (EmoVerse), an MLLM designed to handle a broad spectrum of sentiment- and emotion-related tasks. EmoVerse can also deeply analyze the underlying causes of emotional states. We further introduce the Affective Multitask (AMT) Dataset, which supports multimodal sentiment analysis, multimodal emotion recognition, facial expression recognition, emotion reason inference, and emotion-cause pair extraction. Extensive experiments demonstrate that EmoVerse outperforms existing methods, achieving state-of-the-art results on sentiment- and emotion-related tasks. The code is available at https://github.com/liaolea/EmoVerse.
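The abstract's key design point is that one MLLM is instruction-tuned jointly across all five AMT task types rather than training a separate model per task. Below is a minimal Python sketch of what such a unified multi-task instruction format could look like. The task keys, prompt strings, field names, and the `build_sample` helper are all illustrative assumptions, not the authors' actual data schema; see the linked GitHub repository for the real implementation.

```python
# Hypothetical sketch: casting the five AMT tasks into one chat-style
# instruction format, so a single MLLM can be trained on all of them jointly.
# All names and prompts here are assumptions for illustration only.

from dataclasses import dataclass

# One instruction template per task; sharing a single format is what lets
# the model learn every task through the same next-token objective.
TASK_PROMPTS = {
    "msa": "What is the sentiment (positive/neutral/negative) of this clip?",
    "mer": "Which emotion does the speaker express?",
    "fer": "Which facial expression is shown in the image?",
    "eri": "Explain the likely reason for the speaker's emotion.",
    "ecpe": "List the (emotion, cause) clause pairs in this conversation.",
}

@dataclass
class Sample:
    task: str          # task identifier, e.g. "msa"
    media: list[str]   # paths to images / video frames / audio
    context: str       # transcript or dialogue text
    answer: str        # gold label or free-form explanation

def build_sample(s: Sample) -> dict:
    """Convert a raw sample into a chat-style training record."""
    user_turn = f"{TASK_PROMPTS[s.task]}\nContext: {s.context}"
    return {
        "media": s.media,
        "messages": [
            {"role": "user", "content": user_turn},
            {"role": "assistant", "content": s.answer},
        ],
    }

# Example: a multimodal emotion-recognition sample.
record = build_sample(Sample(
    task="mer",
    media=["clips/dialog_042.mp4"],
    context="A: I can't believe you forgot again.",
    answer="anger",
))
print(record["messages"][0]["content"])
```

Under this kind of scheme, long-context tasks such as emotion reason inference and emotion-cause pair extraction differ from classification tasks only in the length and form of the assistant answer, which is what makes joint multi-task training tractable for a single model.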

Related Papers

Long-Short Distance Graph Neural Networks and Improved Curriculum Learning for Emotion Recognition in Conversation (2025-07-21)
AdaptiSent: Context-Aware Adaptive Attention for Multimodal Aspect-Based Sentiment Analysis (2025-07-17)
Camera-based implicit mind reading by capturing higher-order semantic dynamics of human gaze within environmental context (2025-07-17)
AI Wizards at CheckThat! 2025: Enhancing Transformer-Based Embeddings with Sentiment for Subjectivity Detection in News Articles (2025-07-15)
DCR: Quantifying Data Contamination in LLMs Evaluation (2025-07-15)
A Robust Incomplete Multimodal Low-Rank Adaptation Approach for Emotion Recognition (2025-07-15)
SentiDrop: A Multi Modal Machine Learning model for Predicting Dropout in Distance Learning (2025-07-14)
Dynamic Parameter Memory: Temporary LoRA-Enhanced LLM for Long-Sequence Emotion Recognition in Conversation (2025-07-11)