COSMIC: COmmonSense knowledge for eMotion Identification in Conversations

Deepanway Ghosal, Navonil Majumder, Alexander Gelbukh, Rada Mihalcea, Soujanya Poria
In this paper, we address the task of utterance-level emotion recognition in conversations using commonsense knowledge. We propose COSMIC, a new framework that incorporates different elements of commonsense such as mental states, events, and causal relations, and builds upon them to learn interactions between the interlocutors participating in a conversation. Current state-of-the-art methods often encounter difficulties in context propagation, emotion shift detection, and differentiating between related emotion classes. By learning distinct commonsense representations, COSMIC addresses these challenges and achieves new state-of-the-art results for emotion recognition on four different benchmark conversational datasets. Our code is available at https://github.com/declare-lab/conv-emotion.
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Emotion Recognition | EmoryNLP | Weighted-F1 | 38.11 | COSMIC |
| Emotion Recognition | MELD | Weighted-F1 | 65.21 | COSMIC |
| Emotion Recognition | DailyDialog | Macro-F1 | 51.05 | COSMIC |
| Emotion Recognition | DailyDialog | Micro-F1 | 58.48 | COSMIC |
| Emotion Recognition | IEMOCAP | Weighted-F1 | 65.3 | COSMIC |
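The table mixes three F1 variants, which weight per-class scores differently. As a minimal, stdlib-only sketch of how they are computed (the emotion labels here are illustrative, not the datasets' full label sets):

```python
from collections import Counter

def f1_report(y_true, y_pred, labels):
    """Per-class F1 plus macro-, micro-, and weighted-F1 averages."""
    support = Counter(y_true)                 # gold-label counts per class
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1                        # predicted p, but gold was t
            fn[t] += 1
    f1 = {}
    for c in labels:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1[c] = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    macro = sum(f1.values()) / len(labels)                          # unweighted mean
    weighted = sum(f1[c] * support[c] for c in labels) / len(y_true)  # support-weighted
    micro = sum(tp.values()) / len(y_true)    # equals accuracy for single-label tasks
    return f1, macro, micro, weighted

# Toy example with illustrative emotion labels
gold = ["joy", "joy", "anger", "sad"]
pred = ["joy", "anger", "anger", "sad"]
per_class, macro, micro, weighted = f1_report(gold, pred, ["joy", "anger", "sad"])
```

Weighted-F1 (used for IEMOCAP, MELD, and EmoryNLP above) lets frequent emotion classes dominate the average, whereas macro-F1 treats all classes equally; in the DailyDialog literature, the dominant neutral class is typically excluded from the reported averages.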