Multimodal Emotion Recognition

53 papers with code • 3 benchmarks • 9 datasets

This is a leaderboard for multimodal emotion recognition on the IEMOCAP dataset. The modality abbreviations are A: acoustic, T: text, V: visual.

Please include the modalities in brackets after the model name.

All models must use the standard five emotion categories and are evaluated under the standard leave-one-session-out (LOSO) protocol. See the papers for references.
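The LOSO protocol above can be sketched as follows. IEMOCAP contains five recorded sessions; each fold trains on four sessions and tests on the held-out fifth. The `(session_id, features, label)` sample layout is an illustrative assumption, not a fixed API.

```python
def loso_splits(samples):
    """Yield (held_out_session, train, test) leave-one-session-out folds.

    samples: iterable of tuples whose first element is the session id
    (an assumed layout for illustration, e.g. (session_id, features, label)).
    """
    sessions = sorted({s[0] for s in samples})
    for held_out in sessions:
        # Train on every other session, test on the held-out one.
        train = [s for s in samples if s[0] != held_out]
        test = [s for s in samples if s[0] == held_out]
        yield held_out, train, test
```

Final scores are then averaged (weighted or unweighted accuracy, depending on the paper) across the five folds.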

Libraries

Use these libraries to find Multimodal Emotion Recognition models and implementations

Cooperative Sentiment Agents for Multimodal Sentiment Analysis

smwanghhh/co-sa 19 Apr 2024

In this paper, we propose a new Multimodal Representation Learning (MRL) method for Multimodal Sentiment Analysis (MSA), named Co-SA, which facilitates adaptive interaction between modalities through Cooperative Sentiment Agents.

MIPS at SemEval-2024 Task 3: Multimodal Emotion-Cause Pair Extraction in Conversations with Multimodal Language Models

mips-colt/mer-mce 31 Mar 2024

This paper presents our winning submission to Subtask 2 of SemEval 2024 Task 3 on multimodal emotion cause analysis in conversations.

Recursive Joint Cross-Modal Attention for Multimodal Fusion in Dimensional Emotion Recognition

praveena2j/rjcma 20 Mar 2024

In particular, we compute the attention weights based on cross-correlation between the joint audio-visual-text feature representations and the feature representations of individual modalities to simultaneously capture intra- and intermodal relationships across the modalities.
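A minimal sketch of the cross-correlation attention described above: scores are computed between a joint audio-visual-text representation and each modality's own features, then used to re-weight that modality. The concatenation-based joint representation and the random projection standing in for a learned one are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def joint_cross_modal_attention(feats):
    """feats: dict mapping modality name -> (T, d) feature matrix."""
    # Joint audio-visual-text representation: concatenate along the feature axis.
    joint = np.concatenate(list(feats.values()), axis=-1)      # (T, D)
    attended = {}
    for name, x in feats.items():                               # x: (T, d)
        # Random stand-in for a learned projection of the joint representation.
        w = rng.standard_normal((joint.shape[-1], x.shape[-1])) * 0.1
        # Cross-correlation between joint and per-modality features -> (T, T).
        scores = (joint @ w) @ x.T / np.sqrt(x.shape[-1])
        attn = softmax(scores, axis=-1)                         # rows sum to 1
        attended[name] = attn @ x                               # re-weighted features
    return attended
```

Because the scores mix the joint representation with each modality, the re-weighting captures both intra-modal (within `x`) and inter-modal (via `joint`) relationships.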

Joint Multimodal Transformer for Emotion Recognition in the Wild

PoloWlg/Joint-Multimodal-Transformer-6th-ABAW 15 Mar 2024

Multimodal emotion recognition (MMER) systems typically outperform unimodal systems by leveraging the inter- and intra-modal relationships between, e.g., visual, textual, physiological, and auditory modalities.

Curriculum Learning Meets Directed Acyclic Graph for Multimodal Emotion Recognition

vanntc711/multidag-cl 27 Feb 2024

Emotion recognition in conversation (ERC) is a crucial task in natural language processing and affective computing.

Modality-Collaborative Transformer with Hybrid Feature Reconstruction for Robust Emotion Recognition

zxpoqas123/MCT-HFR 26 Dec 2023

As a vital aspect of affective computing, Multimodal Emotion Recognition has been an active research area in the multimedia community.

GPT-4V with Emotion: A Zero-shot Benchmark for Generalized Emotion Recognition

zeroqiaoba/gpt4v-emotion 7 Dec 2023

To bridge this gap, we present the quantitative evaluation results of GPT-4V on 21 benchmark datasets covering 6 tasks: visual sentiment analysis, tweet sentiment analysis, micro-expression recognition, facial emotion recognition, dynamic facial emotion recognition, and multimodal emotion recognition.

eMotions: A Large-Scale Dataset for Emotion Recognition in Short Videos

xuecwu/emotions 29 Nov 2023

The prevailing use of short videos (SVs) to convey emotions makes emotion recognition in SVs a necessity.

Conversation Understanding using Relational Temporal Graph Neural Networks with Auxiliary Cross-Modality Interaction

leson502/CORECT_EMNLP2023 8 Nov 2023

Emotion recognition is a crucial task for human conversation understanding.

A Transformer-Based Model With Self-Distillation for Multimodal Emotion Recognition in Conversations

butterfliesss/sdt 31 Oct 2023

Emotion recognition in conversations (ERC), the task of recognizing the emotion of each utterance in a conversation, is crucial for building empathetic machines.
