Multimodal Emotion Recognition

48 papers with code • 3 benchmarks • 8 datasets

This is a leaderboard for multimodal emotion recognition on the IEMOCAP dataset. The modality abbreviations are A: Acoustic, T: Text, and V: Visual.

Please include the modalities in brackets after the model name.

All models use the standard five emotion categories and are evaluated with the standard leave-one-session-out (LOSO) protocol. See the individual papers for references.
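
For concreteness, the snippet below sketches what a leave-one-session-out loop over IEMOCAP's five sessions looks like. It is a minimal illustration, not any benchmarked system: the feature, label, and session arrays are random placeholders, and a plain linear classifier stands in for a real multimodal model.

```python
# Minimal LOSO sketch: hold out one IEMOCAP session at a time.
# All data here is a random placeholder (hypothetical shapes).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 64))    # per-utterance fused features
labels = rng.integers(0, 5, size=500)    # five emotion categories
sessions = rng.integers(1, 6, size=500)  # IEMOCAP session ids 1..5

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(features, labels, groups=sessions):
    clf = LogisticRegression(max_iter=1000).fit(features[train_idx], labels[train_idx])
    scores.append(clf.score(features[test_idx], labels[test_idx]))

print(f"mean LOSO accuracy: {np.mean(scores):.3f}")  # reported metrics vary by paper
```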

Most implemented papers

Multimodal Sentiment Analysis using Hierarchical Fusion with Context Modeling

SenticNet/hfusion 16 Jun 2018

Multimodal sentiment analysis is a rapidly growing field of research.
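
As a rough illustration of the hierarchical fusion idea named in the title (a sketch under assumed feature sizes, not the authors' released SenticNet/hfusion code), bimodal pairs can be fused first and the pairwise representations combined afterwards:

```python
# Hierarchical fusion sketch (PyTorch): fuse modality pairs, then fuse the pairs.
import torch
import torch.nn as nn

class HierarchicalFusion(nn.Module):
    def __init__(self, d_a=74, d_t=300, d_v=35, d_h=128, n_classes=5):
        super().__init__()
        # level 0: project each modality into a shared space
        self.proj_a = nn.Linear(d_a, d_h)
        self.proj_t = nn.Linear(d_t, d_h)
        self.proj_v = nn.Linear(d_v, d_h)
        # level 1: one shared fuser for each bimodal pair (a simplification)
        self.pair = nn.Linear(2 * d_h, d_h)
        # level 2: combine the three pairwise vectors into a trimodal one
        self.tri = nn.Linear(3 * d_h, d_h)
        self.out = nn.Linear(d_h, n_classes)

    def forward(self, a, t, v):
        ha, ht, hv = self.proj_a(a), self.proj_t(t), self.proj_v(v)
        at = torch.tanh(self.pair(torch.cat([ha, ht], -1)))
        av = torch.tanh(self.pair(torch.cat([ha, hv], -1)))
        tv = torch.tanh(self.pair(torch.cat([ht, hv], -1)))
        return self.out(torch.tanh(self.tri(torch.cat([at, av, tv], -1))))
```

The paper also models conversational context across utterances, which this per-utterance sketch omits.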

Investigation of Multimodal Features, Classifiers and Fusion Methods for Emotion Recognition

zeroQiaoba/EmotiW2018 13 Sep 2018

We test our method in the EmotiW 2018 challenge and obtain promising results.

Music Mood Detection Based On Audio And Lyrics With Deep Neural Net

Dohppak/Music-Emotion-Recognition-Classification 19 Sep 2018

We consider the task of multimodal music mood prediction based on the audio signal and the lyrics of a track.

ICON: Interactive Conversational Memory Network for Multimodal Emotion Detection

SenticNet/conv-emotion EMNLP 2018

Emotion recognition in conversations is crucial for building empathetic machines.

Multimodal Emotion Recognition Using Deep Canonical Correlation Analysis

csliuwei/MI_plot 13 Aug 2019

We evaluate the performance of DCCA on five multimodal datasets: the SEED, SEED-IV, SEED-V, DEAP, and DREAMER datasets.
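
The core of DCCA is a correlation objective between two nonlinearly encoded views. Below is a compact sketch of that objective; the encoders, regularisation constant, and variable names are assumptions, but the whitened cross-covariance construction is the standard CCA formulation.

```python
# CCA correlation objective (to be maximised) between two modality embeddings.
import torch

def cca_correlation(h1, h2, eps=1e-4):
    # h1, h2: (batch, dim) outputs of two modality encoders (not shown)
    n = h1.size(0)
    h1 = h1 - h1.mean(0, keepdim=True)
    h2 = h2 - h2.mean(0, keepdim=True)
    s11 = h1.T @ h1 / (n - 1) + eps * torch.eye(h1.size(1))
    s22 = h2.T @ h2 / (n - 1) + eps * torch.eye(h2.size(1))
    s12 = h1.T @ h2 / (n - 1)

    def inv_sqrt(s):
        vals, vecs = torch.linalg.eigh(s)
        return vecs @ torch.diag(vals.clamp_min(eps).rsqrt()) @ vecs.T

    # T = S11^{-1/2} S12 S22^{-1/2}; its singular values are the canonical
    # correlations, so their sum is the total correlation to maximise.
    t = inv_sqrt(s11) @ s12 @ inv_sqrt(s22)
    return torch.linalg.svdvals(t).sum()
```

Training minimises the negative of this value, pushing the two encoders toward maximally correlated representations.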

Learning Alignment for Multimodal Emotion Recognition from Speech

ZhiqiWang12-hash/text_audio_classification 6 Sep 2019

Further, although emotion recognition benefits from audio-textual multimodal information, it is not trivial to build a system that learns from multiple modalities.

Multimodal Behavioral Markers Exploring Suicidal Intent in Social Media Videos

ankitshah009/icmi_19_suicidal_intent_detection International Conference on Multimodal Interaction 2019

In this work, we set out to study multimodal behavioral markers related to suicidal intent when expressed on social media videos.

Attentive Modality Hopping Mechanism for Speech Emotion Recognition

david-yoon/attentive-modality-hopping-for-SER 29 Nov 2019

In this work, we explore the impact of visual modality in addition to speech and text for improving the accuracy of the emotion detection system.
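
A rough sketch of the hopping idea follows (a simplified reading under assumed equal hidden sizes, not the authors' released code): a context vector distilled from one modality attends over the next modality's sequence, and the attended summary becomes the context for the following hop.

```python
# One attention "hop": context from one modality attends over another.
import torch
import torch.nn.functional as F

def attention_hop(context, sequence):
    # context: (batch, d); sequence: (batch, steps, d)
    scores = torch.einsum("bd,bsd->bs", context, sequence)
    weights = F.softmax(scores, dim=-1)
    return torch.einsum("bs,bsd->bd", weights, sequence)

def modality_hopping(audio, text, video, n_hops=3):
    # hop cyclically across the three modality sequences
    context = audio.mean(dim=1)  # initialise from the speech modality
    for seq in [text, video, audio] * n_hops:
        context = attention_hop(context, seq)
    return context  # final multimodal vector for emotion classification
```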

Multilogue-Net: A Context Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation

amanshenoy/multilogue-net arXiv preprint 2020

Sentiment analysis and emotion detection in conversation are key to several real-world applications, and the increasing number of available modalities aids a better understanding of the underlying emotions.