Emotion Recognition

465 papers with code • 7 benchmarks • 45 datasets

Emotion recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Most implemented papers

EmoTxt: A Toolkit for Emotion Recognition from Text

collab-uniba/Emotion_and_Polarity_SO 13 Aug 2017

We provide empirical evidence of the performance of EmoTxt.

Multi-attention Recurrent Network for Human Communication Comprehension

pliang279/MFN 3 Feb 2018

AI must understand each modality and the interactions between them that shape human communication.

Multi-Modal Emotion Recognition on IEMOCAP Dataset Using Deep Learning

Samarth-Tripathi/IEMOCAP-Emotion-Detection 16 Apr 2018

Emotion recognition has become an important field of research in human-computer interaction as we improve the techniques for modelling the various aspects of behaviour.

Multimodal Utterance-level Affect Analysis using Visual, Audio and Text Features

toxtli/AutomEditor 2 May 2018

The integration of information across multiple modalities and across time is a promising way to enhance the emotion recognition performance of affective systems.
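The idea of integrating information across modalities can be sketched with a minimal feature-level (early) fusion example. This is an illustrative sketch, not the paper's actual architecture: the feature dimensions, class set, and random weights below are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pooled per-utterance features from three modalities
# (dimensions are illustrative, not taken from any specific paper).
visual = rng.standard_normal(64)  # e.g. facial-expression features
audio = rng.standard_normal(32)   # e.g. acoustic features
text = rng.standard_normal(16)    # e.g. lexical features

# Early fusion: concatenate the modality vectors into one
# utterance-level representation.
fused = np.concatenate([visual, audio, text])

# A linear classifier over the fused vector scores each emotion class
# (weights are random here purely for illustration).
num_classes = 4  # e.g. angry, happy, sad, neutral
W = rng.standard_normal((num_classes, fused.shape[0]))
logits = W @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()
predicted = int(np.argmax(probs))
```

In practice the cited systems learn the fusion and per-modality encoders jointly, and may also model interactions across time; concatenation is just the simplest baseline for combining modalities.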

Classifying and Visualizing Emotions with Emotional DAN

IvonaTau/emotionaldan 23 Oct 2018

Classification of human emotions remains an important and challenging task for many computer vision algorithms, especially in the era of humanoid robots which coexist with humans in their everyday life.

Where is Your Evidence: Improving Fact-checking by Justification Modeling

Tariq60/LIAR-PLUS WS 2018

Fact-checking is a journalistic practice that compares a claim made publicly against trusted sources of facts.

Hide-and-Seek: A Data Augmentation Technique for Weakly-Supervised Localization and Beyond

kkanshul/Hide-and-Seek 6 Nov 2018

Our approach only needs to modify the input image and can work with any network to improve its performance.
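The input-only modification described above can be sketched as a patch-hiding augmentation: split the image into a grid and blank out each cell with some probability, forcing the network to rely on more than the most discriminative region. A minimal sketch, assuming a single-channel image; the grid size, hide probability, and fill value are illustrative (the paper fills hidden patches with the dataset mean pixel).

```python
import numpy as np

def hide_patches(image, grid_size=4, hide_prob=0.5, fill_value=0.0, rng=None):
    """Randomly hide grid cells of an image, Hide-and-Seek style.

    Each cell of a grid_size x grid_size grid is replaced by fill_value
    with probability hide_prob. fill_value=0.0 is a placeholder; the
    paper uses the dataset mean pixel so statistics stay consistent.
    """
    rng = rng or np.random.default_rng()
    out = image.copy()
    h, w = image.shape[:2]
    ph, pw = h // grid_size, w // grid_size
    for i in range(grid_size):
        for j in range(grid_size):
            if rng.random() < hide_prob:
                out[i * ph:(i + 1) * ph, j * pw:(j + 1) * pw] = fill_value
    return out

# Toy 32x32 all-ones "image": hidden patches become zero blocks.
img = np.ones((32, 32), dtype=np.float32)
aug = hide_patches(img, grid_size=4, hide_prob=0.5,
                   rng=np.random.default_rng(0))
```

Because only the input changes, the augmented image can be fed to any existing network during training without modifying its architecture or loss.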

A Compact Embedding for Facial Expression Similarity

AmirSh15/FECNet CVPR 2019

Most of the existing work on automatic facial expression analysis focuses on discrete emotion recognition, or facial action unit detection.

Speech Emotion Recognition Using Multi-hop Attention Mechanism

raulsteleac/Speech_Emotion_Recognition ICASSP 2019

As opposed to using knowledge from both the modalities separately, we propose a framework to exploit acoustic information in tandem with lexical data.
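The notion of exploiting acoustic information in tandem with lexical data can be illustrated with a bare-bones multi-hop dot-product attention sketch. This is a toy under stated assumptions, not the paper's model: the sequence lengths, dimensions, and two-hop scheme below are hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical encoded inputs (shapes are illustrative).
audio_frames = rng.standard_normal((50, 8))  # 50 acoustic frames, dim 8
text_summary = rng.standard_normal(8)        # pooled lexical vector

def attend(query, keys):
    """One dot-product attention hop: summarize `keys` under `query`."""
    scores = keys @ query
    weights = np.exp(scores - scores.max())  # softmax over frames
    weights /= weights.sum()
    return weights @ keys, weights

# Hop 1: the lexical summary attends over the acoustic frames.
audio_context, w1 = attend(text_summary, audio_frames)
# Hop 2: the resulting acoustic context re-attends over the frames,
# letting the two modalities refine each other before classification.
refined, w2 = attend(audio_context, audio_frames)
```

The point of multiple hops is that each pass conditions the attention on information gathered from the other modality in the previous pass, rather than processing the modalities separately and merging only at the end.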

Remote Photoplethysmograph Signal Measurement from Facial Videos Using Spatio-Temporal Networks

terbed/Deep-rPPG 7 May 2019

Recent studies demonstrated that the average heart rate (HR) can be measured from facial videos based on non-contact remote photoplethysmography (rPPG).
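The core idea behind recovering heart rate from an rPPG trace can be sketched with a synthetic example: once a pulse signal is extracted from the video (which is what the spatio-temporal networks learn to do), HR is the dominant frequency in the physiologically plausible band. The frame rate, noise level, and frequency band below are illustrative assumptions; the signal here is synthetic, not extracted from real video.

```python
import numpy as np

fs = 30.0                      # frames/second of a hypothetical video
t = np.arange(0, 10, 1 / fs)   # 10 seconds of samples
true_hr_hz = 1.2               # 72 beats per minute

# Toy rPPG trace: a periodic pulse component plus noise (a real
# pipeline would extract this from skin-pixel color changes).
noise = 0.3 * np.random.default_rng(0).standard_normal(t.size)
signal = np.sin(2 * np.pi * true_hr_hz * t) + noise

# Estimate HR as the spectral peak within a plausible 0.7-4 Hz band
# (roughly 42-240 bpm).
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
band = (freqs >= 0.7) & (freqs <= 4.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
bpm = peak_hz * 60.0
```

The hard part that the cited deep networks address is producing a clean pulse signal from facial video under motion and lighting changes; the frequency-domain readout itself is straightforward.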