
Emotion Recognition

107 papers with code · Computer Vision

Emotion Recognition is an important area of research for enabling effective human-computer interaction. Human emotions can be detected from speech signals, facial expressions, body language, and electroencephalography (EEG). Source: Using Deep Autoencoders for Facial Expression Recognition

Greatest papers with code

COSMIC: COmmonSense knowledge for eMotion Identification in Conversations

6 Oct 2020 declare-lab/conv-emotion

In this paper, we address the task of utterance level emotion recognition in conversations using commonsense knowledge.

EMOTION RECOGNITION IN CONVERSATION

Conversational Transfer Learning for Emotion Recognition

11 Oct 2019 SenticNet/conv-emotion

We propose an approach, TL-ERC, where we pre-train a hierarchical dialogue model on multi-turn conversations (source) and then transfer its parameters to a conversational emotion classifier (target).

EMOTION RECOGNITION IN CONVERSATION · TRANSFER LEARNING
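The TL-ERC transfer step described above can be sketched in plain Python: parameters from the pre-trained source dialogue model are copied into the target emotion classifier wherever layer names and shapes match, while target-only layers (such as a new classification head) keep their fresh initialization. The function and layer names below are illustrative assumptions, not the authors' actual code.

```python
def transfer_parameters(source_params, target_params):
    """Copy matching parameters from a pre-trained source model into a
    target model in place; return the names of the transferred layers."""
    transferred = []
    for name, weights in target_params.items():
        src = source_params.get(name)
        # Transfer only when the layer exists in the source with the same size.
        if src is not None and len(src) == len(weights):
            target_params[name] = list(src)  # copy values, don't alias
            transferred.append(name)
    return transferred

# Toy example: the shared utterance encoder transfers from the source
# dialogue model; the new emotion head has no source counterpart.
source = {"utterance_encoder": [0.1, 0.2, 0.3]}
target = {"utterance_encoder": [0.0, 0.0, 0.0],
          "emotion_head": [0.5, 0.5]}
moved = transfer_parameters(source, target)
# moved → ["utterance_encoder"]; emotion_head stays untouched.
```

In a real PyTorch implementation the same effect is commonly achieved by loading a filtered `state_dict` with `strict=False`, so missing target-only layers are simply skipped.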

DialogueGCN: A Graph Convolutional Neural Network for Emotion Recognition in Conversation

IJCNLP 2019 SenticNet/conv-emotion

Emotion recognition in conversation (ERC) has lately received much attention from researchers due to its potential widespread applications in diverse areas such as healthcare, education, and human resources.

EMOTION CLASSIFICATION · EMOTION RECOGNITION IN CONVERSATION

Emotion Recognition in Conversation: Research Challenges, Datasets, and Recent Advances

8 May 2019 SenticNet/conv-emotion

Emotion is intrinsic to humans and consequently emotion understanding is a key part of human-like artificial intelligence (AI).

EMOTION RECOGNITION IN CONVERSATION

ExpNet: Landmark-Free, Deep, 3D Facial Expressions

2 Feb 2018 fengju514/Expression-Net

Our ExpNet CNN is applied directly to the intensities of a face image and regresses a 29D vector of 3D expression coefficients.

Ranked #1 on 3D Facial Expression Recognition on 2017_test set (using extra training data)

3D FACIAL EXPRESSION RECOGNITION · EMOTION RECOGNITION · FACIAL LANDMARK DETECTION
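The regression idea behind ExpNet — mapping raw pixel intensities of a face crop directly to a 29-D vector of 3D expression coefficients, with no landmark detection step — can be sketched as follows. This is a minimal illustration only: a single linear layer stands in for the paper's deep CNN, and all shapes and names here are assumptions for the sketch.

```python
import numpy as np

def regress_expression(image, weights, bias):
    """Map an HxW grayscale intensity image to 29 expression coefficients."""
    x = image.reshape(-1)        # operate directly on raw intensities
    return weights @ x + bias    # 29-D regression output

rng = np.random.default_rng(0)
image = rng.random((96, 96))                         # toy face crop
weights = rng.standard_normal((29, 96 * 96)) * 0.01  # stand-in for the CNN
bias = np.zeros(29)

coeffs = regress_expression(image, weights, bias)    # shape (29,)
```

The returned 29-D vector plays the role of the expression coefficients of a 3D face model; in the actual system those coefficients deform a 3D morphable model rather than being read off directly.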

Words Can Shift: Dynamically Adjusting Word Representations Using Nonverbal Behaviors

23 Nov 2018 A2Zadeh/CMU-MultimodalDataSDK

Humans convey their intentions through the usage of both verbal and nonverbal behaviors during face-to-face communication.

EMOTION RECOGNITION · MULTIMODAL SENTIMENT ANALYSIS

MELD: A Multimodal Multi-Party Dataset for Emotion Recognition in Conversations

ACL 2019 SenticNet/MELD

We propose several strong multimodal baselines and show the importance of contextual and multimodal information for emotion recognition in conversations.

DIALOGUE GENERATION · EMOTION RECOGNITION IN CONVERSATION