An Interaction-aware Attention Network for Speech Emotion Recognition in Spoken Dialogs

ICASSP 2019  ·  Sung-Lin Yeh, Yun-Shao Lin, Chi-Chun Lee ·

In this work, we propose an interaction-aware attention network (IAAN) that incorporates contextual information into the learned vocal representation through a novel attention mechanism. Our proposed method achieves 66.3% accuracy (7.9% over baseline methods) in four-class emotion recognition, which is the current state-of-the-art recognition rate on the benchmark IEMOCAP database.
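The core idea, attending over the current speaker's utterance using a context vector derived from the interlocutor's previous turn, can be sketched as follows. This is a minimal illustration, not the paper's exact architecture: the function name, the bilinear scoring form, and all shapes here are assumptions for demonstration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def interaction_aware_attention(current_frames, partner_context, W):
    """Hypothetical sketch of interaction-aware attention.

    Scores each frame of the current utterance against a vector
    summarizing the interlocutor's previous turn, then pools the
    frames with the resulting attention weights.

    current_frames: (T, d) frame-level acoustic features
    partner_context: (d,) summary of the interlocutor's prior turn
    W: (d, d) learned bilinear interaction matrix (here: identity)
    """
    scores = current_frames @ W @ partner_context   # (T,) interaction scores
    alpha = softmax(scores)                          # attention weights, sum to 1
    return alpha @ current_frames                    # (d,) attended representation

# Toy usage with random features
rng = np.random.default_rng(0)
T, d = 5, 4
rep = interaction_aware_attention(rng.normal(size=(T, d)),
                                  rng.normal(size=d),
                                  np.eye(d))
print(rep.shape)  # (4,)
```

The bilinear score lets the interlocutor's state modulate which frames of the current utterance dominate the pooled representation, which is the intuition behind conditioning emotion recognition on dialog context.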


Datasets

IEMOCAP
