Dialogue Act Classification with Context-Aware Self-Attention

NAACL 2019 · Vipul Raheja, Joel Tetreault

Recent work in Dialogue Act classification has treated the task as a sequence labeling problem using hierarchical deep neural networks. We build on this prior work by leveraging the effectiveness of a context-aware self-attention mechanism coupled with a hierarchical recurrent neural network. We conduct extensive evaluations on standard Dialogue Act classification datasets and show significant improvement over state-of-the-art results on the Switchboard Dialogue Act (SwDA) Corpus. We also investigate the impact of different utterance-level representation learning methods and show that our method is effective at capturing utterance-level semantic text representations while maintaining high accuracy.
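The listing does not include code, but the abstract describes a hierarchical setup: a word-level recurrent encoder with self-attention pooling produces utterance vectors, and a conversation-level recurrent layer labels the utterance sequence with dialogue acts. The following is a minimal PyTorch sketch of that general architecture, not the authors' implementation; the class names, hyperparameters, and the simple additive attention used here are illustrative assumptions, and the paper's context-aware self-attention mechanism is more elaborate than this plain pooling.

```python
# Minimal sketch (not the authors' code) of a hierarchical encoder for
# dialogue act classification: a word-level Bi-GRU with self-attention
# pooling builds each utterance vector, and a conversation-level Bi-GRU
# tags the utterance sequence. All names and sizes are illustrative.
import torch
import torch.nn as nn


class UtteranceEncoder(nn.Module):
    """Word-level Bi-GRU followed by additive self-attention pooling."""
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.rnn = nn.GRU(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)

    def forward(self, tokens):                  # tokens: (num_utts, max_words)
        h, _ = self.rnn(self.embed(tokens))     # (num_utts, max_words, 2*hidden)
        scores = torch.softmax(self.attn(h), dim=1)   # attention over words
        return (scores * h).sum(dim=1)          # (num_utts, 2*hidden)


class DialogueActTagger(nn.Module):
    """Conversation-level Bi-GRU over utterance vectors; one DA tag per utterance."""
    def __init__(self, vocab_size, num_tags, hidden_dim=128):
        super().__init__()
        self.utt_encoder = UtteranceEncoder(vocab_size, hidden_dim=hidden_dim)
        self.conv_rnn = nn.GRU(2 * hidden_dim, hidden_dim,
                               bidirectional=True, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, tokens):                  # tokens: (num_utts, max_words)
        utt_vecs = self.utt_encoder(tokens).unsqueeze(0)  # (1, num_utts, 2*hidden)
        ctx, _ = self.conv_rnn(utt_vecs)        # conversation-level context
        return self.classifier(ctx).squeeze(0)  # (num_utts, num_tags)


if __name__ == "__main__":
    model = DialogueActTagger(vocab_size=5000, num_tags=43)  # 43 SwDA tags
    fake_conversation = torch.randint(1, 5000, (12, 20))     # 12 utterances, 20 tokens each
    print(model(fake_conversation).shape)                    # torch.Size([12, 43])
```

Treating the conversation as a sequence of utterance vectors is what lets the tagger use surrounding context when labeling each utterance, which is the core of the sequence-labeling framing described above.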

| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Dialogue Act Classification | ICSI Meeting Recorder Dialog Act (MRDA) corpus | Bi-RNN + Self-Attention + Context | Accuracy | 91.1 | #6 |
| Dialogue Act Classification | Switchboard corpus | Bi-RNN + Self-Attention + Context | Accuracy | 82.9 | #3 |
