Aspect Level Sentiment Classification with Deep Memory Network

EMNLP 2016  ·  Duyu Tang, Bing Qin, Ting Liu

We introduce a deep memory network for aspect level sentiment classification. Unlike feature-based SVM and sequential neural models such as LSTM, this approach explicitly captures the importance of each context word when inferring the sentiment polarity of an aspect. The importance degrees and the text representation are calculated with multiple computational layers, each of which is a neural attention model over an external memory. Experiments on laptop and restaurant datasets demonstrate that our approach performs comparably to a state-of-the-art feature-based SVM system, and substantially better than LSTM and attention-based LSTM architectures. On both datasets we show that multiple computational layers improve performance. Moreover, our approach is fast: the deep memory network with 9 layers is 15 times faster than LSTM with a CPU implementation.
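The sketch below illustrates the multi-hop attention idea described in the abstract: context word embeddings form an external memory, the aspect representation serves as the initial query, and each computational layer attends over the memory and updates the query. It is a minimal PyTorch illustration, not the authors' implementation; class and parameter names (MemNetSketch, num_hops, hop_linear) are assumptions, and the paper's location attention is omitted.

```python
# Hedged sketch of a multi-hop deep memory network for aspect-level
# sentiment classification. Hyperparameters and names are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemNetSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, num_hops=3, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Content attention: scores each memory slot against the current query.
        self.attn = nn.Linear(2 * embed_dim, 1)
        # Linear transform of the query between hops (shared across hops here).
        self.hop_linear = nn.Linear(embed_dim, embed_dim)
        self.classifier = nn.Linear(embed_dim, num_classes)
        self.num_hops = num_hops

    def forward(self, context_ids, aspect_ids):
        # External memory: embeddings of the context words, shape (batch, seq, dim).
        memory = self.embed(context_ids)
        # Initial query: average of the aspect word embeddings, shape (batch, dim).
        query = self.embed(aspect_ids).mean(dim=1)
        for _ in range(self.num_hops):
            # Score each memory slot by concatenating it with the current query.
            q_expand = query.unsqueeze(1).expand_as(memory)
            scores = self.attn(torch.cat([memory, q_expand], dim=-1)).squeeze(-1)
            alpha = F.softmax(scores, dim=-1)                        # (batch, seq)
            attended = torch.bmm(alpha.unsqueeze(1), memory).squeeze(1)
            # Next-hop query: attended summary plus a transformed copy of the old query.
            query = attended + self.hop_linear(query)
        return self.classifier(query)

# Toy usage: batch of 2 sentences, 6 context tokens, 2 aspect tokens, 3 polarities.
model = MemNetSketch(vocab_size=100)
ctx = torch.randint(1, 100, (2, 6))
asp = torch.randint(1, 100, (2, 2))
logits = model(ctx, asp)  # shape: (2, 3)
```

Because each hop is just an attention read plus a linear update, stacking more layers adds little computation, which is consistent with the reported speed advantage over LSTM.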


Datasets

SemEval-2014 Task 4
Task: Aspect-Based Sentiment Analysis (ABSA)
Dataset: SemEval-2014 Task 4
Model: MemNet

Metric                            Value   Global Rank
Restaurant (Acc)                  80.95   # 31
Laptop (Acc)                      72.21   # 33
Mean Acc (Restaurant + Laptop)    76.58   # 31
