Variational Semi-supervised Aspect-term Sentiment Analysis via Transformer

CoNLL 2019 · Xingyi Cheng, Weidi Xu, Taifeng Wang, Wei Chu

Aspect-term sentiment analysis (ATSA) is a longstanding challenge in natural language understanding. It requires fine-grained semantic reasoning about a target entity that appears in the text. Because manual annotation of aspects is laborious and time-consuming, the amount of labeled data available for supervised learning is limited. This paper proposes a semi-supervised method for ATSA that uses a Transformer-based Variational Autoencoder (VAET) to model the latent distribution via variational inference. By disentangling the latent representation into an aspect-specific sentiment factor and a lexical-context factor, our method induces sentiment predictions for the unlabeled data, which in turn benefit the ATSA classifier. The method is classifier-agnostic: the classifier is an independent module, so various advanced supervised models can be plugged in. Experiments on SemEval 2014 Task 4 show that the method is effective with four classical classifiers, outperforms two general semi-supervised methods, and achieves state-of-the-art performance.
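
The abstract describes the architecture only at a high level. Below is a minimal, hedged PyTorch sketch of the kind of semi-supervised VAE it outlines: a categorical aspect-sentiment latent y disentangled from a Gaussian lexical-context latent z, with the classifier q(y|x) kept as an independent, swappable module. Every class, method, and hyperparameter name here (TransformerSentimentVAE, n_sentiments, z_dim, the bag-of-words decoder) is an assumption for illustration only, not the paper's actual VAET implementation; it requires a reasonably recent PyTorch for the batch_first option of TransformerEncoderLayer.

```python
# Illustrative sketch only: names and the simplified objective are assumptions,
# not the paper's released VAET code.
import torch
import torch.nn as nn

class TransformerSentimentVAE(nn.Module):
    """Encode a sentence into two latent factors: a categorical aspect-sentiment
    variable y and a continuous (Gaussian) lexical-context variable z."""

    def __init__(self, vocab_size, d_model=128, n_sentiments=3, z_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # q(y|x): an independent, swappable module (the "classifier-agnostic" part).
        self.classifier = nn.Linear(d_model, n_sentiments)
        # q(z|x, y): inference network for the lexical-context latent.
        self.to_mu = nn.Linear(d_model + n_sentiments, z_dim)
        self.to_logvar = nn.Linear(d_model + n_sentiments, z_dim)
        # p(x|y, z): a simple bag-of-words decoder over the vocabulary.
        self.decoder = nn.Linear(z_dim + n_sentiments, vocab_size)

    def encode(self, tokens):
        h = self.encoder(self.embed(tokens))   # (B, T, d_model)
        return h.mean(dim=1)                   # mean-pooled sentence representation

    def elbo(self, tokens, y_onehot):
        """Per-example ELBO for a sentence with a given (or enumerated) label."""
        h = self.encode(tokens)
        hy = torch.cat([h, y_onehot], dim=-1)
        mu, logvar = self.to_mu(hy), self.to_logvar(hy)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()      # reparameterization
        logits = self.decoder(torch.cat([z, y_onehot], dim=-1))
        log_px = torch.log_softmax(logits, dim=-1)                # (B, vocab)
        recon = log_px.gather(1, tokens).sum(dim=-1)              # sum of token log-probs
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=-1)
        return recon - kl                                         # shape (B,)

    def unlabeled_objective(self, tokens):
        """For unlabeled sentences, marginalize y under q(y|x) and add its entropy,
        so the classifier also receives a training signal from unlabeled data."""
        q_y = torch.softmax(self.classifier(self.encode(tokens)), dim=-1)  # (B, K)
        total = torch.zeros(tokens.size(0))
        for k in range(q_y.size(-1)):
            y = torch.zeros_like(q_y)
            y[:, k] = 1.0
            total = total + q_y[:, k] * self.elbo(tokens, y)
        entropy = -(q_y * q_y.clamp_min(1e-8).log()).sum(dim=-1)
        return (total + entropy).mean()                           # maximize this
```

In this style of model, labeled sentences would maximize elbo(...) plus a standard cross-entropy term on the classifier, while unlabeled sentences use unlabeled_objective(...), which marginalizes the sentiment label under q(y|x); this is the mechanism by which unlabeled text contributes a training signal to the sentiment classifier.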

Datasets

SemEval-2014 Task 4

Task: Aspect-Based Sentiment Analysis (ABSA)
Dataset: SemEval-2014 Task 4
Model: BILSTM-ATT-G (TSSVAE)

Metric                           Value   Global Rank
Restaurant (Acc)                 81.10   #29
Laptop (Acc)                     75.34   #23
Mean Acc (Restaurant + Laptop)   78.22   #24

Methods

Transformer · Variational Autoencoder (VAE)