DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis
This paper focuses on learning domain-oriented language models driven by end tasks, aiming to combine the strengths of general-purpose language models (such as ELMo and BERT) with domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and corpora from relevant domains, which helps in learning domain language models under low-resource conditions. Experiments on an assortment of aspect-based sentiment analysis tasks demonstrate promising results.
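The core idea in the abstract, continuing BERT pretraining on a target-domain corpus together with text drawn from related domains, can be illustrated as a joint masked-LM plus domain-classification objective. The sketch below is an illustrative assumption built on Hugging Face `transformers`; the class name `DomBertSketch`, the equal loss weighting, and the domain-label setup are hypothetical and do not reproduce the authors' published training procedure.

```python
# Hedged sketch: joint masked-LM + domain-classification pretraining,
# loosely in the spirit of DomBERT's "in-domain + relevant domains"
# setup. All hyperparameters and the loss weighting are assumptions.
import torch
from torch import nn
from transformers import BertModel


class DomBertSketch(nn.Module):
    def __init__(self, num_domains: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # MLM head: predict masked tokens from the final hidden states.
        self.mlm_head = nn.Linear(hidden, self.bert.config.vocab_size)
        # Domain head: predict each sentence's source domain from the
        # pooled [CLS] vector, so the encoder is pushed to represent
        # which domains are related to one another.
        self.domain_head = nn.Linear(hidden, num_domains)

    def forward(self, input_ids, attention_mask, mlm_labels, domain_labels):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        mlm_logits = self.mlm_head(out.last_hidden_state)
        dom_logits = self.domain_head(out.pooler_output)
        # -100 marks unmasked positions, the standard MLM convention.
        mlm_loss_fn = nn.CrossEntropyLoss(ignore_index=-100)
        loss_mlm = mlm_loss_fn(
            mlm_logits.view(-1, mlm_logits.size(-1)), mlm_labels.view(-1)
        )
        loss_dom = nn.CrossEntropyLoss()(dom_logits, domain_labels)
        # Equal weighting of the two losses is an assumption.
        return loss_mlm + loss_dom
```

In this framing, each training batch would mix in-domain sentences with sentences sampled from other domains, with `domain_labels` identifying each sentence's source; how the relevant domains are discovered and sampled is precisely the part the paper's method addresses, and is not specified here.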
Findings of EMNLP 2020
Methods
Adam • Attention Dropout • BERT • BiLSTM • Dense Connections • Dropout • ELMo • GELU • Layer Normalization • Linear Layer • Linear Warmup With Linear Decay • LSTM • Multi-Head Attention • Residual Connection • Scaled Dot-Product Attention • Sigmoid Activation • Softmax • Tanh Activation • Weight Decay • WordPiece