NCUEE at MEDIQA 2019: Medical Text Inference Using Ensemble BERT-BiLSTM-Attention Model
This study describes the model design of the NCUEE system for the MEDIQA challenge at the ACL-BioNLP 2019 workshop. We use BERT (Bidirectional Encoder Representations from Transformers) as the word embedding method and integrate a BiLSTM (Bidirectional Long Short-Term Memory) network with an attention mechanism for medical text inference. A total of 42 teams participated in the natural language inference task at MEDIQA 2019. Our best accuracy score of 0.84 ranked in the top third of all submissions on the leaderboard.
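The abstract describes a pipeline of BERT contextual embeddings feeding a BiLSTM with an attention layer for three-way inference. Below is a minimal PyTorch sketch of such an architecture, assuming a Hugging Face `transformers` BERT backbone; the layer sizes, the 3-way label set (entailment/neutral/contradiction), and the masking details are illustrative assumptions, not values reported by the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel  # assumed backbone: Hugging Face transformers


class BertBiLSTMAttention(nn.Module):
    """Sketch of a BERT -> BiLSTM -> attention -> classifier pipeline.

    Hidden sizes and the number of labels are assumptions for illustration.
    """

    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=256, num_labels=3):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)       # contextual word embeddings
        self.bilstm = nn.LSTM(self.bert.config.hidden_size,    # 768 for bert-base
                              lstm_hidden,
                              batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * lstm_hidden, 1)              # additive attention scorer
        self.classifier = nn.Linear(2 * lstm_hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # Token-level BERT embeddings for the premise-hypothesis pair
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        out, _ = self.bilstm(hidden)                            # (batch, seq, 2*hidden)
        # Attention weights over tokens; padding positions are masked out
        scores = self.attn(torch.tanh(out)).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        context = (weights * out).sum(dim=1)                    # attention-weighted sentence vector
        return self.classifier(context)                         # per-class logits
```

The ensemble in the system name would presumably combine the predictions of several independently trained copies of such a model (e.g., by averaging logits); that step is not shown here.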
Methods
Adam
Attention Dropout
BERT
BiLSTM
Dense Connections
Dropout
GELU
Layer Normalization
Linear Layer
Linear Warmup With Linear Decay
LSTM
Multi-Head Attention
Residual Connection
Scaled Dot-Product Attention
Sigmoid Activation
Softmax
Tanh Activation
Weight Decay
WordPiece
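Several of the entries above (Adam, Weight Decay, Linear Warmup With Linear Decay) describe the standard BERT fine-tuning optimizer setup. The following is a hedged sketch of that configuration using the common Hugging Face scheduler helper; the learning rate, weight decay value, step counts, and warmup fraction are all illustrative assumptions, not hyperparameters reported by the paper. `AdamW` (Adam with decoupled weight decay) stands in for the listed Adam-plus-weight-decay combination.

```python
import torch
from transformers import get_linear_schedule_with_warmup  # linear warmup, then linear decay

# Hypothetical hyperparameters for illustration only.
model = BertBiLSTMAttention()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5, weight_decay=0.01)

num_training_steps = 1000
scheduler = get_linear_schedule_with_warmup(
    optimizer,
    num_warmup_steps=int(0.1 * num_training_steps),  # 10% warmup: an assumption
    num_training_steps=num_training_steps,
)

# Inside the training loop, step the optimizer first, then the scheduler:
#   optimizer.step(); scheduler.step(); optimizer.zero_grad()
```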