Search Results for author: Liwei Peng

Found 1 paper, 0 papers with code

StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding

No code implementations. ICLR 2020. Wei Wang, Bin Bi, Ming Yan, Chen Wu, Zuyi Bao, Jiangnan Xia, Liwei Peng, Luo Si

Recently, the pre-trained language model, BERT (and its robustly optimized version RoBERTa), has attracted a lot of attention in natural language understanding (NLU), and achieved state-of-the-art accuracy in various NLU tasks, such as sentiment classification, natural language inference, semantic textual similarity and question answering.

Tasks: Language Modelling, Linguistic Acceptability, +7
