no code implementations • SEMEVAL 2020 • Wah Meng Lim, Harish Tayyar Madabushi
Pre-trained language model word representations, such as BERT, have been extremely successful in several Natural Language Processing tasks, significantly improving on the state of the art.