Search Results for author: Wah Meng Lim

Found 1 paper, 0 papers with code

UoB at SemEval-2020 Task 12: Boosting BERT with Corpus Level Information

no code implementations · SemEval 2020 · Wah Meng Lim, Harish Tayyar Madabushi

Pre-trained language model word representations, such as BERT, have been extremely successful in several Natural Language Processing tasks, significantly improving on the state of the art.

Tasks: Abuse Detection, Language Modelling, +1
