Search Results for author: Brian MacNamee

Found 1 paper, 0 papers with code

Investigating the Effectiveness of Representations Based on Pretrained Transformer-based Language Models in Active Learning for Labelling Text Datasets

no code implementations · 21 Apr 2020 · Jinghui Lu, Brian MacNamee

While simple vector representations such as bag-of-words, and embedding-based representations built with techniques such as word2vec, have been shown to be effective ways to represent documents during active learning, representation mechanisms based on the pre-trained transformer-based neural network models popular in natural language processing research (e.g. BERT) offer a promising, and as yet not fully explored, alternative.

Active Learning · Word Embeddings
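The abstract contrasts document representations (bag-of-words vs. transformer-based embeddings) as inputs to active learning. As a minimal sketch of the general setup, not the paper's actual experiments, the following shows pool-based active learning with uncertainty sampling over a bag-of-words representation; the toy corpus and seed choices are illustrative assumptions.

```python
# Minimal sketch of pool-based active learning with uncertainty sampling,
# using a bag-of-words representation (one of the baselines discussed in
# the paper). The toy corpus below is illustrative, not from the paper.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy corpus standing in for a pool of documents awaiting labels.
docs = [
    "great movie loved it", "fantastic film wonderful acting",
    "terrible movie hated it", "awful film boring plot",
    "enjoyable and fun to watch", "dull and painful to watch",
]
labels = np.array([1, 1, 0, 0, 1, 0])

# Represent documents as bag-of-words count vectors.
X = CountVectorizer().fit_transform(docs).toarray()

# Seed the labelled set with one example per class; the rest form the pool.
labelled = [0, 2]
pool = [i for i in range(len(docs)) if i not in labelled]

while pool:
    model = LogisticRegression().fit(X[labelled], labels[labelled])
    # Uncertainty sampling: query the pool document whose predicted
    # positive-class probability is closest to 0.5.
    probs = model.predict_proba(X[pool])[:, 1]
    query = pool[int(np.argmin(np.abs(probs - 0.5)))]
    labelled.append(query)  # simulate the oracle supplying the label
    pool.remove(query)

print(len(labelled))  # every document has now been queried and labelled
```

Swapping in a transformer-based representation, as the paper investigates, only changes how `X` is computed (e.g. sentence embeddings from a pre-trained BERT model); the query loop stays the same.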
