19 Jul 2020 • Diego de Vargas Feijo, Viviane Pereira Moreira
BERT (Bidirectional Encoder Representations from Transformers) and ALBERT (A Lite BERT) are methods for pre-training language models that can later be fine-tuned for a variety of Natural Language Understanding tasks.
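As an illustration of this pre-train/fine-tune workflow, the following minimal sketch loads a pre-trained BERT checkpoint and attaches a classification head using the Hugging Face Transformers library. The checkpoint name ("bert-base-uncased"), the binary task, and the toy batch are illustrative assumptions, not details from the paper.

```python
# A minimal sketch (not the paper's code) of the pre-train/fine-tune paradigm:
# load a pre-trained BERT encoder and wrap it with a task-specific head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# "bert-base-uncased" is an illustrative checkpoint, not the paper's model.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g. a binary NLU task
)

# Tokenize a toy batch and run one fine-tuning step.
batch = tokenizer(["a sample sentence"], return_tensors="pt")
labels = torch.tensor([1])
outputs = model(**batch, labels=labels)
outputs.loss.backward()  # gradients flow through the pre-trained encoder
```

By default, fine-tuning of this kind updates both the newly added classification head and the pre-trained encoder weights end-to-end.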