no code implementations • 15 Sep 2021 • Zimin Wan, Chenchen Xu, Hanna Suominen
The Bidirectional Encoder Representations from Transformers (BERT) model has achieved state-of-the-art performance on many natural language processing (NLP) tasks.
Transfer Learning