MC-BERT: Efficient Language Pre-Training via a Meta Controller

10 Jun 2020 · Zhenhui Xu, Linyuan Gong, Guolin Ke, Di He, Shuxin Zheng, Liwei Wang, Jiang Bian, Tie-Yan Liu

Pre-trained contextual representations (e.g., BERT) have become the foundation to achieve state-of-the-art results on many NLP tasks. However, large-scale pre-training is computationally expensive...
