Knowledge-Aware Language Model Pretraining

29 Jun 2020 · Corby Rosset, Chenyan Xiong, Minh Phan, Xia Song, Paul Bennett, Saurabh Tiwary

How much knowledge do pretrained language models hold? Recent research observed that pretrained transformers are adept at modeling semantics, but it is unclear to what degree they grasp human knowledge, or how to ensure they do so.
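The question the abstract opens with is typically measured with cloze-style knowledge probes: a fact is turned into a fill-in-the-blank sentence and the model's top-ranked completion is checked against the true answer. The sketch below illustrates that evaluation loop. It is a minimal, hypothetical example: the `fake_mlm_scores` table stands in for a real masked language model's token probabilities, and the probe sentences are illustrative, not from the paper.

```python
# Sketch of a cloze-style knowledge probe: turn a fact into a
# fill-in-the-blank sentence and check whether the model's highest-scoring
# candidate matches the true answer. A real probe would score candidates
# with a masked LM; here a tiny hand-made log-probability table stands in
# for the model (an assumption for illustration, not the paper's method).

def fake_mlm_scores(cloze, candidates):
    """Return a stand-in log-probability for each candidate fill."""
    table = {
        "The capital of France is [MASK].": {
            "Paris": -0.1, "London": -3.2, "Rome": -4.0,
        },
        "Insulin is produced in the [MASK].": {
            "pancreas": -0.4, "liver": -2.1, "heart": -5.3,
        },
    }
    return {c: table[cloze].get(c, -10.0) for c in candidates}

def probe_accuracy(facts, candidates):
    """Fraction of cloze facts where the top-ranked candidate is correct."""
    correct = 0
    for cloze, answer in facts:
        scores = fake_mlm_scores(cloze, candidates)
        prediction = max(scores, key=scores.get)  # argmax over candidates
        correct += prediction == answer
    return correct / len(facts)

facts = [
    ("The capital of France is [MASK].", "Paris"),
    ("Insulin is produced in the [MASK].", "pancreas"),
]
candidates = ["Paris", "London", "Rome", "pancreas", "liver", "heart"]
print(probe_accuracy(facts, candidates))  # 1.0 for this toy table
```

With a real model, `fake_mlm_scores` would be replaced by a forward pass over the masked position, and accuracy over a large set of such facts gives one estimate of how much knowledge the pretrained model holds.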


