PROP: Pre-training with Representative Words Prediction for Ad-hoc Retrieval

20 Oct 2020 · Xinyu Ma, Jiafeng Guo, Ruqing Zhang, Yixing Fan, Xiang Ji, Xueqi Cheng

Recently, pre-trained language representation models such as BERT have shown great success when fine-tuned on downstream tasks, including information retrieval (IR). However, pre-training objectives tailored for ad-hoc retrieval have not been well explored...
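To make the title's "representative words prediction" idea concrete, here is a minimal, hedged sketch of one plausible formulation: score sampled word sets under a document language model and train with a pairwise preference loss that prefers the more representative set. The Dirichlet smoothing, function names, and parameters below are illustrative assumptions, not the paper's actual code.

```python
import math
from collections import Counter

def doc_language_model(doc_tokens, corpus_freq, total_corpus_tokens, mu=2000):
    """Dirichlet-smoothed unigram language model for one document
    (a standard IR smoothing choice; mu is the smoothing parameter)."""
    tf = Counter(doc_tokens)
    dl = len(doc_tokens)
    def prob(word):
        p_corpus = corpus_freq.get(word, 0) / total_corpus_tokens
        return (tf.get(word, 0) + mu * p_corpus) / (dl + mu)
    return prob

def set_log_likelihood(word_set, lm):
    """Log-likelihood of a sampled word set under the document LM."""
    return sum(math.log(max(lm(w), 1e-12)) for w in word_set)

def pairwise_hinge(score_pos, score_neg, margin=1.0):
    """Pairwise preference loss: the set judged more representative
    (score_pos) should outscore the other by at least the margin."""
    return max(0.0, margin - (score_pos - score_neg))
```

In a full pre-training setup, the scores would come from a neural encoder rather than the document LM itself; the LM would only supply the preference labels for which word set counts as "more representative".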

