Transformer Based Language Models for Similar Text Retrieval and Ranking

10 May 2020 · Javed Qadrud-Din, Ashraf Bah Rabiou, Ryan Walker, Ravi Soni, Martin Gajek, Gabriel Pack, Akhil Rangaraj

Most approaches to similar text retrieval and ranking with long natural language queries rely, at some level, on queries and responses sharing words in common. Recent applications of transformer-based neural language models to text retrieval and ranking problems have been very promising, but they still involve a two-step process in which result candidates are first obtained through bag-of-words-based approaches and then reranked by a neural transformer.
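The two-step retrieve-then-rerank pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's method: the candidate stage uses simple bag-of-words cosine similarity, and the `scorer` passed to `rerank` is a stand-in for a transformer cross-encoder score, which a real system would compute with a neural model.

```python
from collections import Counter
import math

def bow_vector(text):
    # sparse term-count vector for a text
    return Counter(text.lower().split())

def cosine(a, b):
    # cosine similarity between two sparse count vectors
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_candidates(query, docs, k=3):
    # step 1: bag-of-words candidate retrieval (relies on word overlap)
    qv = bow_vector(query)
    scored = sorted(docs, key=lambda d: cosine(qv, bow_vector(d)), reverse=True)
    return scored[:k]

def rerank(query, candidates, scorer):
    # step 2: rerank candidates with a (here: placeholder) neural scorer
    return sorted(candidates, key=lambda d: scorer(query, d), reverse=True)
```

Note that documents sharing no words with the query score zero in step 1 and can never reach the reranker, which is exactly the limitation the abstract points at.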




