Search Results for author: Yasuto Hoshi

Found 3 papers, 2 papers with code

RaLLe: A Framework for Developing and Evaluating Retrieval-Augmented Large Language Models

1 code implementation • 21 Aug 2023 • Yasuto Hoshi, Daisuke Miyashita, Youyang Ng, Kento Tatsuno, Yasuhiro Morioka, Osamu Torii, Jun Deguchi

Retrieval-augmented large language models (R-LLMs) combine pre-trained large language models (LLMs) with information retrieval systems to improve the accuracy of factual question-answering.

Tasks: Information Retrieval, Question Answering, +1
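The retrieve-then-read pipeline that R-LLMs implement can be sketched minimally as follows. This is an illustrative toy, not the RaLLe API: the word-overlap retriever stands in for a real dense or sparse retriever, the corpus is invented, and a real system would pass the assembled prompt to an LLM.

```python
# Toy retrieve-then-read sketch of a retrieval-augmented LLM pipeline.
# All names (retrieve, build_prompt, CORPUS) are illustrative assumptions.

CORPUS = [
    "Dense retrievers encode queries and passages into a shared vector space.",
    "Retrieval-augmented generation grounds answers in retrieved passages.",
    "Named entities in a question can be the dominant clue for retrieval.",
]

def retrieve(question: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank passages by word overlap with the question -- a stand-in
    for a real retriever such as BM25 or a dense bi-encoder."""
    q_terms = set(question.lower().split())
    ranked = sorted(corpus, key=lambda p: -len(q_terms & set(p.lower().split())))
    return ranked[:k]

def build_prompt(question: str, passages: list[str]) -> str:
    """Assemble the input an LLM would receive: retrieved context + question."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

question = "What does retrieval-augmented generation do?"
prompt = build_prompt(question, retrieve(question, CORPUS))
```

The LLM then answers conditioned on the retrieved context, which is what improves factual accuracy over generation from parametric memory alone.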

Can a Frozen Pretrained Language Model be used for Zero-shot Neural Retrieval on Entity-centric Questions?

no code implementations • 9 Mar 2023 • Yasuto Hoshi, Daisuke Miyashita, Yasuhiro Morioka, Youyang Ng, Osamu Torii, Jun Deguchi

However, existing dense retrievers have been shown to generalize poorly, not only out of domain but even in domain (e.g., on Wikipedia), especially when a named entity in a question is the dominant clue for retrieval.

Tasks: Domain Generalization, Language Modelling, +3
