Search Results for author: Xiangyu Hong

Found 1 paper, 0 papers with code

On Large Language Models' Hallucination with Regard to Known Facts

no code implementations • 29 Mar 2024 • Che Jiang, Biqing Qi, Xiangyu Hong, Dayuan Fu, Yang Cheng, Fandong Meng, Mo Yu, Bowen Zhou, Jie Zhou

In hallucinated cases, the output token's information rarely shows an abrupt increase or consistent dominance in the later stages of the model.
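The finding above can be illustrated with a toy, logit-lens-style sketch: project each layer's hidden state to vocabulary logits and track the probability assigned to the eventual output token across layers. This is purely illustrative and not the paper's actual method; the synthetic logits, the helper `layerwise_token_prob`, and the "faithful" vs. "hallucinated" profiles are all assumptions made up for the example.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a logit vector
    e = np.exp(x - x.max())
    return e / e.sum()

def layerwise_token_prob(layer_logits, token_id):
    # probability each layer's (projected) logits assign to token_id
    return np.array([softmax(l)[token_id] for l in layer_logits])

rng = np.random.default_rng(0)
n_layers, vocab = 12, 50
token_id = 7  # the eventual output token (hypothetical)

# "faithful" case: the output token's logit rises sharply in later layers
faithful = [rng.normal(0, 1, vocab) for _ in range(n_layers)]
for i, logits in enumerate(faithful):
    if i >= 8:
        logits[token_id] += 4.0 * (i - 7)

# "hallucinated" case: no abrupt late-layer increase, just noise
hallucinated = [rng.normal(0, 1, vocab) for _ in range(n_layers)]

p_faithful = layerwise_token_prob(faithful, token_id)
p_halluc = layerwise_token_prob(hallucinated, token_id)

# In the faithful profile the token's probability dominates late;
# in the hallucinated profile it never takes off.
print(p_faithful[-1] > 0.9, p_faithful[-1] > p_halluc[-1])
```

With real models one would obtain `layer_logits` by applying the unembedding matrix to each layer's hidden state at the final position; here the logits are simulated so the script runs without a model download.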

Hallucination
