Search Results for author: Run-Ze Fan

Found 5 papers, 4 papers with code

Reformatted Alignment

1 code implementation · 19 Feb 2024 · Run-Ze Fan, Xuefeng Li, Haoyang Zou, Junlong Li, Shwai He, Ethan Chern, Jiewen Hu, PengFei Liu

This paper explores elevating the quality of existing instruction data to better align it with human values. It introduces ReAlign, a simple and effective approach that reformats the responses in instruction data to better match pre-established criteria and the collated evidence.

Tasks: GSM8K, Hallucination, +2 more
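
As a rough illustration of the ReAlign idea, the sketch below rewrites an existing response with an LLM so that it follows a given format criterion and stays consistent with collated evidence. The function name, prompt wording, and `llm` callable are illustrative assumptions, not the paper's actual implementation.

```python
from typing import Callable

def realign_response(query: str, response: str, criteria: str,
                     evidence: str, llm: Callable[[str], str]) -> str:
    # Hypothetical helper: ask an LLM to reformat an existing response so it
    # matches pre-established criteria and the collated evidence (the ReAlign
    # idea in spirit; prompt and signature are assumptions for illustration).
    prompt = (
        "Rewrite the response so that it follows the format criteria and is "
        "consistent with the evidence, without changing its factual content.\n"
        f"Format criteria: {criteria}\n"
        f"Evidence: {evidence}\n"
        f"Question: {query}\n"
        f"Original response: {response}\n"
        "Rewritten response:"
    )
    return llm(prompt)
```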

RIGHT: Retrieval-augmented Generation for Mainstream Hashtag Recommendation

1 code implementation · 16 Dec 2023 · Run-Ze Fan, Yixing Fan, Jiangui Chen, Jiafeng Guo, Ruqing Zhang, Xueqi Cheng

Automatic mainstream hashtag recommendation aims to accurately provide users with concise and popular topical hashtags before publication.

Tasks: Retrieval
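
The title points to a retrieve-then-generate pipeline: fetch similar historical posts and use their hashtags to inform the recommendation. The toy sketch below only illustrates that flow with word-overlap retrieval and frequency ranking; RIGHT itself pairs a retriever with a generative model, and every name here is an assumption.

```python
from collections import Counter

def recommend_hashtags(post: str, corpus: list, top_n: int = 3) -> list:
    # Toy retrieval: score historical posts by word overlap with the new post.
    words = set(post.lower().split())
    ranked = sorted(corpus,
                    key=lambda item: -len(words & set(item["text"].lower().split())))
    # Aggregate hashtags from the top retrieved posts and rank by frequency.
    tags = Counter(tag for item in ranked[:5] for tag in item["hashtags"])
    return [tag for tag, _ in tags.most_common(top_n)]

corpus = [
    {"text": "new LLM alignment paper released", "hashtags": ["#NLP", "#LLM"]},
    {"text": "sparse mixture of experts scaling", "hashtags": ["#MoE", "#ML"]},
]
print(recommend_hashtags("our LLM alignment method", corpus))
# e.g. ['#NLP', '#LLM', '#MoE']
```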

Merging Experts into One: Improving Computational Efficiency of Mixture of Experts

1 code implementation · 15 Oct 2023 · Shwai He, Run-Ze Fan, Liang Ding, Li Shen, Tianyi Zhou, DaCheng Tao

Although a sparse Mixture of Experts (MoE) can reduce the cost by activating a small subset of parameters (e.g., one expert) for each input, its computation escalates significantly as the number of activated experts increases, limiting its practical utility.

Tasks: Computational Efficiency
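
The stated remedy is to merge the activated experts into a single expert before the forward pass, so compute no longer grows with the number of activated experts. A minimal sketch of that idea follows, assuming single-linear-layer experts and per-token merging; the paper's actual MEO mechanism and its efficiency engineering differ.

```python
import torch
import torch.nn as nn

class MergedExpertsLayer(nn.Module):
    # Sketch: mix the top-k experts' parameters with the gate probabilities,
    # then run one forward pass instead of k separate expert calls.
    def __init__(self, dim: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)
        self.expert_w = nn.Parameter(torch.randn(num_experts, dim, dim) * 0.02)
        self.expert_b = nn.Parameter(torch.zeros(num_experts, dim))
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        probs = torch.softmax(self.gate(x), dim=-1)        # (batch, E)
        top_p, top_i = probs.topk(self.top_k, dim=-1)      # (batch, k)
        top_p = top_p / top_p.sum(-1, keepdim=True)        # renormalize gates
        # Merge parameters: W_merged = sum_i p_i * W_i (and likewise for b).
        # Note: gathering weights per token is itself costly; this is shown
        # for clarity only, not as the efficient implementation.
        w = (top_p[..., None, None] * self.expert_w[top_i]).sum(1)  # (batch, d, d)
        b = (top_p[..., None] * self.expert_b[top_i]).sum(1)        # (batch, d)
        # One "expert" call on the merged parameters.
        return torch.bmm(x.unsqueeze(1), w).squeeze(1) + b

layer = MergedExpertsLayer(dim=16, num_experts=4)
out = layer(torch.randn(8, 16))  # (8, 16)
```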

Generative Judge for Evaluating Alignment

1 code implementation · 9 Oct 2023 · Junlong Li, Shichao Sun, Weizhe Yuan, Run-Ze Fan, Hai Zhao, PengFei Liu

The rapid development of Large Language Models (LLMs) has substantially expanded the range of tasks they can address.

MerA: Merging Pretrained Adapters For Few-Shot Learning

no code implementations · 30 Aug 2023 · Shwai He, Run-Ze Fan, Liang Ding, Li Shen, Tianyi Zhou, DaCheng Tao

Adapter tuning, which updates only a few parameters, has become a mainstream method for fine-tuning pretrained language models to downstream tasks.

Tasks: Few-Shot Learning, MRPC
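
Since the snippet only defines adapter tuning, a minimal bottleneck adapter plus a naive parameter-averaging merge is sketched below to make both halves of the title concrete. The averaging step is an assumed baseline for illustration; it is not MerA's actual merging procedure.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    # Standard bottleneck adapter: down-project, nonlinearity, up-project,
    # residual. Only these few parameters are trained; the backbone is frozen.
    def __init__(self, hidden_dim: int = 768, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))

@torch.no_grad()
def average_adapters(a: BottleneckAdapter, b: BottleneckAdapter) -> BottleneckAdapter:
    # Naive merge by parameter averaging (an illustrative assumption only;
    # MerA proposes a more careful merging scheme).
    merged = BottleneckAdapter(a.down.in_features, a.down.out_features)
    for p_m, p_a, p_b in zip(merged.parameters(), a.parameters(), b.parameters()):
        p_m.copy_((p_a + p_b) / 2)
    return merged

# With hidden_dim=768 and bottleneck_dim=64, each adapter adds roughly
# 2 * 768 * 64 ≈ 0.1M parameters per layer, a small fraction of the backbone.
```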
