Search Results for author: Wangshu Zhang

Found 4 papers, 0 papers with code

AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation

no code implementations · 26 Dec 2023 · Junjie Wang, Yicheng Chen, Wangshu Zhang, Sen Hu, Teng Xu, Jing Zheng

In the second stage, we distill the knowledge from the existing teacher adapters into the student adapter to help its inference.

Knowledge Distillation · Retrieval
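A minimal sketch of the second-stage distillation idea described in the snippet above, assuming a PyTorch setup. The toy linear adapters, the averaged-teacher target, and the MSE distillation loss are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: toy linear "adapters" and an MSE distillation loss
# stand in for whatever architecture and objective the paper actually uses.
import torch
import torch.nn as nn
import torch.nn.functional as F

hidden = 64

# Frozen teacher adapters from previously learned tasks (hypothetical stand-ins).
teacher_adapters = [nn.Linear(hidden, hidden) for _ in range(3)]
for t in teacher_adapters:
    t.requires_grad_(False)

# Student adapter being trained for the new task.
student_adapter = nn.Linear(hidden, hidden)
optimizer = torch.optim.Adam(student_adapter.parameters(), lr=1e-3)

x = torch.randn(8, hidden)  # a batch of hidden states from a frozen backbone

# Distillation target: here simply the average of the teacher adapters' outputs.
with torch.no_grad():
    teacher_out = torch.stack([t(x) for t in teacher_adapters]).mean(dim=0)

student_out = student_adapter(x)
distill_loss = F.mse_loss(student_out, teacher_out)

optimizer.zero_grad()
distill_loss.backward()  # in practice this would be combined with the task loss
optimizer.step()
```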

From Beginner to Expert: Modeling Medical Knowledge into General LLMs

no code implementations · 2 Dec 2023 · Qiang Li, Xiaoyan Yang, Haowen Wang, Qin Wang, Lei Liu, Junjie Wang, Yang Zhang, Mingyuan Chu, Sen Hu, Yicheng Chen, Yue Shen, Cong Fan, Wangshu Zhang, Teng Xu, Jinjie Gu, Jing Zheng, Guannan Zhang (Ant Group)

Specifically for multi-choice questions in the medical domain, we propose a novel Verification-of-Choice approach to prompt engineering, which significantly enhances the reasoning ability of LLMs.

Language Modelling · Large Language Model · +3
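The snippet above names the Verification-of-Choice approach but does not spell out its prompt, so the following is only a guess at what a verification-style multiple-choice prompt might look like; the function name build_verification_prompt and the template wording are invented for illustration.

```python
# Hypothetical prompt template: "verify each option, then answer" is an assumed
# reading of Verification-of-Choice, not the paper's published prompt.
def build_verification_prompt(question: str, choices: dict) -> str:
    lines = [f"Question: {question}", "Options:"]
    lines += [f"  {label}. {text}" for label, text in choices.items()]
    lines.append(
        "For each option, state whether it could answer the question and why. "
        "Then give the single best option as a letter."
    )
    return "\n".join(lines)


prompt = build_verification_prompt(
    "Which vitamin deficiency causes scurvy?",
    {"A": "Vitamin A", "B": "Vitamin B12", "C": "Vitamin C", "D": "Vitamin D"},
)
print(prompt)  # the resulting text would then be sent to the LLM being evaluated
```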

Query Distillation: BERT-based Distillation for Ensemble Ranking

no code implementations · COLING 2020 · Wangshu Zhang, Junhong Liu, Zujie Wen, Yafang Wang, Gerard de Melo

We present a novel two-stage distillation method for ranking problems that allows a smaller student model to be trained while benefiting from the better performance of the teacher model, providing better control over inference latency and computational burden.

Knowledge Distillation
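A minimal sketch of distilling a teacher ranker's scores into a smaller student, assuming PyTorch. The tiny feature-based rankers and the listwise KL loss are simplifications for illustration; they do not reproduce the paper's BERT-based models or its full two-stage procedure.

```python
# Illustrative sketch: small linear rankers stand in for the BERT teacher and
# the smaller student; only the score-matching idea is shown.
import torch
import torch.nn as nn
import torch.nn.functional as F

feat_dim, docs_per_query = 32, 10

teacher = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, 1))
student = nn.Linear(feat_dim, 1)  # much smaller ranker
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

# One query's candidate documents, represented as precomputed features.
doc_feats = torch.randn(docs_per_query, feat_dim)

with torch.no_grad():
    teacher_scores = teacher(doc_feats).squeeze(-1)
student_scores = student(doc_feats).squeeze(-1)

# Match the teacher's score distribution over the candidate list (listwise KL).
loss = F.kl_div(
    F.log_softmax(student_scores, dim=0),
    F.softmax(teacher_scores, dim=0),
    reduction="sum",
)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```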
