Search Results for author: Caiyu Wang

Found 1 papers, 0 papers with code

Mixed Distillation Helps Smaller Language Model Better Reasoning

no code implementations • 17 Dec 2023 • Chenglin Li, Qianglong Chen, Liangyue Li, Caiyu Wang, Yicheng Li, Zulong Chen, Yin Zhang

While large language models (LLMs) have demonstrated exceptional performance on recent natural language processing (NLP) tasks, their high computational and memory demands pose substantial challenges for deployment in real-world applications.

Knowledge Distillation • Language Modelling
