Search Results for author: Siyue Wu

Found 3 papers, 3 papers with code

MCC-KD: Multi-CoT Consistent Knowledge Distillation

1 code implementation · 23 Oct 2023 · Hongzhan Chen, Siyue Wu, Xiaojun Quan, Rui Wang, Ming Yan, Ji Zhang

Large language models (LLMs) have showcased remarkable capabilities in complex reasoning through chain of thought (CoT) prompting.

Knowledge Distillation · Mathematical Reasoning
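The excerpt stops before describing the method, but the title suggests distilling multiple teacher-generated CoT rationales per question while keeping the student's answers consistent across them. A minimal sketch of what such an objective could look like, assuming a cross-entropy term on rationale tokens plus a KL-based consistency term across rationales (all function names and the 0.1 weight are illustrative, not taken from the paper):

```python
import torch
import torch.nn.functional as F

def multi_cot_distillation_loss(per_cot_logits, per_cot_labels, per_cot_answer_dists):
    """Hypothetical multi-CoT distillation objective (not the paper's exact loss).

    per_cot_logits:       list of [seq_len, vocab] student logits, one per rationale
    per_cot_labels:       list of [seq_len] teacher-rationale token ids
    per_cot_answer_dists: list of [vocab] student distributions over the final answer
    """
    # Standard distillation: imitate each teacher rationale token-by-token.
    ce = torch.stack([
        F.cross_entropy(logits, labels)
        for logits, labels in zip(per_cot_logits, per_cot_labels)
    ]).mean()

    # Consistency: penalize divergence between the student's answer
    # distributions obtained from different rationales for the same question.
    mean_dist = torch.stack(per_cot_answer_dists).mean(dim=0)
    consistency = torch.stack([
        F.kl_div(d.clamp_min(1e-8).log(), mean_dist, reduction="sum")
        for d in per_cot_answer_dists
    ]).mean()

    return ce + 0.1 * consistency  # illustrative weighting
```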

AD-KD: Attribution-Driven Knowledge Distillation for Language Model Compression

1 code implementation · 17 May 2023 · Siyue Wu, Hongzhan Chen, Xiaojun Quan, Qifan Wang, Rui Wang

To enhance the knowledge transfer of model reasoning and generalization, we further explore multi-view attribution distillation on all potential decisions of the teacher.

Knowledge Distillation · Language Modelling +2
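"Multi-view" here suggests matching teacher and student attributions for every candidate class rather than only the predicted one. A rough sketch of that idea, using plain gradient-times-input saliency as a stand-in attribution method and a HuggingFace-style classifier interface (the paper's actual attribution technique and matching loss may differ; everything named below is an assumption):

```python
import torch
import torch.nn.functional as F

def token_attribution(model, embeds, class_idx):
    """Gradient x input saliency of one class logit w.r.t. input embeddings.

    A stand-in attribution method; the paper may use a different one.
    Assumes a HuggingFace-style model accepting `inputs_embeds`.
    """
    embeds = embeds.detach().requires_grad_(True)
    logits = model(inputs_embeds=embeds).logits           # [batch, num_classes]
    grads = torch.autograd.grad(logits[:, class_idx].sum(), embeds)[0]
    return (grads * embeds).sum(dim=-1)                   # [batch, seq_len]

def multi_view_attribution_loss(teacher, student, t_embeds, s_embeds, num_classes):
    """Match normalized token attributions for every potential decision."""
    loss = 0.0
    for c in range(num_classes):
        t_attr = F.normalize(token_attribution(teacher, t_embeds, c), dim=-1)
        s_attr = F.normalize(token_attribution(student, s_embeds, c), dim=-1)
        loss = loss + F.mse_loss(s_attr, t_attr.detach())
    return loss / num_classes
```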

Directed Acyclic Graph Network for Conversational Emotion Recognition

1 code implementation · ACL 2021 · Weizhou Shen, Siyue Wu, Yunyi Yang, Xiaojun Quan

In this paper, we put forward a novel idea of encoding the utterances with a directed acyclic graph (DAG) to better model the intrinsic structure within a conversation, and design a directed acyclic neural network, namely DAG-ERC, to implement this idea.

Emotion Recognition in Conversation
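As a concrete illustration of the DAG idea, here is a small sketch of how speaker-aware edges over utterances might be built: each utterance receives edges from every utterance between it and its omega-th most recent same-speaker predecessor. This window rule is a plausible reading of DAG-ERC's constraint, not a verbatim reproduction of its code:

```python
def build_dag_edges(speakers, omega=1):
    """Directed edges (j -> i) over utterances indexed 0..n-1.

    For utterance i, find the omega-th most recent previous utterance
    by the same speaker; every utterance from that point up to i-1
    sends an edge to i. If fewer than omega same-speaker predecessors
    exist, all preceding utterances connect to i.
    """
    edges = []
    for i, spk in enumerate(speakers):
        count, start = 0, 0
        for j in range(i - 1, -1, -1):
            if speakers[j] == spk:
                count += 1
                if count == omega:
                    start = j
                    break
        edges.extend((j, i) for j in range(start, i))
    return edges

# Example: two speakers alternating turns.
print(build_dag_edges(["A", "B", "A", "B"]))
# -> [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
```

The resulting edge list is acyclic by construction (edges only point forward in time), which is what lets the model propagate information recurrently along the conversation without forming loops.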
