Search Results for author: Xinge Ma

Found 2 papers, 1 paper with code

Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression

1 code implementation · COLING 2022 · Xinge Ma, Jin Wang, Liang-Chih Yu, Xuejie Zhang

The teacher continuously meta-learns the student's learning objective and adjusts its own parameters to maximize the student's performance throughout the distillation process.

Knowledge Distillation · Language Modelling · +3
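
The abstract excerpt above describes a teacher that is meta-learned with Reptile while distilling into a student. Below is a minimal, hypothetical PyTorch sketch of that general idea (an inner loop that adapts a copy of the teacher on a student-aware objective, followed by a Reptile outer update that moves the teacher toward the adapted copy); the model interfaces, losses, and hyperparameters are illustrative assumptions and not the authors' implementation.

```python
# Hypothetical sketch of a Reptile-style teacher update interleaved with
# standard knowledge distillation. Names and hyperparameters are assumptions.
import copy
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft-target KL loss blended with hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard


def reptile_teacher_step(teacher, student, batches,
                         inner_lr=1e-5, outer_lr=0.1, inner_steps=3):
    """Adapt a copy of the teacher on a student-aware objective, then move the
    teacher's weights a fraction of the way toward the adapted copy (Reptile)."""
    adapted = copy.deepcopy(teacher)
    opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)

    for inputs, labels in batches[:inner_steps]:
        teacher_logits = adapted(inputs)
        with torch.no_grad():
            student_logits = student(inputs)  # student is held fixed here
        # Inner objective (assumed): the same distillation loss, but with
        # gradients flowing into the adapted teacher via its soft targets.
        loss = distillation_loss(student_logits, teacher_logits, labels)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # Reptile outer update: theta <- theta + outer_lr * (theta_adapted - theta)
    with torch.no_grad():
        for p, p_adapted in zip(teacher.parameters(), adapted.parameters()):
            p.add_(outer_lr * (p_adapted - p))
```

In a full training loop this teacher step would alternate with ordinary student updates on the distillation loss, so the teacher's soft targets keep adapting to what the student can currently learn.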
