no code implementations • 12 Mar 2021 • Zhenyan Hou, Wenxuan Fan
Knowledge distillation is the process of transferring knowledge from a large teacher model to a smaller student model, typically by training the student to match the teacher's softened output distribution.
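As a minimal sketch of this idea, the following implements the standard distillation loss of Hinton et al. (2015): the KL divergence between temperature-softened teacher and student distributions, scaled by T². This illustrates the general technique, not the specific method of the paper above; the function names and the temperature value are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits with a temperature before normalizing;
    # higher T yields a flatter distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2
```

In practice this term is combined with the ordinary cross-entropy loss on the true labels, weighted by a mixing coefficient.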