25 May 2023 • Hyeongrok Han, Siwon Kim, Hyun-Soo Choi, Sungroh Yoon
Several recent studies have elucidated why knowledge distillation (KD) improves model performance.
Knowledge Distillation
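For context, the standard KD objective this line of work analyzes trains a student on temperature-softened teacher outputs. Below is a minimal sketch of that loss (following Hinton et al.'s formulation, not this paper's specific method); the function names and the temperature value are illustrative choices, not from the source.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float(T**2 * np.sum(p * (np.log(p) - np.log(q))))
```

When the student's logits match the teacher's, the loss is zero; any mismatch yields a positive penalty, which is the signal the distillation literature studies.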