22 Feb 2023 • Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra
Knowledge distillation is a common technique for improving the performance of a shallow student network by transferring information from a teacher network, which is, in general, comparatively larger and deeper.
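The transfer is commonly implemented by training the student to match the teacher's temperature-softened output distribution. A minimal sketch of that soft-target loss, assuming NumPy and Hinton-style temperature scaling (the function names here are illustrative, not from the paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the teacher's and student's softened outputs,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's soft predictions
    return float(T * T * np.sum(p * np.log(p / q)))

# A student whose logits roughly track the teacher's incurs a small loss.
teacher = [4.0, 1.0, 0.5]
student = [3.5, 1.2, 0.3]
print(distillation_loss(student, teacher, T=2.0))
```

In practice this soft-target term is combined with the ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.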