Search Results for author: Ishan Mishra

Found 1 paper, 0 papers with code

Distilling Calibrated Student from an Uncalibrated Teacher

no code implementations · 22 Feb 2023 · Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra

Knowledge distillation is a common technique for improving the performance of a shallow student network by transferring information from a teacher network, which is, in general, comparatively large and deep.
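As a point of reference, the sketch below shows the standard distillation objective (softened teacher targets blended with hard labels), not the calibration-aware method proposed in this paper; the temperature T and mixing weight alpha are illustrative assumptions.

```python
# Minimal sketch of vanilla knowledge distillation (soft + hard loss),
# not the paper's calibration-aware variant. T and alpha are assumed values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend KL divergence to softened teacher outputs with hard-label cross-entropy."""
    # Softened distributions; the KL term is scaled by T^2 to keep gradient
    # magnitudes comparable to the hard-label term.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage with random logits for a 10-class problem.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```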

Data Augmentation · Knowledge Distillation
