Search Results for author: Aryan Asadian

Found 1 paper, 1 paper with code

Distilling Knowledge via Intermediate Classifiers

2 code implementations • 28 Feb 2021 • Aryan Asadian, Amirali Salehi-Abari

However, when there is a large difference between the model complexities of the teacher and the student (i.e., a capacity gap), knowledge distillation becomes much less effective at transferring knowledge from the teacher to the student, resulting in a weaker student.
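The capacity-gap discussion above refers to the standard logit-matching distillation objective. The sketch below is a minimal PyTorch version of that baseline loss, not the paper's intermediate-classifier method; the temperature T and mixing weight alpha are illustrative assumptions rather than values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Standard knowledge-distillation loss: soft teacher targets + hard labels."""
    # Soft-target term: KL divergence between temperature-scaled distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```

When the teacher is much larger than the student, this single end-to-end objective is where the capacity gap shows up: the student struggles to match the teacher's output distribution directly.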

Knowledge Distillation • Transfer Learning
