Search Results for author: Moon-Hyun Cha

Found 1 paper, 0 papers with code

Learning Student-Friendly Teacher Networks for Knowledge Distillation

no code implementations • NeurIPS 2021 • Dae Young Park, Moon-Hyun Cha, Changwook Jeong, Dae Sin Kim, Bohyung Han

In other words, while the teacher model is being optimized, the proposed algorithm jointly learns the student branches so that the teacher produces student-friendly representations.

Knowledge Distillation · Transfer Learning
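
The abstract snippet above describes the core idea: lightweight student branches are attached to the teacher's intermediate features and trained jointly with the teacher, so the teacher's representations become easy for a later student to mimic. Below is a minimal PyTorch-style sketch of such joint training; the layer sizes, branch placement, temperature, and loss weights are illustrative assumptions, not the paper's exact architecture or objective.

```python
# Illustrative sketch: train a teacher together with an auxiliary student
# branch so the teacher learns student-friendly representations.
# All sizes, the branch placement, and the loss weights are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TeacherWithStudentBranch(nn.Module):
    def __init__(self, in_dim=784, hidden=512, branch_hidden=128, num_classes=10):
        super().__init__()
        # Teacher backbone split into two blocks so a student branch can
        # attach to the intermediate representation.
        self.block1 = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.block2 = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
        self.teacher_head = nn.Linear(hidden, num_classes)
        # Lightweight student branch consuming the intermediate features.
        self.student_branch = nn.Sequential(
            nn.Linear(hidden, branch_hidden), nn.ReLU(),
            nn.Linear(branch_hidden, num_classes),
        )

    def forward(self, x):
        h = self.block1(x)
        teacher_logits = self.teacher_head(self.block2(h))
        student_logits = self.student_branch(h)
        return teacher_logits, student_logits

def joint_loss(teacher_logits, student_logits, labels, T=4.0, alpha=0.5):
    """Teacher CE + branch CE + distillation between branch and teacher."""
    ce_teacher = F.cross_entropy(teacher_logits, labels)
    ce_student = F.cross_entropy(student_logits, labels)
    # The teacher logits are NOT detached here (an illustrative choice):
    # the distillation term also pushes the teacher toward predictions the
    # small branch can match, i.e. "student-friendly" behavior.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return ce_teacher + ce_student + alpha * kd

# Toy usage with random data for one optimization step.
model = TeacherWithStudentBranch()
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
teacher_logits, student_logits = model(x)
opt.zero_grad()
loss = joint_loss(teacher_logits, student_logits, y)
loss.backward()
opt.step()
```

Because the branch shares the teacher's early features and the distillation term is not detached from the teacher logits, gradients from the student branch also shape the teacher, which is the joint, student-friendly training the snippet refers to.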
