no code implementations • 20 Jun 2019 • Wei Hong, Jinke Yu, Fan Zong
Many researchers use knowledge distillation to improve the accuracy of student networks in object detection, transferring knowledge from a deeper, larger teacher network to a small student network.
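The teacher-to-student transfer described above is commonly implemented as a Hinton-style distillation loss: the student is trained to match the teacher's temperature-softened output distribution via KL divergence. This is a minimal NumPy sketch of that generic loss, not the specific detection method of this paper; the temperature value `T=4.0` is an illustrative assumption.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Softmax with temperature T; higher T yields softer probabilities."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's soft predictions
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T
```

When the student's logits match the teacher's exactly, the loss is zero; any mismatch yields a positive penalty, which in practice is combined with the ordinary hard-label loss.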