3 Jul 2020 • Shipeng Fu, Zhen Li, Jun Xu, Ming-Ming Cheng, Zitao Liu, Xiaomin Yang
Knowledge distillation is a standard teacher-student learning framework for training a lightweight student network under the guidance of a well-trained, large teacher network.
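As a minimal sketch of this standard framework (not this paper's specific method), the following shows the classic distillation objective of Hinton et al. (2015) in PyTorch: a weighted sum of the cross-entropy on ground-truth labels and the KL divergence between temperature-softened teacher and student outputs. The function name `distillation_loss` and the defaults `T=4.0`, `alpha=0.5` are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic knowledge-distillation loss (illustrative sketch).

    Combines the hard-label cross-entropy with a soft-label term that
    matches the student's softened predictions to the teacher's.
    """
    # Hard-label loss on the student's raw predictions.
    ce = F.cross_entropy(student_logits, labels)

    # Soft-label loss: KL divergence between temperature-softened
    # student and teacher distributions. kl_div expects log-probabilities
    # as the first argument and probabilities as the second.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T ** 2)  # T**2 keeps soft-loss gradients comparable across temperatures

    return alpha * ce + (1.0 - alpha) * kd

# Toy usage: random logits for a batch of 8 examples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)  # from a frozen, well-trained teacher
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```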