Search Results for author: Baitan Shao

Found 2 papers, 2 papers with code

Decoupled Knowledge with Ensemble Learning for Online Distillation

1 code implementation • 18 Dec 2023 • Baitan Shao, Ying Chen

To obtain early decoupled knowledge, an initialization scheme for the teacher is devised, and a 2D geometry-based analysis experiment under ideal conditions demonstrates the effectiveness of this scheme.

Ensemble Learning · Knowledge Distillation
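The paper's specific teacher-initialization scheme is not reproduced in this snippet, but the online-distillation-with-ensemble setup it builds on can be sketched in a few lines: several peer networks train together, and each peer is distilled toward the ensemble (mean) of all peers' logits, so no pre-trained teacher is required. The function names and temperature value below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q), summed over classes, averaged over the batch.
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))

def online_ensemble_kd_loss(peer_logits, T=3.0):
    """Online distillation sketch: the 'teacher' is the ensemble
    (mean of all peers' logits); each peer gets a KL loss toward it.
    `peer_logits` is a list of (batch, num_classes) arrays."""
    teacher = softmax(np.mean(peer_logits, axis=0), T)  # ensemble teacher
    # T^2 scaling keeps gradient magnitudes comparable across temperatures.
    return [T * T * kl_div(teacher, softmax(z, T)) for z in peer_logits]
```

When all peers agree, the ensemble coincides with each peer and the distillation loss vanishes; disagreement produces a positive loss that pulls each peer toward the ensemble consensus.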

Multi-granularity for knowledge distillation

1 code implementation • 15 Aug 2021 • Baitan Shao, Ying Chen

Because student networks differ in their ability to absorb the knowledge imparted by a teacher, a multi-granularity distillation mechanism is proposed to transfer more comprehensible knowledge to student networks.

Knowledge Distillation · Person Re-Identification
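The paper's exact multi-granularity mechanism is not shown in this snippet; one common way to expose teacher knowledge at several granularities, sketched below as an assumption rather than the authors' method, is to distill at multiple softmax temperatures: a low temperature keeps sharp, fine-grained class decisions, while a high temperature softens the distribution and reveals inter-class similarity structure.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_granularity_kd_loss(teacher_logits, student_logits,
                              temperatures=(1.0, 3.0, 5.0)):
    """Average the teacher->student cross-entropy over several
    temperatures, so the student matches both sharp (fine-grained)
    and soft (coarse-grained) views of the teacher's distribution."""
    losses = []
    for T in temperatures:
        p = softmax(teacher_logits, T)   # teacher target at this granularity
        q = softmax(student_logits, T)   # student prediction
        ce = -np.mean(np.sum(p * np.log(q + 1e-12), axis=-1))
        losses.append(T * T * ce)        # T^2 scaling per standard KD practice
    return float(np.mean(losses))
```

The loss is minimized exactly when the student's logits induce the same distributions as the teacher's at every temperature, i.e. when the student reproduces the teacher's knowledge at all granularities.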
