8 Mar 2024 • Eda Yilmaz, Hacer Yalim Keles
Knowledge Distillation (KD) transfers the discriminative capabilities of a large teacher model to a simpler student model, boosting the student's performance without a significant loss in accuracy.
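As a rough illustration of the idea (not the paper's specific method), the standard KD objective from Hinton et al. (2015) matches the student's temperature-softened output distribution to the teacher's via a KL divergence; a minimal sketch in plain Python, with the function names and temperature value chosen here for illustration:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    # KL(teacher || student) over softened distributions, scaled by T^2
    # so gradients keep a comparable magnitude across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

When the student's logits already match the teacher's, the loss is zero; the more the student's distribution diverges, the larger the penalty, which is what drives the transfer during training.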