Backdoor Defense for Data-Free Distillation with Poisoned Teachers

1 paper with code • 0 benchmarks • 0 datasets

Defend against backdoor attacks from poisoned teachers.

Most implemented papers

Revisiting Data-Free Knowledge Distillation with Poisoned Teachers

illidanlab/abd 4 Jun 2023

Data-free knowledge distillation (KD) transfers knowledge from a pre-trained model (the teacher) to a smaller model (the student) without access to the original data used to train the teacher.
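To make the distillation step concrete, below is a minimal sketch of the standard soft-label KD objective (temperature-scaled softmax plus KL divergence, as in Hinton et al.'s formulation) in pure Python. This illustrates the generic KD loss only, not the ABD defense from the paper; the function names `softmax` and `distillation_loss` are illustrative, not from the cited repository.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL divergence between teacher and student soft labels,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In the data-free setting, the student logits would come from synthetic inputs (e.g. generator-produced samples) rather than the teacher's original training data, which is exactly where a poisoned teacher can transplant its backdoor into the student.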