Search Results for author: Divyang Doshi

Found 1 paper, 1 paper with code

ReffAKD: Resource-efficient Autoencoder-based Knowledge Distillation

1 code implementation • 15 Apr 2024 • Divyang Doshi, Jung-eun Kim

In our work, we propose an efficient method for generating these soft labels, thereby eliminating the need for a large teacher model.
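As a rough illustration of what training on soft labels looks like (a hypothetical sketch, not the paper's ReffAKD implementation), a standard distillation loss compares the student's temperature-scaled softmax against the soft labels, which in conventional distillation come from a large teacher model:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, soft_labels, temperature=4.0):
    """KL divergence between soft labels and the student's tempered output.

    In classic knowledge distillation the soft labels are produced by a
    large teacher; ReffAKD's contribution is generating them more cheaply.
    """
    p = soft_labels
    q = softmax(student_logits, temperature)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))
```

When the student's logits match the logits that produced the soft labels (at the same temperature), the loss is near zero; the temperature and loss form shown here are illustrative defaults.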

Knowledge Distillation
