Search Results for author: Ujan Deb

Found 1 papers, 0 papers with code

AdaKD: Dynamic Knowledge Distillation of ASR models using Adaptive Loss Weighting

no code implementations · 11 May 2024 · Shreyan Ganguly, Roshan Nayak, Rakshith Rao, Ujan Deb, Prathosh AP

Knowledge distillation, a widely used model compression technique, transfers knowledge from a cumbersome teacher model to a lightweight student model.
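The abstract above describes the standard distillation setup: the student is trained against the teacher's softened outputs alongside its task loss. The snippet does not specify AdaKD's adaptive weighting scheme, so the sketch below shows only the classic temperature-scaled distillation loss (Hinton et al.) with a placeholder blending weight `alpha`; the function names and the fixed `alpha` are illustrative assumptions, not the paper's method.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12))))

def distillation_objective(student_logits, teacher_logits, task_loss, alpha=0.5):
    # Blend the hard-label task loss with the distillation loss.
    # AdaKD would adapt this weight dynamically during training;
    # a fixed alpha is used here purely as a sketch.
    return alpha * kd_loss(student_logits, teacher_logits) + (1 - alpha) * task_loss
```

When student and teacher logits agree, the distillation term vanishes, so the objective reduces to the plain task loss.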

Tasks: Knowledge Distillation · Model Compression
