Search Results for author: Manas Mohanty

Found 1 papers, 0 papers with code

What is Lost in Knowledge Distillation?

no code implementations · 7 Nov 2023 · Manas Mohanty, Tanya Roosta, Peyman Passban

Deep neural networks (DNNs) have significantly improved performance on NLP tasks, but training and maintaining such networks can be costly.

Knowledge Distillation · Model Compression
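Since no code implementation is listed for this paper, the sketch below illustrates the standard soft-label knowledge distillation loss (Hinton et al., 2015) that the tags above refer to; it is a generic, pure-Python illustration, not the paper's own method, and all function names are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    # Softened softmax: dividing logits by a temperature > 1 flattens the
    # distribution, exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the teacher's and the student's softened
    # output distributions, scaled by T^2 as in Hinton et al. (2015).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl
```

In practice this term is combined with the usual cross-entropy on hard labels; the loss is zero exactly when the student's logits reproduce the teacher's softened distribution.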
