Search Results for author: Sara S. Baghsorkhi

Found 1 paper, 0 papers with code

DeepThin: A Self-Compressing Library for Deep Neural Networks

No code implementations · 20 Feb 2018 · Matthew Sotoudeh, Sara S. Baghsorkhi

For DeepSpeech, DeepThin-compressed networks achieve better test loss than all other compression methods, reaching a 28% better result than rank factorization, 27% better than pruning, 20% better than hand-tuned same-size networks, and 12% better than HashedNets.
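To make the rank-factorization baseline above concrete, here is a minimal sketch of compressing a single weight matrix via truncated SVD. The matrix shape (256×512) and rank (16) are illustrative choices, not values from the paper, and this is the generic technique, not DeepThin's own scheme.

```python
import numpy as np

def low_rank_compress(W, r):
    """Approximate W (m x n) by two thin factors A (m x r) and B (r x n)
    obtained from a truncated SVD, so W ~= A @ B."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :r] * s[:r]   # m x r, singular values folded into the left factor
    B = Vt[:r, :]          # r x n
    return A, B

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 512))  # illustrative layer weight matrix
A, B = low_rank_compress(W, r=16)

# Storage drops from m*n parameters to r*(m + n).
orig = W.size            # 256 * 512 = 131072
comp = A.size + B.size   # 16 * (256 + 512) = 12288
print(orig, comp, round(orig / comp, 1))  # compression ratio ~10.7x
```

At inference time the factored layer computes `x @ A @ B` instead of `x @ W`, trading approximation error for the smaller parameter count; the abstract's comparison is over test loss at matched sizes.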
