Search Results for author: Ionut-Vlad Modoranu

Found 1 paper, 1 paper with code

Error Feedback Can Accurately Compress Preconditioners

1 code implementation · 9 Jun 2023 · Ionut-Vlad Modoranu, Aleksei Kalinov, Eldar Kurtic, Elias Frantar, Dan Alistarh

Experiments on deep neural networks show that this approach can compress full-matrix preconditioners to up to 99% sparsity without accuracy loss, effectively removing the memory overhead of full-matrix preconditioners such as GGT and M-FAC.

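The abstract describes an error-feedback scheme: the preconditioner is compressed aggressively (here, via sparsification), and whatever is discarded is accumulated and re-injected before the next compression step so the information is not permanently lost. Below is a minimal NumPy sketch of that general mechanism using top-k magnitude sparsification; it is an illustrative assumption, not the paper's actual implementation, and the function names are hypothetical.

```python
import numpy as np

def topk_sparsify(matrix, density):
    """Keep only the largest-magnitude entries; zero out the rest.
    (Hypothetical helper, not from the paper's codebase.)"""
    k = max(1, int(density * matrix.size))
    threshold = np.partition(np.abs(matrix).ravel(), -k)[-k]
    return matrix * (np.abs(matrix) >= threshold)

def compress_with_error_feedback(updates, density=0.01):
    """Sparsify a stream of matrices while carrying the compression
    error forward (generic error-feedback sketch, assumed setup)."""
    error = np.zeros_like(updates[0])
    compressed = []
    for u in updates:
        corrected = u + error          # re-inject previously dropped mass
        sparse = topk_sparsify(corrected, density)
        error = corrected - sparse     # accumulate what was dropped
        compressed.append(sparse)
    return compressed

# Tiny usage example with random "preconditioner-like" matrices.
rng = np.random.default_rng(0)
updates = [rng.standard_normal((64, 64)) for _ in range(5)]
sparse_updates = compress_with_error_feedback(updates, density=0.01)
print(f"fraction of entries kept: {np.mean(sparse_updates[-1] != 0):.3f}")
```

The point of the residual `error` term is that dropped entries are not lost but deferred, which is what makes very high sparsity levels (such as the 99% reported in the abstract) plausible without degrading the optimizer.
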
Tasks: Classification · Second-order methods
