no code implementations • 9 Dec 2022 • Shiyu Liu, Rohan Ghosh, Dylan Tan, Mehul Motani
However, in network pruning, we find that the sparsity introduced by ReLU, which we quantify with a metric we call the dynamic dead neuron rate (DNR), is not beneficial for the pruned network.
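To make the idea concrete, the sketch below estimates a dead-neuron rate for one batch as the fraction of post-ReLU activations that are exactly zero. This is an illustrative assumption about the metric; the paper's exact formulation of DNR (e.g. averaging over batches or tracking it dynamically during training) may differ.

```python
import numpy as np

def dead_neuron_rate(post_relu):
    """Fraction of post-ReLU outputs that are exactly zero for a batch.

    `post_relu` has shape (batch, neurons). Treating the zero fraction as
    the dead-neuron rate is an assumption for illustration only.
    """
    post_relu = np.asarray(post_relu)
    return float(np.mean(post_relu == 0))

# Example: ReLU applied to a small pre-activation batch
pre = np.array([[-1.0, 0.5],
                [2.0, -3.0]])
post = np.maximum(pre, 0)        # ReLU zeroes out negative entries
print(dead_neuron_rate(post))    # 0.5 (2 of the 4 outputs are zero)
```

A higher rate means more activations are zeroed out, i.e. the ReLU layer is sparser for that batch.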