Search Results for author: Dylan Tan

Found 1 paper, 0 papers with code

AP: Selective Activation for De-sparsifying Pruned Neural Networks

no code implementations • 9 Dec 2022 • Shiyu Liu, Rohan Ghosh, Dylan Tan, Mehul Motani

However, in network pruning, we find that the sparsity introduced by ReLU, which we quantify by a term called dynamic dead neuron rate (DNR), is not beneficial for the pruned network.

Network Pruning
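
Since no implementation is listed, here is a minimal sketch of how a dead-neuron-rate measurement could look, assuming the paper's dynamic dead neuron rate (DNR) corresponds to the fraction of post-ReLU activations that are exactly zero on a given batch; the model, function name, and setup below are hypothetical and may differ from the authors' definition.

```python
import torch
import torch.nn as nn

def dynamic_dead_neuron_rate(model: nn.Module, inputs: torch.Tensor) -> float:
    """Fraction of post-ReLU activations that are exactly zero for one batch.

    Illustrative proxy for the paper's DNR; the exact definition in the
    paper may differ (hypothetical sketch).
    """
    zero_count, total_count = 0, 0

    def hook(_module, _inp, out):
        nonlocal zero_count, total_count
        zero_count += (out == 0).sum().item()
        total_count += out.numel()

    # Attach hooks to every ReLU so we can inspect its output activations.
    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        model(inputs)
    for h in handles:
        h.remove()
    return zero_count / max(total_count, 1)

# Example usage on a small, hypothetical MLP with random data.
if __name__ == "__main__":
    mlp = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                        nn.Linear(256, 10))
    x = torch.randn(32, 784)
    print(f"dynamic dead neuron rate: {dynamic_dead_neuron_rate(mlp, x):.3f}")
```

A higher value would indicate more ReLU-induced sparsity, which the abstract argues is not beneficial for a pruned network.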
