Sparse Learning
44 papers with code • 3 benchmarks • 3 datasets
Libraries
Use these libraries to find Sparse Learning models and implementations.
Latest papers
L0Learn: A Scalable Package for Sparse Learning using L0 Regularization
We present L0Learn: an open-source package for sparse linear regression and classification using $\ell_0$ regularization.
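L0Learn's own algorithms are beyond a short snippet, and the code below is not the package's API. As a hedged illustration of what an $\ell_0$-constrained (best-subset-style) fit does, here is a minimal iterative hard thresholding (IHT) sketch in plain NumPy; all names and data are illustrative:

```python
# Minimal sketch of l0-constrained regression via iterative hard
# thresholding (IHT). Illustrates the effect of an l0 penalty only;
# this is NOT L0Learn's algorithm or API.
import numpy as np

def iht(X, y, k, n_iter=200):
    """Gradient steps, keeping only the k largest-magnitude coefficients."""
    n, p = X.shape
    beta = np.zeros(p)
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L for the least-squares gradient
    for _ in range(n_iter):
        beta = beta + step * X.T @ (y - X @ beta)   # gradient step
        small = np.argsort(np.abs(beta))[:-k]       # all but the k largest
        beta[small] = 0.0                           # hard threshold
    return beta

# Synthetic 3-sparse problem (illustrative data).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
true_beta = np.zeros(20)
true_beta[[2, 7, 11]] = [3.0, -2.0, 1.5]
y = X @ true_beta + 0.01 * rng.standard_normal(100)

beta = iht(X, y, k=3)
print(np.nonzero(beta)[0])   # indices of the recovered support
```

With a well-conditioned design and low noise, the hard-thresholded iterates settle on the true support; the $\ell_0$ "budget" `k` directly controls how many coefficients survive.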
SL-CycleGAN: Blind Motion Deblurring in Cycles using Sparse Learning
In this paper, we introduce SL-CycleGAN, an end-to-end generative adversarial network (GAN) based on sparse learning for single-image blind motion deblurring.
Learning where to learn: Gradient sparsity in meta and continual learning
We find that patterned sparsity emerges from this process, with the pattern of sparsity varying on a problem-by-problem basis.
abess: A Fast Best Subset Selection Library in Python and R
In addition, a user-friendly R library is available at the Comprehensive R Archive Network.
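abess uses a fast splicing algorithm; as a point of contrast for what "best subset selection" means, a brute-force sketch for tiny feature counts is shown below (illustrative only — not abess's method, which scales far beyond exhaustive search):

```python
# Exhaustive best-subset selection: fit every size-k support and keep
# the one with the lowest residual sum of squares. Only feasible for
# tiny p; shown purely to define the problem abess solves quickly.
import numpy as np
from itertools import combinations

def best_subset(X, y, k):
    """Return the size-k support with minimal least-squares RSS."""
    n, p = X.shape
    best, best_rss = None, np.inf
    for S in combinations(range(p), k):
        Xs = X[:, S]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        r = y - Xs @ beta
        rss = r @ r
        if rss < best_rss:
            best, best_rss = S, rss
    return best

# Illustrative data with true support {1, 5}.
rng = np.random.default_rng(1)
X = rng.standard_normal((60, 8))
y = 2.0 * X[:, 1] - 1.5 * X[:, 5] + 0.05 * rng.standard_normal(60)
print(best_subset(X, y, k=2))   # the true support
```

Exhaustive search costs $\binom{p}{k}$ least-squares fits, which is why practical libraries rely on smarter search strategies.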
Sparse Training via Boosting Pruning Plasticity with Neuroregeneration
Work on the lottery ticket hypothesis (LTH) and single-shot network pruning (SNIP) has recently drawn considerable attention to post-training pruning (iterative magnitude pruning) and before-training pruning (pruning at initialization).
Grouped Variable Selection with Discrete Optimization: Computational and Statistical Perspectives
Our algorithmic framework consists of approximate and exact algorithms.
Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training
By starting from a random sparse network and continuously exploring sparse connectivities during training, we can perform over-parameterization in the space-time manifold, closing the expressivity gap between sparse and dense training.
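"Continuously exploring sparse connectivities" is typically done with prune-and-regrow steps on a weight mask (SET-style dynamic sparse training). The sketch below is an illustrative single step, not this paper's exact procedure:

```python
# One connectivity-exploration step for dynamic sparse training
# (SET-style, illustrative): drop the smallest-magnitude active
# weights, regrow the same number at random inactive positions,
# so the overall sparsity level stays fixed.
import numpy as np

def prune_and_regrow(W, mask, frac=0.3, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    active = np.flatnonzero(mask.ravel())
    n_swap = int(frac * active.size)
    # Prune: deactivate the n_swap smallest-magnitude active weights.
    mags = np.abs(W.ravel()[active])
    drop = active[np.argsort(mags)[:n_swap]]
    mask.ravel()[drop] = False
    # Regrow: activate n_swap random currently-inactive positions.
    inactive = np.flatnonzero(~mask.ravel())
    grow = rng.choice(inactive, size=n_swap, replace=False)
    mask.ravel()[grow] = True
    W.ravel()[grow] = 0.0          # new connections start at zero
    return W * mask, mask

# Illustrative 8x8 layer at 75% sparsity (16 active weights).
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
mask = np.zeros((8, 8), dtype=bool)
mask.ravel()[rng.choice(64, size=16, replace=False)] = True
W = W * mask

W2, mask2 = prune_and_regrow(W, mask, frac=0.25, rng=rng)
print(mask2.sum())   # active-weight count is preserved
```

Over many such steps the network visits many more connectivity patterns than its instantaneous parameter count suggests, which is the "in-time over-parameterization" intuition.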
Similarity Preserving Unsupervised Feature Selection based on Sparse Learning
Various feature selection methods have recently been proposed across different applications to reduce both the computational burden of machine learning algorithms and the complexity of learned models.
KNN Classification with One-step Computation
In this paper, a one-step computation is proposed to replace the lazy part of KNN classification.
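For context on the "lazy part" this paper replaces: classic KNN defers all distance computation and neighbor search to query time. A minimal sketch of that lazy query step (illustrative, not the paper's one-step method):

```python
# Classic "lazy" KNN classification: nothing is learned up front;
# every prediction computes distances to the whole training set.
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    d = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
    nearest = np.argsort(d)[:k]               # indices of the k nearest
    votes = y_train[nearest]
    return np.bincount(votes).argmax()        # majority vote

# Two tiny clusters (illustrative data).
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([4.8, 5.1])))   # → 1
```

The per-query cost is O(nk) distance and sort work, which is exactly the overhead a one-step (non-lazy) formulation aims to avoid.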
Thunder: a Fast Coordinate Selection Solver for Sparse Learning
$\ell_1$ regularization has been widely employed to pursue model sparsity.
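Thunder's screening-based coordinate selection is beyond a short snippet; as a hedged baseline, here is plain cyclic coordinate descent for the Lasso, showing how the $\ell_1$ soft-threshold update produces exact zeros (illustrative only, not Thunder's solver):

```python
# Cyclic coordinate descent for the Lasso:
#   min_b  1/(2n) * ||y - X b||^2 + alpha * ||b||_1
# Each coordinate update is a closed-form soft-thresholding step,
# which sets many coefficients exactly to zero.
import numpy as np

def lasso_cd(X, y, alpha, n_iter=100):
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]        # partial residual
            rho = X[:, j] @ r / n
            beta[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / col_sq[j]
    return beta

# Illustrative 3-sparse problem with 50 features.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
true_b = np.zeros(50)
true_b[[0, 10, 20]] = [2.0, -3.0, 1.0]
y = X @ true_b + 0.1 * rng.standard_normal(200)

b = lasso_cd(X, y, alpha=0.1)
print(np.count_nonzero(b))   # most of the 50 coefficients are exactly zero
```

Solvers like Thunder accelerate exactly this kind of iteration by screening out coordinates that are guaranteed to stay at zero, so the inner loop touches far fewer features.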