Sparse Learning

44 papers with code • 3 benchmarks • 3 datasets

Sparse learning covers models and training methods that induce or exploit sparsity: sparse regression and classification with $\ell_0$/$\ell_1$ regularization, best-subset and grouped variable selection, sparse neural-network training and pruning, and sparsity-based feature selection.

Libraries

Use these libraries to find Sparse Learning models and implementations

L0Learn: A Scalable Package for Sparse Learning using L0 Regularization

hazimehh/L0Learn 10 Feb 2022

We present L0Learn: an open-source package for sparse linear regression and classification using $\ell_0$ regularization.

93 stars
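
To make the $\ell_0$ objective concrete, here is a minimal sketch (not L0Learn's algorithm, which scales via coordinate descent with local combinatorial search) that solves the same problem, $\min_\beta \|y - X\beta\|_2^2 + \lambda \|\beta\|_0$, by exhaustive search over supports; this is only feasible for tiny $p$:

```python
# Exhaustive best-subset search: feasible only for tiny p, but it is exactly
# the objective min ||y - X b||^2 + lam * ||b||_0 that L0Learn targets at scale.
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 50, 6, 2.0
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)

best_obj, best_beta = np.inf, np.zeros(p)
for k in range(p + 1):
    for S in combinations(range(p), k):
        b = np.zeros(p)
        if S:
            idx = list(S)
            b[idx] = np.linalg.lstsq(X[:, idx], y, rcond=None)[0]
        obj = np.sum((y - X @ b) ** 2) + lam * k   # RSS + l0 penalty
        if obj < best_obj:
            best_obj, best_beta = obj, b
print("selected support:", np.flatnonzero(best_beta))   # expect [0 1]
```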

SL-CycleGAN: Blind Motion Deblurring in Cycles using Sparse Learning

jammer345/SL-CycleGAN-Blind-Motion-Deblurring-in-Cycles-using-Sparse-Learning 7 Nov 2021

In this paper, we introduce an end-to-end generative adversarial network (GAN) based on sparse learning for single image blind motion deblurring, which we call SL-CycleGAN.

9 stars
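
The paper's losses and architecture are its own; as a generic, hypothetical illustration of injecting sparsity into a deblurring generator, one can add an $\ell_1$ penalty on intermediate feature maps alongside the adversarial and cycle terms:

```python
# Hypothetical sketch only: a stand-in encoder/decoder "generator" with an
# L1 activation penalty added to placeholder adversarial/cycle terms.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
decoder = nn.Conv2d(16, 3, 3, padding=1)

blurred = torch.randn(4, 3, 64, 64)      # dummy batch of blurred images
feats = encoder(blurred)                 # intermediate feature maps
restored = decoder(feats)

gan_and_cycle_terms = restored.mean() * 0.0   # placeholder for the real losses
sparsity_penalty = feats.abs().mean()         # L1 penalty on activations
loss = gan_and_cycle_terms + 0.01 * sparsity_penalty
loss.backward()
```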

Learning where to learn: Gradient sparsity in meta and continual learning

johswald/learning_where_to_learn NeurIPS 2021

We find that patterned sparsity emerges from this process, with the pattern of sparsity varying on a problem-by-problem basis.

38 stars · 27 Oct 2021
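
A minimal sketch of the underlying mechanism, sparse (masked) parameter updates; the masks here are fixed at random for illustration, whereas the paper meta-learns which parameters receive gradients:

```python
# Sketch: only a fixed random ~20% of each parameter tensor receives updates.
# In the paper these masks are meta-learned rather than random.
import torch
import torch.nn as nn

model = nn.Linear(10, 2)
masks = {name: (torch.rand_like(p) < 0.2).float()
         for name, p in model.named_parameters()}

x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()

with torch.no_grad():
    for name, p in model.named_parameters():
        p -= 0.1 * masks[name] * p.grad   # sparse SGD step: masked gradients
```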

abess: A Fast Best Subset Selection Library in Python and R

abess-team/abess 19 Oct 2021

In addition, a user-friendly R library is available from the Comprehensive R Archive Network (CRAN).

426 stars
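
A usage sketch for the project's Python package; the `LinearRegression` estimator, its `support_size` argument, and the `coef_` attribute follow my reading of the abess documentation and should be checked against the installed version:

```python
import numpy as np
from abess import LinearRegression   # pip install abess

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
beta = np.zeros(20)
beta[:3] = [2.0, -1.5, 1.0]          # only 3 truly active variables
y = X @ beta + 0.1 * rng.normal(size=100)

model = LinearRegression(support_size=3)   # request the best 3-variable model
model.fit(X, y)
print(np.flatnonzero(model.coef_))         # indices of the selected variables
```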

Sparse Training via Boosting Pruning Plasticity with Neuroregeneration

vita-group/granet NeurIPS 2021

Work on the lottery ticket hypothesis (LTH) and single-shot network pruning (SNIP) has drawn considerable attention to post-training pruning (iterative magnitude pruning) and before-training pruning (pruning at initialization).

28 stars · 19 Jun 2021
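
A schematic prune-and-regenerate step in this spirit (a simplified sketch, not the repository's implementation): drop the smallest surviving weights, then revive the same number of dead connections where the dense gradient is largest:

```python
# Simplified prune-and-regrow step; a real implementation would exclude
# just-pruned weights from regrowth and schedule n_update over training.
import torch

def prune_and_regrow(weight, grad, mask, n_update):
    # prune: zero the n_update smallest-magnitude surviving weights
    w = (weight * mask).abs().masked_fill(mask == 0, float("inf"))
    drop = torch.topk(w.view(-1), n_update, largest=False).indices
    mask.view(-1)[drop] = 0.0
    # regrow ("neuroregeneration"): revive the n_update dead connections
    # whose dense gradients are largest
    g = grad.abs().masked_fill(mask == 1, float("-inf"))
    grow = torch.topk(g.view(-1), n_update).indices
    mask.view(-1)[grow] = 1.0
    return mask

w, g = torch.randn(64, 64), torch.randn(64, 64)
mask = (torch.rand(64, 64) < 0.1).float()   # start 90% sparse
mask = prune_and_regrow(w, g, mask, n_update=32)
print("density:", mask.mean().item())       # unchanged: drop and grow balance
```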

Grouped Variable Selection with Discrete Optimization: Computational and Statistical Perspectives

hazimehh/l0group 14 Apr 2021

Our algorithmic framework consists of approximate and exact algorithms.

9 stars
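
Neither the approximate nor the exact algorithm is reproduced here; the sketch below shows the generic projection at the heart of group-sparse estimation, keeping the $k$ groups with the largest Euclidean norm:

```python
# Group hard-thresholding: a generic building block for group-sparse
# estimators, not the paper's approximate/exact algorithms.
import numpy as np

def project_group_sparse(beta, groups, k):
    """groups: list of index arrays partitioning the coefficients."""
    norms = np.array([np.linalg.norm(beta[g]) for g in groups])
    keep = np.argsort(norms)[-k:]                 # k strongest groups
    out = np.zeros_like(beta)
    for j in keep:
        out[groups[j]] = beta[groups[j]]
    return out

beta = np.array([0.1, 0.2, 3.0, 2.5, 0.0, 0.05])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
print(project_group_sparse(beta, groups, k=1))   # keeps the [2, 3] group
```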

Do We Actually Need Dense Over-Parameterization? In-Time Over-Parameterization in Sparse Training

Shiweiliuiiiiiii/In-Time-Over-Parameterization 4 Feb 2021

By starting from a random sparse network and continuously exploring sparse connectivities during training, we can perform over-parameterization in the space-time manifold, closing the expressibility gap between sparse training and dense training.

46 stars
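
A toy illustration of the idea (not the official code, and with random rather than magnitude/gradient-based drop-and-grow criteria): a sparse mask that is periodically updated touches far more weights over training than it holds at any single step:

```python
# Toy dynamic sparse training loop that only tracks connectivity: 10% of
# weights are active at any time, but drop/grow cycles explore far more
# of the parameter space over 20 steps.
import torch

shape, density, steps = (256, 256), 0.1, 20
n_active = int(density * shape[0] * shape[1])

mask = torch.zeros(shape, dtype=torch.bool)
mask.view(-1)[torch.randperm(mask.numel())[:n_active]] = True
explored = mask.clone()                       # every weight ever activated

for _ in range(steps):
    active = mask.nonzero()
    drop = active[torch.randperm(len(active))[: int(0.3 * len(active))]]
    mask[drop[:, 0], drop[:, 1]] = False      # drop 30% of connections
    dead = (~mask).nonzero()
    grow = dead[torch.randperm(len(dead))[: len(drop)]]
    mask[grow[:, 0], grow[:, 1]] = True       # grow the same number back
    explored |= mask

print(f"instantaneous density {density:.0%}, "
      f"explored {explored.float().mean().item():.0%}")
```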

Similarity Preserving Unsupervised Feature Selection based on Sparse Learning

mohsengh/SLSP 10th International Symposium on Telecommunications (IST) 2020

Various feature selection methods have been recently proposed on different applications to reduce the computational burden of machine learning algorithms as well as the complexity of learned models.

3 stars · 15 Dec 2020
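
A generic, hypothetical sketch of the similarity-preserving idea (not the SLSP algorithm): greedily add the feature whose inclusion makes the reduced pairwise similarity best match the all-features similarity:

```python
# Hypothetical greedy selector: pick features so that inner-product
# similarities computed on the subset track those of the full feature set.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 8))
S_full = X @ X.T                      # target pairwise sample similarity

selected = []
for _ in range(3):                    # select 3 features
    best_j, best_err = None, np.inf
    for j in range(X.shape[1]):
        if j in selected:
            continue
        cols = selected + [j]
        err = np.linalg.norm(S_full - X[:, cols] @ X[:, cols].T)
        if err < best_err:
            best_j, best_err = j, err
    selected.append(best_j)
print("selected features:", selected)
```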

KNN Classification with One-step Computation

lijy207/one-step-knn 9 Dec 2020

In this paper, a one-step computation is proposed to replace the lazy part of KNN classification.

2 stars
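
For contrast with the proposed one-step computation (not shown here), the lazy neighbor search it replaces can at least be fully vectorized so all test points are classified in one batch of matrix operations:

```python
# Standard KNN, vectorized so every test point is classified in one shot;
# shown for contrast, this is not the paper's proposed computation.
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 5))
y_train = rng.integers(0, 3, size=100)        # 3 classes
X_test = rng.normal(size=(10, 5))
k = 5

# all pairwise squared distances at once: shape (10, 100)
d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
nbrs = np.argpartition(d2, k, axis=1)[:, :k]  # k nearest per test point
votes = y_train[nbrs]                         # neighbor labels, shape (10, k)
pred = np.array([np.bincount(v, minlength=3).argmax() for v in votes])
print(pred)
```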

Thunder: a Fast Coordinate Selection Solver for Sparse Learning

ShaogangRen/Thunder NeurIPS 2020

$\ell_1$ regularization has been broadly employed to pursue model sparsity.

0 stars · 01 Dec 2020
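
Thunder's contribution is its coordinate selection rule; the baseline it accelerates is textbook cyclic coordinate descent for the lasso, sketched here:

```python
# Cyclic coordinate descent for min 0.5*||y - X b||^2 + lam*||b||_1,
# using plain sweeps rather than Thunder's coordinate selection.
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=100):
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y - X @ beta                   # running residual
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * beta[j]     # remove coordinate j's contribution
            rho = X[:, j] @ r
            # soft-thresholding update for coordinate j
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * beta[j]
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))
y = X[:, 0] * 2 - X[:, 3] + 0.1 * rng.normal(size=50)
print(np.round(lasso_cd(X, y, lam=5.0), 2))   # sparse: mostly zeros
```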