Sparse Learning
43 papers with code • 3 benchmarks • 3 datasets
Libraries
Use these libraries to find Sparse Learning models and implementations.

Most implemented papers
Collaborative Preference Embedding against Sparse Labels
From the margin theory point-of-view, we then propose a generalization enhancement scheme for sparse and insufficient labels via optimizing the margin distribution.
Sparse Weight Activation Training
For ResNet-50 on ImageNet, SWAT reduces total floating-point operations (FLOPs) during training by 80%, resulting in a 3.3$\times$ training speedup when run on a simulated sparse learning accelerator representative of emerging platforms, while incurring only a 1.63% reduction in validation accuracy.
Picasso: A Sparse Learning Library for High Dimensional Data Analysis in R and Python
We describe a new library named picasso, which implements a unified framework of pathwise coordinate optimization for a variety of sparse learning problems (e.g., sparse linear regression, sparse logistic regression, sparse Poisson regression, and scaled sparse linear regression) combined with efficient active-set selection strategies.
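To illustrate the coordinate-optimization idea behind libraries like picasso, here is a minimal single-penalty lasso solver via cyclic coordinate descent with soft-thresholding. This is a generic sketch of the technique, not picasso's API.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the closed-form coordinate-wise lasso update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iters=200):
    """Cyclic coordinate descent for min_w (1/2n)||y - Xw||^2 + lam * ||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n          # per-feature curvature x_j^T x_j / n
    for _ in range(n_iters):
        for j in range(p):
            r_j = y - X @ w + X[:, j] * w[j]   # partial residual excluding feature j
            rho = X[:, j] @ r_j / n
            w[j] = soft_threshold(rho, lam) / col_sq[j]
    return w
```

Pathwise solvers run this over a decreasing grid of `lam` values, warm-starting each fit from the previous one and restricting updates to an active set of nonzero coordinates.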
Fast OSCAR and OWL Regression via Safe Screening Rules
Moreover, we prove that algorithms equipped with our screening rule are guaranteed to produce results identical to those of the original algorithms.
Event Enhanced High-Quality Image Recovery
To recover high-quality intensity images, one should address both denoising and super-resolution problems for event cameras.
Block-wise Minimization-Majorization algorithm for Huber's criterion: sparse learning and applications
Huber's criterion can be used for robust joint estimation of regression and scale parameters in the linear model.
Accelerated Gradient Methods for Sparse Statistical Learning with Nonconvex Penalties
A recent proposal generalizes Nesterov's AG method to the nonconvex setting.
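The convex baseline being generalized is Nesterov-style accelerated proximal gradient; a minimal FISTA sketch for the lasso (the standard convex algorithm, not the paper's nonconvex extension) looks like this:

```python
import numpy as np

def fista(X, y, lam, n_iters=300):
    """Accelerated proximal gradient (FISTA) for min_w (1/2n)||y - Xw||^2 + lam*||w||_1."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n           # Lipschitz constant of the smooth part
    w = np.zeros(p)
    z = np.zeros(p)                             # extrapolated (momentum) point
    t = 1.0
    for _ in range(n_iters):
        grad = X.T @ (X @ z - y) / n
        step = z - grad / L
        # Proximal step: soft-thresholding at level lam/L.
        w_new = np.sign(step) * np.maximum(np.abs(step) - lam / L, 0.0)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = w_new + (t - 1) / t_new * (w_new - w)   # Nesterov momentum update
        w, t = w_new, t_new
    return w
```

The momentum sequence gives an O(1/k^2) convergence rate in the convex case; the nonconvex setting studied in the paper requires additional safeguards.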
Thunder: a Fast Coordinate Selection Solver for Sparse Learning
L1 regularization has been broadly employed to pursue model sparsity.
KNN Classification with One-step Computation
In this paper, a one-step computation is proposed to replace the lazy part of KNN classification.
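For reference, the "lazy" baseline is standard KNN, which defers all distance computation and voting to prediction time; the sketch below shows that per-query cost (the part the paper's one-step computation replaces).

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Plain (lazy) KNN: every query pays a full pass over the training set."""
    # Pairwise squared distances between each test point and all training points.
    d = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]          # indices of the k nearest neighbors
    votes = y_train[idx]
    return np.array([np.bincount(v).argmax() for v in votes])  # majority vote
```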
Similarity Preserving Unsupervised Feature Selection based on Sparse Learning
Various feature selection methods have been recently proposed on different applications to reduce the computational burden of machine learning algorithms as well as the complexity of learned models.