Feature Selection
554 papers with code • 0 benchmarks • 1 dataset
Benchmarks
These leaderboards are used to track progress in feature selection.
Libraries
Use these libraries to find feature selection models and implementations.
Latest papers
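As a point of reference for the libraries listed here, the snippet below is a minimal, illustrative feature-selection example using scikit-learn's univariate `SelectKBest`; the dataset and `k` are arbitrary choices, not tied to any paper on this page.

```python
# Illustrative only: univariate feature selection with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)          # 150 samples, 4 features
selector = SelectKBest(score_func=f_classif, k=2)
X_selected = selector.fit_transform(X, y)  # keep the 2 highest-scoring features

print(X.shape, "->", X_selected.shape)     # (150, 4) -> (150, 2)
```

`f_classif` scores each feature independently with an ANOVA F-test; embedded or wrapper methods (several appear in the papers below) instead couple selection to a specific model.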
Integrated path stability selection
This yields a tighter bound on E(FP), resulting in a feature selection criterion that has higher sensitivity in practice and is better calibrated in terms of matching the target E(FP).
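The paper builds on stability selection, where E(FP) denotes the expected number of falsely selected features. As background, here is a minimal sketch of classical stability selection (repeated subsampling plus a sparse base learner), not the paper's integrated-path variant; the Lasso penalty, subsample count, and selection threshold are illustrative values.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic data: only the first 3 of 20 features are informative.
n, p = 200, 20
X = rng.normal(size=(n, p))
y = X[:, 0] + X[:, 1] - X[:, 2] + 0.1 * rng.normal(size=n)

n_subsamples, alpha, threshold = 50, 0.1, 0.6
counts = np.zeros(p)
for _ in range(n_subsamples):
    idx = rng.choice(n, size=n // 2, replace=False)   # random half of the data
    coef = Lasso(alpha=alpha).fit(X[idx], y[idx]).coef_
    counts += coef != 0                               # record surviving features

frequencies = counts / n_subsamples                   # selection frequency per feature
selected = np.flatnonzero(frequencies >= threshold)
print(selected)
```

Features kept only sporadically across subsamples fall below the threshold; bounds on E(FP) come from relating that threshold to the average number of features the base learner selects.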
A Lightweight Attention-based Deep Network via Multi-Scale Feature Fusion for Multi-View Facial Expression Recognition
On the other hand, the PWFS block employs a feature selection mechanism that discards less meaningful features prior to the fusion process.
Explaining deep learning models for ozone pollution prediction via embedded feature selection
Additionally, we tackle the feature selection problem of identifying the most relevant features and periods for prediction accuracy by introducing a novel method, the Time Selection Layer, into deep learning models. This layer significantly improves model performance, reduces complexity, and enhances interpretability.
Non-negative Contrastive Learning
In this paper, we propose Non-negative Contrastive Learning (NCL), a renaissance of Non-negative Matrix Factorization (NMF) aimed at deriving interpretable features.
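For context on the NMF component that NCL revisits, here is a minimal sketch of plain non-negative matrix factorization with scikit-learn; the matrix sizes and rank are arbitrary, and this is standard NMF, not the NCL method itself.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((100, 12))                 # non-negative data matrix

model = NMF(n_components=4, init="random", random_state=0, max_iter=500)
W = model.fit_transform(X)                # per-sample loadings, X ≈ W @ H
H = model.components_                     # basis features

# Non-negativity of both factors is what makes them read as additive parts.
assert (W >= 0).all() and (H >= 0).all()
print(W.shape, H.shape)                   # (100, 4) (4, 12)
```

Because no cancellation between components is possible, each basis row of H tends to capture an interpretable "part" of the data, which is the property NCL aims to carry over to contrastive representations.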
ERASE: Benchmarking Feature Selection Methods for Deep Recommender Systems
Secondly, the existing literature lacks detailed analysis of selection attributes based on large-scale datasets, as well as a thorough comparison among selection techniques and DRS backbones, which restricts the generalizability of findings and impedes deployment on DRS.
Iterative Feature Boosting for Explainable Speech Emotion Recognition
In speech emotion recognition (SER), using pre-defined features without considering their practical importance may lead to high-dimensional datasets, including redundant and irrelevant information.
IGANN Sparse: Bridging Sparsity and Interpretability with Non-linear Insight
In this paper, we propose IGANN Sparse, a novel machine learning model from the family of generalized additive models, which promotes sparsity through a non-linear feature selection process during training.
Open Continual Feature Selection via Granular-Ball Knowledge Transfer
To this end, the proposed CFS method combines the strengths of continual learning (CL) with granular-ball computing (GBC), which focuses on constructing a granular-ball knowledge base to detect unknown classes and facilitate the transfer of previously learned knowledge for further feature selection.
Bridging Domains with Approximately Shared Features
Under our framework, we design and analyze a learning procedure consisting of learning approximately shared feature representation from source tasks and fine-tuning it on the target task.
RealNet: A Feature Selection Network with Realistic Synthetic Anomaly for Anomaly Detection
Self-supervised feature reconstruction methods have shown promising advances in industrial image anomaly detection and localization.