Feature Selection
549 papers with code • 0 benchmarks • 1 dataset
Benchmarks
These leaderboards are used to track progress in feature selection.
Libraries
Use these libraries to find feature selection models and implementations.
Latest papers
PEACH: Pretrained-embedding Explanation Across Contextual and Hierarchical Structure
In this work, we propose a novel tree-based explanation technique, PEACH (Pretrained-embedding Explanation Across Contextual and Hierarchical Structure), that can explain how text-based documents are classified by using any pretrained contextual embeddings in a tree-based human-interpretable manner.
ALICE: Combining Feature Selection and Inter-Rater Agreeability for Machine Learning Insights
This paper presents a new Python library called Automated Learning for Insightful Comparison and Evaluation (ALICE), which merges conventional feature selection and the concept of inter-rater agreeability in a simple, user-friendly manner to seek insights into black box Machine Learning models.
Quiver Laplacians and Feature Selection
The challenge of selecting the most relevant features of a given dataset arises ubiquitously in data analysis and dimensionality reduction.
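To make the task concrete: one of the simplest forms of feature selection is a univariate filter that ranks features by their correlation with the target and keeps the top k. The sketch below is a generic illustration of that idea, not the quiver-Laplacian method of this paper; all variable names (`X`, `y`, `k`) and the synthetic data are our own.

```python
import numpy as np

# Toy illustration of filter-style feature selection: rank features by
# absolute Pearson correlation with the target and keep the top k.
rng = np.random.default_rng(0)
n, p, k = 200, 10, 3
X = rng.normal(size=(n, p))
# The target depends only on features 0, 1, 2; the rest are noise.
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + X[:, 2] + 0.1 * rng.normal(size=n)

# Absolute correlation of each column with y.
scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
selected = np.argsort(scores)[::-1][:k]
print(sorted(selected.tolist()))  # → [0, 1, 2], the informative features
```

Univariate filters like this are fast but ignore interactions between features, which is precisely the gap that the more structured methods on this page aim to close.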
The CAST package for training and assessment of spatial prediction models in R
One key task in environmental science is to map environmental variables continuously in space or even in space and time.
Exhaustive Exploitation of Nature-inspired Computation for Cancer Screening in an Ensemble Manner
This study presents a framework termed Evolutionary Optimized Diverse Ensemble Learning (EODE) to improve ensemble learning for cancer classification from gene expression data.
DeepLINK-T: deep learning inference for time series data using knockoffs and LSTM
DeepLINK-T combines deep learning with knockoff inference to control the false discovery rate (FDR) in feature selection for time series models, accommodating a wide variety of feature distributions.
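The core knockoff idea can be sketched in a few lines. This is NOT the DeepLINK-T method: as a crude stand-in for proper knockoff construction we permute each column independently (breaking its link to the target while preserving its marginal distribution), and we use a fixed threshold rather than the data-dependent threshold that gives formal FDR control. All names and the threshold value are our own.

```python
import numpy as np

# Crude sketch of the knockoff principle: a feature is kept only if its
# importance clearly exceeds that of its "knockoff" (null) copy.
rng = np.random.default_rng(1)
n, p = 500, 8
X = rng.normal(size=(n, p))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 0.2 * rng.normal(size=n)

# Pseudo-knockoffs: permute each column to destroy its association with y.
X_knock = np.column_stack([rng.permutation(X[:, j]) for j in range(p)])

def importance(Z):
    # Simple importance measure: absolute correlation with the target.
    return np.abs([np.corrcoef(Z[:, j], y)[0, 1] for j in range(p)])

# Knockoff statistic: real importance minus knockoff importance.
W = importance(X) - importance(X_knock)
selected = np.where(W > 0.2)[0]  # threshold chosen for illustration only
print(selected.tolist())  # → [0, 1]
```

In the real knockoff filter, the threshold on `W` is computed from the statistics themselves so that the FDR is provably bounded; DeepLINK-T additionally learns the importance measure with an LSTM to handle temporal dependence.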
Integrated path stability selection
This yields a tighter bound on E(FP), resulting in a feature selection criterion that has higher sensitivity in practice and is better calibrated in terms of matching the target E(FP).
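For context, plain stability selection (the baseline that integrated path stability selection tightens) repeatedly runs a base selector on random subsamples and keeps features that are chosen in a large fraction of runs. The sketch below uses a top-k correlation filter as the base selector for simplicity; the classical method uses the lasso, and the integrated variant and its E(FP) bound are not shown. Thresholds and names are our own.

```python
import numpy as np

# Minimal sketch of plain stability selection: count how often each
# feature is selected across B random half-subsamples of the data.
rng = np.random.default_rng(2)
n, p, k, B = 300, 12, 3, 50
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] - X[:, 2] + 0.5 * rng.normal(size=n)

freq = np.zeros(p)
for _ in range(B):
    idx = rng.choice(n, size=n // 2, replace=False)  # random half of the data
    scores = np.abs([np.corrcoef(X[idx, j], y[idx])[0, 1] for j in range(p)])
    freq[np.argsort(scores)[::-1][:k]] += 1.0 / B  # tally selections

stable = np.where(freq >= 0.6)[0]  # keep features selected in >= 60% of runs
print(stable.tolist())  # → [0, 1, 2]
```

The selection frequency `freq` is what the theory bounds: the expected number of false positives E(FP) can be controlled by the choice of frequency threshold, and the paper's integrated version makes that bound tighter.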
A Lightweight Attention-based Deep Network via Multi-Scale Feature Fusion for Multi-View Facial Expression Recognition
On the other hand, the PWFS block employs a feature selection mechanism that discards less meaningful features prior to the fusion process.
Explaining deep learning models for ozone pollution prediction via embedded feature selection
Additionally, we tackle the feature selection problem to identify the most relevant features and periods that contribute to prediction accuracy. To this end, we introduce a novel method, the Time Selection Layer, for deep learning models; it significantly improves model performance, reduces complexity, and enhances interpretability.
Non-negative Contrastive Learning
In this paper, we propose Non-negative Contrastive Learning (NCL), a renaissance of Non-negative Matrix Factorization (NMF) aimed at deriving interpretable features.
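The NMF that NCL revisits can be sketched with the classical Lee–Seung multiplicative updates. This is the standard NMF algorithm, not NCL itself; the matrix sizes, rank, and iteration count below are arbitrary choices for illustration.

```python
import numpy as np

# Classical NMF via multiplicative updates: factorize a non-negative
# matrix V ≈ W @ H with W, H >= 0. Non-negativity is what makes the
# learned features parts-based and interpretable.
rng = np.random.default_rng(3)
V = rng.random((20, 10))   # non-negative data matrix
r = 4                      # number of latent features
W = rng.random((20, r))
H = rng.random((r, 10))

eps = 1e-10  # guards against division by zero
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)   # multiplicative update for H
    W *= (V @ H.T) / (W @ H @ H.T + eps)   # multiplicative update for W

# Relative reconstruction error of the rank-r factorization.
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(round(err, 2))
```

Because the updates only multiply by non-negative ratios, `W` and `H` stay non-negative throughout; NCL transfers this non-negativity constraint into the contrastive learning setting to get similarly interpretable representations.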