Ensemble Pruning
11 papers with code • 0 benchmarks • 0 datasets
Benchmarks
These leaderboards are used to track progress in Ensemble Pruning.
Libraries
Use these libraries to find Ensemble Pruning models and implementations.
Most implemented papers
The Shapley Value of Classifiers in Ensemble Games
We argue that the Shapley value of models in these games is an effective decision metric for choosing a high performing subset of models from the ensemble.
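The paper frames ensemble members as players in a cooperative game and scores each with its Shapley value. As a hedged illustration (not the paper's implementation), the Shapley value of each classifier can be approximated by Monte Carlo sampling over random orderings, where `value(subset)` is any user-supplied coalition score such as majority-vote validation accuracy:

```python
import random

def shapley_values(models, value, n_samples=200, seed=0):
    """Monte Carlo estimate of each model's Shapley value.

    `models` is a list of ensemble members and `value(indices)` scores a
    coalition of members (e.g. majority-vote accuracy on a validation
    set). Both names are illustrative, not from the paper's code.
    """
    rng = random.Random(seed)
    n = len(models)
    phi = [0.0] * n
    for _ in range(n_samples):
        perm = list(range(n))
        rng.shuffle(perm)
        coalition = []
        prev = value(coalition)
        # Accumulate each model's marginal contribution in this ordering.
        for i in perm:
            coalition.append(i)
            cur = value(coalition)
            phi[i] += cur - prev
            prev = cur
    return [p / n_samples for p in phi]
```

A high-performing sub-ensemble can then be selected by keeping the models with the largest estimated values.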
The Shapley Value in Machine Learning
Over the last few years, the Shapley value, a solution concept from cooperative game theory, has found numerous applications in machine learning.
Robust Few-Shot Ensemble Learning with Focal Diversity-Based Pruning
This paper presents FusionShot, a focal diversity optimized few-shot ensemble learning approach for boosting the robustness and generalization performance of pre-trained few-shot models.
Ensemble Pruning based on Objection Maximization with a General Distributed Framework
Ensemble pruning, i.e. selecting a subset of individual learners from an original ensemble, alleviates the time and space costs of ensemble learning.
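The subset-selection problem described above is commonly attacked with greedy forward selection. As a minimal sketch (a generic baseline, not this paper's distributed objection-maximization framework), each step adds the member that most improves the majority-vote accuracy of the pruned ensemble on a validation set:

```python
import numpy as np

def greedy_prune(preds, y, k):
    """Greedy forward selection of k ensemble members.

    `preds` is an (n_models, n_samples) integer array of class
    predictions on a validation set and `y` the true labels; names and
    signature are illustrative assumptions.
    """
    chosen, remaining = [], list(range(len(preds)))
    for _ in range(k):
        best, best_acc = None, -1.0
        for m in remaining:
            votes = preds[chosen + [m]]
            # Majority vote over the candidate sub-ensemble, per sample.
            maj = np.apply_along_axis(
                lambda col: np.bincount(col).argmax(), 0, votes)
            acc = (maj == y).mean()
            if acc > best_acc:
                best, best_acc = m, acc
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

Greedy selection is a heuristic; exact subset selection is combinatorial, which is why pruning methods differ mainly in the objective they optimize and how they search it.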
Sub-Architecture Ensemble Pruning in Neural Architecture Search
Neural architecture search (NAS) has gained increasing attention in recent years due to its flexibility and its remarkable capability to reduce the burden of neural network design.
Boosting Ensemble Accuracy by Revisiting Ensemble Diversity Metrics
Our new metrics significantly improve the intrinsic correlation between high ensemble diversity and high ensemble accuracy.
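For context on what such diversity metrics measure: a classic baseline (not the metrics proposed in this paper) is pairwise disagreement, the average fraction of validation samples on which two ensemble members predict different labels:

```python
import numpy as np

def pairwise_disagreement(preds):
    """Average pairwise disagreement of an ensemble.

    `preds` is an (n_models, n_samples) array of class predictions.
    This is the standard disagreement baseline, included only to
    illustrate the notion of ensemble diversity.
    """
    n = len(preds)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            # Fraction of samples where members i and j differ.
            total += float((preds[i] != preds[j]).mean())
            pairs += 1
    return total / pairs
```

The paper's contribution is precisely that baselines like this correlate weakly with ensemble accuracy, motivating metrics whose correlation is stronger.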
Improving the Accuracy-Memory Trade-Off of Random Forests Via Leaf-Refinement
In this paper, we revisit ensemble pruning in the context of `modernly' trained Random Forests where trees are very large.
Conceptually Diverse Base Model Selection for Meta-Learners in Concept Drifting Data Streams
Our results show that conceptual-similarity thresholding reduces computational overhead, yet yields predictive performance comparable to thresholding on predictive performance and mutual information (MI).
Boosting Deep Ensemble Performance with Hierarchical Pruning
Evaluated using two benchmark datasets, we show that the proposed focal diversity powered hierarchical pruning can find significantly smaller ensembles of deep neural network models while achieving the same or better classification generalizability.
Autoselection of the Ensemble of Convolutional Neural Networks with Second-Order Cone Programming
Ensemble techniques are frequently encountered in machine learning and engineering problems, since they combine different models to produce an optimal predictive solution.