Faster feature selection with a Dropping Forward-Backward algorithm

In this era of big data, feature selection techniques, which have long been proven to simplify models, make them more comprehensible, and speed up learning, have become increasingly important. Among the many methods developed, forward and stepwise feature selection regression remain widely used thanks to their simplicity and efficiency. However, both involve rescanning all the unselected features again and again. Moreover, the backward steps of the stepwise procedure are often unnecessary, as we illustrate in our example. These observations motivate us to introduce a novel algorithm that can boost speed by up to 65.77% compared to the stepwise procedure while maintaining good performance in terms of the number of selected features and error rates. Our experiments also suggest that feature selection procedures may be a better choice for high-dimensional problems in which the number of features greatly exceeds the number of samples.
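To make the rescanning cost concrete, the following is a minimal sketch of classical greedy forward selection for least-squares regression, not the paper's Dropping Forward-Backward algorithm. At every step the inner loop re-fits a model for each still-unselected feature, which is exactly the repeated rescanning the abstract refers to. All names and the toy data here are illustrative assumptions.

```python
import numpy as np

def forward_select(X, y, max_features):
    """Classical greedy forward selection (illustrative sketch, not the
    paper's method): repeatedly add the feature whose inclusion most
    reduces the least-squares residual sum of squares."""
    n, p = X.shape
    selected = []
    remaining = list(range(p))
    while remaining and len(selected) < max_features:
        best_rss, best_j = None, None
        # This inner loop rescans every un-selected feature at every step,
        # which is the repeated cost the abstract highlights.
        for j in remaining:
            cols = selected + [j]
            A = X[:, cols]
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = float(np.sum((y - A @ coef) ** 2))
            if best_rss is None or rss < best_rss:
                best_rss, best_j = rss, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Toy example: the response depends only on features 0 and 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.01 * rng.normal(size=100)
print(sorted(forward_select(X, y, 2)))  # recovers the two active features
```

Each of the `max_features` outer steps costs one least-squares fit per remaining feature, so pruning candidates early (as a "dropping" strategy would) directly reduces this dominant term.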
