AutoML

Automatic Search for Parsimonious Models

The principle of parsimony, also known as Occam's razor, states that when several explanations account equally well for the observations, the simplest one should be preferred. In other words, the preferred explanation is "the assumption that is both the simplest and contains all the necessary information required to comprehend the experiment at hand." This principle applies in many everyday scenarios, including prediction with Data Science models.

It is widely recognized that a less complex model produces more stable predictions, is more robust to noise and perturbations, and is easier to maintain and analyze. Reducing the number of features also yields further cost savings: fewer sensors, lower energy consumption, lower information-acquisition and maintenance costs, and less need to retrain models when features fluctuate due to noise, outliers, data drift, and the like.

The simultaneous optimization of hyperparameters (HO) and feature selection (FS) to achieve Parsimonious Model Selection (PMS) is an active area of research. However, choosing suitable hyperparameters and feature subsets at the same time is a hard combinatorial problem, so efficient heuristic methods, such as evolutionary search, are frequently required; a sketch of this idea follows.
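
As an illustration, the sketch below jointly searches a feature mask and model hyperparameters with a simple genetic algorithm, penalizing fitness by the number of selected features so that smaller (more parsimonious) models are preferred. This is a minimal sketch, not any particular library's implementation: the dataset, the SVC model, the penalty weight, and helper names such as `fitness` and `mutate` are illustrative assumptions.

```python
"""Minimal sketch of joint hyperparameter optimization (HO) and
feature selection (FS) with a simple genetic algorithm. All names
and constants here are illustrative assumptions."""
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask, log_c, log_gamma):
    """CV accuracy minus a small penalty on the number of selected
    features, so the search favors parsimonious subsets."""
    if mask.sum() == 0:
        return -np.inf
    model = SVC(C=10.0 ** log_c, gamma=10.0 ** log_gamma)
    score = cross_val_score(model, X[:, mask], y, cv=3).mean()
    return score - 0.002 * mask.sum()  # parsimony penalty (assumed weight)

def random_individual():
    mask = rng.random(n_features) < 0.5  # random feature subset
    log_c = rng.uniform(-2, 3)           # C in [1e-2, 1e3], log scale
    log_gamma = rng.uniform(-4, 0)       # gamma in [1e-4, 1], log scale
    return mask, log_c, log_gamma

def mutate(ind):
    mask, log_c, log_gamma = ind
    mask = mask.copy()
    flip = rng.random(n_features) < 0.05  # flip ~5% of feature bits
    mask[flip] = ~mask[flip]
    return mask, log_c + rng.normal(0, 0.3), log_gamma + rng.normal(0, 0.3)

# Simple truncation-selection evolutionary loop.
population = [random_individual() for _ in range(12)]
for generation in range(15):
    scored = sorted(population, key=lambda ind: fitness(*ind), reverse=True)
    parents = scored[:4]
    population = parents + [mutate(parents[i % 4]) for i in range(8)]

best = max(population, key=lambda ind: fitness(*ind))
print(f"selected {best[0].sum()} of {n_features} features, "
      f"C={10 ** best[1]:.3g}, gamma={10 ** best[2]:.3g}")
```

In this setup the penalty term acts as the parsimony pressure: two candidates with similar cross-validation accuracy are ranked by how few features they use, which is the core idea behind heuristic PMS approaches.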

Tasks

| Task | Papers | Share |
|------|--------|-------|
| Feature Engineering | 1 | 50.00% |
| Bayesian Optimization | 1 | 50.00% |