no code implementations • 6 Apr 2024 • Tin Barisin, Illia Horenko
Convolutional neural networks (CNNs) are reported to be overparametrized.
no code implementations • 29 Oct 2023 • Edoardo Vecchi, Davide Bassetti, Fabio Graziato, Lukas Pospisil, Illia Horenko
As a potential solution to this problem, we exploit the idea of reducing and rotating the feature space in a lower-dimensional gauge and propose the Gauge-Optimal Approximate Learning (GOAL) algorithm, which provides an analytically tractable joint solution to the dimension reduction, feature segmentation, and classification problems in the small-data regime.
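The "reduce and rotate" idea can be illustrated with a minimal sketch. The assumptions here are mine: GOAL solves the reduction, segmentation, and classification problems *jointly* with an analytically tractable criterion, whereas this toy simply chains a PCA rotation (the "gauge") with a least-squares classifier, so it shows the flavor of the pipeline, not the algorithm itself.

```python
import numpy as np

def reduce_rotate_classify(X, y, k):
    """Rotate features into their principal-axes gauge, keep k dimensions,
    and fit a linear classifier there. Illustration only: GOAL couples
    these steps jointly; this sketch chains PCA + least squares."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    R = Vt[:k]                      # rotation into the reduced gauge
    Z = Xc @ R.T                    # rotated, reduced features
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return R, beta

# Synthetic small-data example: a hidden 1-D signal embedded in 10 features.
rng = np.random.default_rng(1)
t = 3.0 * rng.standard_normal((40, 1))           # hidden low-dimensional signal
d = np.ones((1, 10)) / np.sqrt(10)               # its direction in feature space
X = t @ d + 0.05 * rng.standard_normal((40, 10))
y = np.sign(t[:, 0])

R, beta = reduce_rotate_classify(X, y, k=1)
alignment = abs(float(R[0] @ d[0]))              # ~1: the gauge recovers the signal axis
pred = np.sign((X - X.mean(axis=0)) @ R.T @ beta)
acc = float(np.mean(pred == y))
```

Because the data vary almost entirely along one direction, the one-dimensional gauge recovers that axis and the classifier fitted in it separates the labels well.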
no code implementations • 19 Oct 2023 • Illia Horenko
Simply-verifiable mathematical conditions for the existence, uniqueness, and explicit analytical computation of minimal adversarial paths (MAP) and minimal adversarial distances (MAD) are formulated and proven for (locally) uniquely-invertible classifiers, for generalized linear models (GLMs), and for entropic AI (EAI).
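The simplest case in which such closed forms exist is a linear decision boundary, where the MAD is the orthogonal distance to the hyperplane and the MAP is the straight segment along the weight vector. The sketch below shows only this textbook special case; the paper's results cover broader classifier classes (invertible classifiers, GLMs, EAI) that are not reproduced here.

```python
import numpy as np

def minimal_adversarial(x, w, b):
    """Closed-form MAD and MAP endpoint for a linear boundary w.x + b = 0.
    Illustrative special case only, not the paper's general construction."""
    margin = np.dot(w, x) + b
    norm_w = np.linalg.norm(w)
    mad = abs(margin) / norm_w                 # shortest distance to the boundary
    x_boundary = x - (margin / norm_w**2) * w  # orthogonal projection: MAP endpoint
    return mad, x_boundary

w = np.array([3.0, 4.0])   # hypothetical weights, ||w|| = 5
b = -1.0
x = np.array([2.0, 1.0])   # margin w.x + b = 9
mad, x_star = minimal_adversarial(x, w, b)
print(mad)                       # 1.8
print(np.dot(w, x_star) + b)     # ~0: the endpoint lies on the boundary
```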
no code implementations • 17 Jun 2023 • Illia Horenko, Lukas Pospisil
In many data science applications, the objective is to extract appropriately ordered, smooth, low-dimensional data patterns from high-dimensional data sets.
no code implementations • 22 Dec 2021 • Illia Horenko
Entropic Outlier Sparsification (EOS) is proposed as a robust computational strategy for the detection of data anomalies in a broad class of learning methods, including unsupervised problems (such as the detection of non-Gaussian outliers in mostly-Gaussian data) and supervised learning with mislabeled data.
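A minimal sketch of the entropy-regularized weighting idea behind this strategy: given per-point losses, minimizing a weighted loss plus an entropy term over the probability simplex has a closed-form softmax solution that exponentially down-weights high-loss points. The per-point losses and the regularization parameter `eps` here are my assumptions; this is not claimed to be the paper's exact EOS formulation.

```python
import numpy as np

def entropic_weights(errors, eps):
    """Minimize sum_i w_i*e_i + eps*sum_i w_i*log(w_i) over the simplex.
    Closed-form solution: softmax of -errors/eps, so points with large
    errors (candidate outliers) receive exponentially small weight.
    (Sketch of entropy-regularized weighting; `errors` and `eps` assumed.)"""
    z = -np.asarray(errors, dtype=float) / eps
    z -= z.max()                  # shift for numerical stability
    w = np.exp(z)
    return w / w.sum()

errors = np.array([0.1, 0.1, 0.1, 5.0])   # last point is an outlier
w = entropic_weights(errors, eps=0.5)
print(w)   # outlier weight ~ e^{-9.8} relative to the inliers
```

Small `eps` sparsifies aggressively (near hard rejection of outliers); large `eps` approaches uniform weights.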
no code implementations • 8 Feb 2020 • Illia Horenko
Overfitting and the treatment of "small data" are among the most challenging problems in machine learning (ML): they arise when a relatively small data statistics size $T$ is not enough to provide a robust ML fit for a relatively large data feature dimension $D$.
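The $T \ll D$ failure mode is easy to demonstrate: with more features than samples, even pure noise labels can be fitted exactly, so a vanishing training error says nothing about generalization. The sizes below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, D = 20, 200                        # few samples, many features: small-data regime
X = rng.standard_normal((T, D))
y = rng.standard_normal(T)            # pure noise: there is nothing to learn
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # minimum-norm interpolating fit
train_err = np.mean((X @ beta - y) ** 2)
print(train_err)                      # ~0: the model reproduces the noise exactly

X_test = rng.standard_normal((T, D))
y_test = rng.standard_normal(T)
test_err = np.mean((X_test @ beta - y_test) ** 2)
print(test_err)                       # large: no generalization at all
```

With $D > T$ the linear system is underdetermined, so least squares interpolates the training noise perfectly while the test error stays on the order of the label variance.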