no code implementations • 22 Feb 2024 • Andrei V. Konstantinov, Lev V. Utkin
The first idea behind the combination is to form constraints for a joint probability distribution over all combinations of concept values such that the expert rules are satisfied.
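As a minimal sketch of how an expert rule becomes a linear constraint on the joint distribution (the two binary concepts and the rule "c1 implies c2" below are illustrative assumptions, not taken from the paper):

```python
import numpy as np
from itertools import product

# Hypothetical example: two binary concepts c1, c2 and the expert rule
# "c1 = 1 implies c2 = 1".  The rule is a linear constraint on the joint
# distribution p over all concept-value combinations: the probability
# mass on every combination violating the rule must be zero.
combos = list(product([0, 1], repeat=2))          # (c1, c2): 00, 01, 10, 11
violates = [c1 == 1 and c2 == 0 for c1, c2 in combos]

def satisfies_rule(p, tol=1e-9):
    """p is a probability vector over `combos`; the rule holds iff the
    total mass on violating combinations is (numerically) zero."""
    p = np.asarray(p, float)
    return abs(p.sum() - 1.0) < tol and p[violates].sum() < tol

ok = satisfies_rule([0.3, 0.2, 0.0, 0.5])   # no mass on (c1=1, c2=0)
bad = satisfies_rule([0.3, 0.2, 0.1, 0.4])  # mass 0.1 on the violation
```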
1 code implementation • 19 Feb 2024 • Andrei V. Konstantinov, Stanislav R. Kirpichenko, Lev V. Utkin
A new model for generating survival trajectories and survival data, based on an autoencoder of a specific structure, is proposed.
1 code implementation • 29 Jan 2024 • Andrei V. Konstantinov, Boris V. Kozlov, Stanislav R. Kirpichenko, Lev V. Utkin
A new approach to local and global explanation is proposed.
1 code implementation • 11 Dec 2023 • Lev V. Utkin, Danila Y. Eremenko, Andrei V. Konstantinov
A new method called the Survival Beran-based Neural Importance Model (SurvBeNIM) is proposed.
1 code implementation • 7 Aug 2023 • Lev V. Utkin, Danila Y. Eremenko, Andrei V. Konstantinov
For every generated example, the survival function of the black-box model is computed, and the survival function of the surrogate model (the Beran estimator) is constructed as a function of the explanation coefficients.
no code implementations • 19 Jul 2023 • Andrei V. Konstantinov, Lev V. Utkin
A new computationally simple method of imposing hard convex constraints on the neural network output values is proposed.
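One common way to impose hard convex constraints on network outputs, shown here only as a hedged sketch and not as the paper's method, is to parameterize the output as a softmax-weighted convex combination of the constraint set's vertices, which lies inside the set for any unconstrained input:

```python
import numpy as np

def constrain_to_polytope(z, vertices):
    """Map an unconstrained network output z (shape [k]) into the convex
    polytope spanned by `vertices` (shape [k, d]) via a softmax-weighted
    convex combination of the vertices.  The result satisfies the hard
    convex constraints for any value of z."""
    w = np.exp(z - z.max())
    w = w / w.sum()                    # softmax -> convex weights
    return w @ vertices                # a point inside the polytope

# Example: constrain a 2-D output to the triangle with these vertices.
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
y = constrain_to_polytope(np.array([2.0, -1.0, 0.5]), verts)
```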
1 code implementation • 12 Apr 2023 • Andrei V. Konstantinov, Lev V. Utkin, Alexey A. Lukashin, Vladimir A. Muliukha
The main idea behind the proposed NAF model is to introduce the attention mechanism into the random forest: attention weights, calculated by neural networks of a specific form, are assigned to the data in the leaves of the decision trees and to the random forest itself, within the framework of Nadaraya-Watson kernel regression.
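A rough sketch of the underlying Nadaraya-Watson mechanism, with a fixed Gaussian kernel standing in for NAF's trainable neural-network weights:

```python
import numpy as np

def nadaraya_watson(x, X_train, y_train, tau=1.0):
    """Nadaraya-Watson regression: the prediction is a weighted mean of
    the training targets, with softmax ("attention") weights based on the
    distance to x.  In NAF these weights come from trainable neural
    networks attached to tree leaves; the Gaussian kernel here is only a
    stand-in."""
    scores = -np.sum((X_train - x) ** 2, axis=1) / tau
    w = np.exp(scores - scores.max())
    w /= w.sum()                       # attention weights, sum to 1
    return w @ y_train

X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 2.0])
pred = nadaraya_watson(np.array([1.0]), X, y, tau=0.1)
```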
1 code implementation • 15 Mar 2023 • Andrei V. Konstantinov, Lev V. Utkin
A new, extremely simple ensemble-based model is proposed that uses uniformly generated axis-parallel hyper-rectangles as base models (HRBMs).
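A hedged sketch of the hyper-rectangle idea, under the assumption that the rectangles are combined by gradient boosting of constants fitted to the current residuals (the paper's actual ensemble scheme may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_hrbm_ensemble(X, y, n_rect=200, lr=0.5):
    """Boosting with uniformly generated axis-parallel hyper-rectangles
    as base models.  Each step draws a random rectangle inside the data
    bounding box and fits a constant to the residuals of the points that
    fall inside it."""
    lo, hi = X.min(0), X.max(0)
    pred = np.zeros(len(y))
    rects = []
    for _ in range(n_rect):
        a, b = rng.uniform(lo, hi), rng.uniform(lo, hi)
        a, b = np.minimum(a, b), np.maximum(a, b)
        inside = np.all((X >= a) & (X <= b), axis=1)
        if not inside.any():
            continue
        v = lr * (y - pred)[inside].mean()   # constant fitted to residuals
        pred[inside] += v
        rects.append((a, b, v))
    return rects, pred

X = rng.uniform(0, 1, size=(200, 2))
y = X[:, 0] + X[:, 1]
rects, pred = fit_hrbm_ensemble(X, y)
mse = np.mean((pred - y) ** 2)
```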
no code implementations • 13 Feb 2023 • Andrei V. Konstantinov, Lev V. Utkin
The whole STE-MIL model, including soft decision trees, neural networks, the attention mechanism and a classifier, is trained in an end-to-end manner.
1 code implementation • 19 Nov 2022 • Stanislav R. Kirpichenko, Lev V. Utkin, Andrei V. Konstantinov
Instead of the typical kernel functions in the Beran estimator, it is proposed to implement the kernels as neural networks of a specific form, called neural kernels.
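The Beran estimator itself can be sketched as follows; the Gaussian kernel weights below are a stand-in for the paper's trainable neural kernels:

```python
import numpy as np

def beran_survival(x, X, times, events, t, tau=1.0):
    """Beran (conditional Kaplan-Meier) estimate of S(t | x) with softmax
    Gaussian kernel weights.  In the cited work these kernels are replaced
    by neural networks ("neural kernels"); the Gaussian kernel is only a
    stand-in."""
    order = np.argsort(times)
    times, events, X = times[order], events[order], X[order]
    scores = -np.sum((X - x) ** 2, axis=1) / tau
    w = np.exp(scores - scores.max())
    w /= w.sum()
    surv = 1.0
    cum = 0.0                  # accumulated weight of earlier event times
    for wi, ti, di in zip(w, times, events):
        if ti > t:
            break
        if di and cum < 1.0:
            surv *= 1.0 - wi / (1.0 - cum)
        cum += wi
    return surv

X = np.array([[0.0], [0.5], [1.0]])
times = np.array([1.0, 2.0, 3.0])
events = np.array([1, 1, 1])
s = beran_survival(np.array([0.0]), X, times, events, t=1.5, tau=10.0)
```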
1 code implementation • 11 Oct 2022 • Andrei V. Konstantinov, Lev V. Utkin
The first idea behind the models is to introduce two-level attention, where one of the levels is the "leaf" attention: the attention mechanism is applied to every leaf of every tree.
1 code implementation • 5 Oct 2022 • Lev V. Utkin, Andrey Y. Ageev, Andrei V. Konstantinov
A new modification of Isolation Forest called Attention-Based Isolation Forest (ABIForest) for solving the anomaly detection problem is proposed.
1 code implementation • 19 Jul 2022 • Andrei V. Konstantinov, Stanislav R. Kirpichenko, Lev V. Utkin
The network is trained on controls and replaces the standard kernels with a set of neural subnetworks with shared parameters, such that every subnetwork implements a trainable kernel while the whole network implements the Nadaraya-Watson estimator.
no code implementations • 9 Jul 2022 • Lev V. Utkin, Andrei V. Konstantinov
New models of random forests jointly using the attention and self-attention mechanisms are proposed for solving the regression problem.
no code implementations • 8 Jan 2022 • Lev V. Utkin, Andrei V. Konstantinov
A new approach called ABRF (the attention-based random forest), together with its modifications, is proposed for applying the attention mechanism to the random forest (RF) for regression and classification.
no code implementations • 11 Dec 2021 • Andrei V. Konstantinov, Lev V. Utkin
In the method, one attention module takes into account adjacent patches or instances; several attention modules are used to obtain a diverse feature representation of the patches; and one attention module unites the different feature representations to provide an accurate classification of each patch (instance) and of the whole bag.
1 code implementation • 10 Aug 2021 • Andrei V. Konstantinov, Lev V. Utkin
The first part is a set of one-feature neural subnetworks, each of which aims to produce a specific representation of its feature in the form of a basis of shape functions.
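A sketch of the additive shape-function idea, with a fixed polynomial basis standing in for the one-feature neural subnetworks:

```python
import numpy as np

def fit_additive_basis(X, y, degree=3):
    """Additive model sketch: each feature is expanded in a small basis of
    shape functions (fixed polynomials here; in the paper, one-feature
    neural subnetworks produce the basis), and the model is the sum of
    the per-feature shape functions, fitted jointly by least squares."""
    n, d = X.shape
    B = np.hstack([X[:, [j]] ** p for j in range(d)
                   for p in range(1, degree + 1)])
    B = np.hstack([np.ones((n, 1)), B])       # intercept column
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return coef, B

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (100, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2            # additive ground truth
coef, B = fit_additive_basis(X, y)
resid = y - B @ coef
```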
1 code implementation • 16 Jun 2021 • Lev V. Utkin, Andrei V. Konstantinov, Kirill A. Vishniakov
One of the most popular methods for explaining machine learning predictions is the SHapley Additive exPlanations (SHAP) method.
no code implementations • 18 Apr 2021 • Lev V. Utkin, Egor D. Satyukov, Andrei V. Konstantinov
The proposed method, SurvNAM, allows both local and global explanations to be performed.
no code implementations • 4 Mar 2021 • Lev V. Utkin, Andrei V. Konstantinov
According to the first modification, called ER-SHAP, several features are randomly selected many times from the feature set, and the Shapley values of these features are computed by means of "small" SHAPs.
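The sampling idea behind such approximations can be sketched with the standard permutation-based Monte Carlo Shapley estimator (ER-SHAP itself combines subset sampling with exact "small" SHAPs; this sketch keeps only the sampling part, on a toy additive game):

```python
import random

def shapley_sampling(value, n_features, n_perm=300, seed=0):
    """Monte Carlo Shapley estimate: average each feature's marginal
    contribution over random feature orderings."""
    rng = random.Random(seed)
    phi = [0.0] * n_features
    feats = list(range(n_features))
    for _ in range(n_perm):
        rng.shuffle(feats)
        coalition = set()
        base = value(coalition)
        for f in feats:
            coalition.add(f)
            v = value(coalition)
            phi[f] += v - base       # marginal contribution of f
            base = v
    return [p / n_perm for p in phi]

# Toy additive game: a coalition's value is the sum of fixed payoffs,
# so the exact Shapley values are the payoffs themselves.
payoff = [1.0, 2.0, 3.0]
phi = shapley_sampling(lambda S: sum(payoff[i] for i in S), 3)
```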
1 code implementation • 14 Oct 2020 • Andrei V. Konstantinov, Lev V. Utkin
In contrast to the neural additive model, the method provides the feature weights in explicit form, and it is simple to train.
no code implementations • 12 Oct 2020 • Andrei V. Konstantinov, Lev V. Utkin
The main idea behind the approach is to use the stacking algorithm in order to learn a second-level meta-model which can be regarded as a model for implementing various ensembles of gradient boosting models.
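A minimal stacking sketch, with toy base models (a constant and a least-squares line) standing in for the gradient boosting models of the paper; the meta-model is fitted on held-out base-model predictions:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, (200, 1))
y = 3.0 * X[:, 0] + rng.normal(0, 0.01, 200)

# First level: two toy base models fitted on the first half of the data.
mean_model = lambda Xq, c=y[:100].mean(): np.full(len(Xq), c)
a, b = np.polyfit(X[:100, 0], y[:100], 1)
line_model = lambda Xq: a * Xq[:, 0] + b

# Second level: base-model predictions on the held-out half become the
# features of a linear meta-model (plain least squares).
Z = np.column_stack([mean_model(X[100:]), line_model(X[100:])])
A = np.column_stack([Z, np.ones(len(Z))])
w, *_ = np.linalg.lstsq(A, y[100:], rcond=None)

def stacked_predict(Xq):
    Zq = np.column_stack([mean_model(Xq), line_model(Xq)])
    return Zq @ w[:2] + w[2]

# Fit quality of the stacked model on the meta-training half.
mse = np.mean((stacked_predict(X[100:]) - y[100:]) ** 2)
```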
no code implementations • 26 Jun 2020 • Maxim S. Kovalev, Lev V. Utkin
A method for counterfactual explanation of machine learning survival models is proposed.
no code implementations • 19 Jun 2020 • Andrei V. Konstantinov, Lev V. Utkin
The gradient boosting machine is a powerful ensemble-based machine learning method for solving regression problems.
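A minimal sketch of gradient boosting for squared-error regression, with decision stumps as base learners fitted to the current residuals:

```python
import numpy as np

def fit_stump(X, r):
    """Best single-split regression stump for residuals r (squared error)."""
    best = (np.inf, 0, 0.0, r.mean(), r.mean())
    for j in range(X.shape[1]):
        for s in np.unique(X[:, j]):
            left = X[:, j] <= s
            if left.all() or not left.any():
                continue
            lv, rv = r[left].mean(), r[~left].mean()
            sse = ((r[left] - lv) ** 2).sum() + ((r[~left] - rv) ** 2).sum()
            if sse < best[0]:
                best = (sse, j, s, lv, rv)
    return best[1:]

def gradient_boost(X, y, n_iter=50, lr=0.3):
    """For squared error the negative gradient is the residual, so each
    stump is fitted to the residuals and added with step size lr."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_iter):
        j, s, lv, rv = fit_stump(X, y - pred)
        pred += lr * np.where(X[:, j] <= s, lv, rv)
        stumps.append((j, s, lv, rv))
    return pred, stumps

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, (120, 1))
y = np.sin(4 * X[:, 0])
pred, _ = gradient_boost(X, y)
mse = np.mean((pred - y) ** 2)
```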
no code implementations • 5 May 2020 • Lev V. Utkin, Maxim S. Kovalev, Ernest M. Kasimov
A new modification of the explanation method SurvLIME called SurvLIME-Inf for explaining machine learning survival models is proposed.
no code implementations • 5 May 2020 • Maxim S. Kovalev, Lev V. Utkin
As a result, a robust maximin strategy is used: the average distance between the cumulative hazard functions of the explained black-box model and of the approximating Cox model is minimized, while the distance is maximized over all cumulative hazard functions in the interval produced by the Kolmogorov-Smirnov bounds.
no code implementations • 18 Mar 2020 • Maxim S. Kovalev, Lev V. Utkin, Ernest M. Kasimov
A new method called SurvLIME for explaining machine learning survival models is proposed.
no code implementations • 18 Nov 2019 • Lev V. Utkin, Maxim S. Kovalev, Ernest M. Kasimov
A new method for explaining the Siamese neural network is proposed.
no code implementations • 9 Sep 2019 • Lev V. Utkin, Mikhail V. Kots, Viacheslav S. Chukanov
A new meta-algorithm for estimating the conditional average treatment effects is proposed in the paper.
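As context, the simplest meta-algorithm for this task is the T-learner; the sketch below, with least-squares lines as outcome models, is a standard baseline and not the paper's proposed method:

```python
import numpy as np

def t_learner_cate(X, y, treated):
    """T-learner: fit one outcome model on the treated group and one on
    the controls, and estimate the conditional average treatment effect
    as the difference of their predictions."""
    def fit_line(Xg, yg):
        A = np.column_stack([Xg, np.ones(len(Xg))])
        w, *_ = np.linalg.lstsq(A, yg, rcond=None)
        return lambda Xq: np.column_stack([Xq, np.ones(len(Xq))]) @ w
    mu1 = fit_line(X[treated], y[treated])
    mu0 = fit_line(X[~treated], y[~treated])
    return lambda Xq: mu1(Xq) - mu0(Xq)

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, (400, 1))
treated = rng.random(400) < 0.5
# Synthetic data: the true treatment effect is +2 everywhere.
y = X[:, 0] + 2.0 * treated + rng.normal(0, 0.05, 400)
cate = t_learner_cate(X, y, treated)(X)
```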
no code implementations • 4 Jan 2019 • Lev V. Utkin, Andrei V. Konstantinov, Viacheslav S. Chukanov, Mikhail V. Kots, Anna A. Meldo
The idea underlying the modification is very simple and stems from the confidence screening mechanism proposed by Pang et al., which simplifies the Deep Forest classifier by updating the training set at each level in accordance with the classification accuracy of every training instance.
no code implementations • 1 Jan 2019 • Lev V. Utkin, Andrei V. Konstantinov, Viacheslav S. Chukanov, Mikhail V. Kots, Mikhail A. Ryabinin, Anna A. Meldo
A weighted random survival forest is presented in the paper.
no code implementations • 5 Aug 2017 • Lev V. Utkin, Irina L. Utkina
A computationally simple genome-wide association study (GWAS) algorithm for estimating the main and epistatic effects of markers or single nucleotide polymorphisms (SNPs) is proposed.
no code implementations • 25 May 2017 • Lev V. Utkin, Mikhail A. Ryabinin
A Discriminative Deep Forest (DisDF) as a metric learning algorithm is proposed in the paper.
no code implementations • 27 Apr 2017 • Lev V. Utkin, Mikhail A. Ryabinin
A Siamese Deep Forest (SDF) is proposed in the paper.