no code implementations • 15 Apr 2024 • Daniil Merkulov, Daria Cherniuk, Alexander Rudikov, Ivan Oseledets, Ekaterina Muravleva, Aleksandr Mikhalev, Boris Kashin
In this paper, we introduce an algorithm for data quantization based on the principles of Kashin representation.
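Kashin-type decompositions represent a vector so that all of its coefficients are uniformly small, which is what makes simple uniform quantization effective afterwards. As an illustrative sketch of that final quantization step only (not the paper's algorithm), here is a uniform scalar quantizer whose error is governed by the largest coefficient:

```python
def uniform_quantize(x, bits):
    """Uniform scalar quantization of x onto 2**bits - 1 evenly spaced
    levels over [-M, M], where M = max|x_i|.

    The per-entry error is at most half the step, i.e. M / (2**bits - 1),
    so representations with uniformly small coefficients (the property
    Kashin-type decompositions provide) quantize with small error.
    """
    M = max(abs(v) for v in x) or 1.0
    levels = 2 ** bits - 1
    step = 2 * M / levels
    return [round((v + M) / step) * step - M for v in x]
```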
2 code implementations • 29 Sep 2022 • Valentin Leplat, Daniil Merkulov, Aleksandr Katrutsa, Daniel Bershatsky, Olga Tsymboi, Ivan Oseledets
Classical machine learning models such as deep neural networks are usually trained with Stochastic Gradient Descent (SGD)-based algorithms.
2 code implementations • 31 Jan 2022 • Daniel Bershatsky, Aleksandr Mikhalev, Alexandr Katrutsa, Julia Gusak, Daniil Merkulov, Ivan Oseledets
Also, we investigate the variance of the gradient estimate induced by the randomized matrix multiplication.
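For context, a minimal randomized matrix-multiplication estimator: sample column-row pairs uniformly and average rescaled outer products. This is a stand-in sketch, not necessarily the estimator analyzed in the paper, but its per-entry spread around the exact product is the kind of variance being studied:

```python
import random

def randomized_matmul(A, B, num_samples, seed=0):
    """Unbiased estimate of the product A @ B (lists of lists) by sampling
    column-row outer products.

    Index j is drawn uniformly from the n inner-dimension indices, so each
    sampled outer product A[:, j] * B[j, :] is rescaled by n / num_samples
    to keep the estimator unbiased; its variance shrinks as 1/num_samples.
    """
    rng = random.Random(seed)
    m, n, p = len(A), len(A[0]), len(B[0])
    C = [[0.0] * p for _ in range(m)]
    for _ in range(num_samples):
        j = rng.randrange(n)
        scale = n / num_samples
        for i in range(m):
            for k in range(p):
                C[i][k] += scale * A[i][j] * B[j][k]
    return C
```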
no code implementations • 2 Oct 2021 • Andrey Filatov, Daniil Merkulov
However, line search for the step size is usually not the best choice because of its large computational overhead.
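For reference, a standard Armijo backtracking line search; every rejected candidate step costs an extra function evaluation, which is exactly the overhead in question:

```python
def backtracking_line_search(f, grad_f, x, direction, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink the trial step alpha until the
    sufficient-decrease condition
        f(x + alpha * d) <= f(x) + c * alpha * <grad f(x), d>
    holds. Each failed trial requires one more evaluation of f, which is
    why line search can dominate the cost of an optimization step.
    """
    fx = f(x)
    slope = sum(g * d for g, d in zip(grad_f(x), direction))
    alpha = alpha0
    while f([xi + alpha * di for xi, di in zip(x, direction)]) > fx + c * alpha * slope:
        alpha *= rho
    return alpha
```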
1 code implementation • Computational Science–ICCS 2021: 21st International Conference, Krakow, Poland, 2021 • Mikhail Gasanov, Daniil Merkulov, Artyom Nikitin, Sergey Matveev, Nikita Stasenko, Anna Petrovskaia, Mariia Pukalchik, Ivan Oseledets
Finding optimal irrigation and water resources for crops is necessary to increase the efficiency of water usage.
1 code implementation • 14 Jul 2020 • Alexandr Katrutsa, Daniil Merkulov, Nurislam Tursynbek, Ivan Oseledets
This descent direction is based on the normalized gradients of the individual losses.
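One natural way to build such a direction (a minimal sketch, not necessarily the paper's exact rule) is to normalize each per-loss gradient before summing, so that no single loss dominates merely because of its gradient scale:

```python
import math

def normalized_gradient_direction(gradients, eps=1e-12):
    """Shared descent direction from several per-loss gradients:
    normalize each gradient to unit length, sum them, and negate.
    `eps` guards against division by zero for vanishing gradients.
    """
    dim = len(gradients[0])
    direction = [0.0] * dim
    for g in gradients:
        norm = math.sqrt(sum(gi * gi for gi in g)) + eps
        for i in range(dim):
            direction[i] -= g[i] / norm
    return direction
```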
no code implementations • ICLR Workshop DeepDiffEq 2019 • Daniil Merkulov, Ivan Oseledets
We present a different view on stochastic optimization, which goes back to splitting schemes for the approximate solution of ODEs.
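As a minimal illustration of this connection (an assumption, not the paper's scheme): for a finite-sum objective, a sequential Lie-Trotter-style splitting of the gradient flow applies one explicit Euler step per component flow in turn, which is an SGD-like pass over the data:

```python
def splitting_step(x, grads, h):
    """One pass of a sequential splitting step for the gradient flow
    x'(t) = -sum_i grad f_i(x): apply an explicit Euler step with step
    size h for each component flow x' = -grad f_i(x) in turn.

    Visiting the f_i one at a time in this way recovers the usual
    SGD-style update over the components.
    """
    for grad_i in grads:
        g = grad_i(x)
        x = [xi - h * gi for xi, gi in zip(x, g)]
    return x
```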
no code implementations • 14 Jun 2019 • Daniil Merkulov, Ivan Oseledets
In this paper we propose a method for obtaining points of extreme overfitting: parameters of modern neural networks at which they demonstrate close to 100% training accuracy together with almost zero accuracy on the test set.