Search Results for author: Olga Tsymboi

Found 5 papers, 3 papers with code

Towards Foundation Time Series Model: To Synthesize Or Not To Synthesize?

1 code implementation • 4 Mar 2024 • Kseniia Kuvshinova, Olga Tsymboi, Alina Kostromina, Dmitry Simakov, Elizaveta Kovtun

In this work, we consider the essential question of whether it is advantageous to train a foundation model on synthetic data or better to utilize only a limited number of real-life examples.

Time Series

Sparse and Transferable Universal Singular Vectors Attack

no code implementations • 25 Jan 2024 • Kseniia Kuvshinova, Olga Tsymboi, Ivan Oseledets

Research on adversarial attacks and model vulnerability is one of the fundamental directions in modern machine learning.

Adversarial Attack

General Lipschitz: Certified Robustness Against Resolvable Semantic Transformations via Transformation-Dependent Randomized Smoothing

no code implementations • 17 Aug 2023 • Dmitrii Korzh, Mikhail Pautov, Olga Tsymboi, Ivan Oseledets

Randomized smoothing is the state-of-the-art approach to constructing image classifiers that are provably robust against additive adversarial perturbations of bounded magnitude (a minimal sketch of the base technique follows this entry).

Translation
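
For context, randomized smoothing replaces a base classifier f with its Gaussian-smoothed version g(x) = argmax_c P[f(x + ε) = c], ε ~ N(0, σ²I); General Lipschitz extends this idea to resolvable semantic transformations. Below is a minimal Monte-Carlo sketch of vanilla smoothing, not the paper's method; the classifier, σ, and the sample budget are illustrative placeholders:

```python
import torch

def smoothed_predict(base_classifier, x, sigma=0.25, n_samples=1000, num_classes=10):
    # Monte-Carlo estimate of g(x) = argmax_c P[f(x + eps) = c], eps ~ N(0, sigma^2 I).
    # base_classifier, sigma, and n_samples are assumed placeholders, not the paper's setup.
    counts = torch.zeros(num_classes)
    with torch.no_grad():
        for _ in range(n_samples):
            noise = torch.randn_like(x) * sigma              # additive Gaussian perturbation
            pred = base_classifier(x + noise).argmax(dim=-1).item()
            counts[pred] += 1
    return int(counts.argmax())                              # majority-vote class of g
```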

Translate your gibberish: black-box adversarial attack on machine translation systems

1 code implementation • 20 Mar 2023 • Andrei Chertkov, Olga Tsymboi, Mikhail Pautov, Ivan Oseledets

Neural networks are widely deployed in natural language processing tasks at industrial scale, perhaps most often as components of automatic machine translation systems.

Adversarial Attack • Machine Translation +1

NAG-GS: Semi-Implicit, Accelerated and Robust Stochastic Optimizer

2 code implementations • 29 Sep 2022 • Valentin Leplat, Daniil Merkulov, Aleksandr Katrutsa, Daniel Bershatsky, Olga Tsymboi, Ivan Oseledets

Classical machine learning models such as deep neural networks are usually trained using Stochastic Gradient Descent (SGD)-based algorithms (a minimal update sketch is shown below).
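
For reference, the SGD baseline that NAG-GS improves upon updates parameters as θ ← θ − η∇L(θ) on each mini-batch. A minimal sketch, assuming a toy linear model, synthetic data, and an illustrative learning rate rather than the paper's NAG-GS setup:

```python
import torch

# Plain mini-batch SGD for illustration; model, loss, data, and learning
# rate are assumed placeholders, not the paper's configuration.
model = torch.nn.Linear(10, 1)
loss_fn = torch.nn.MSELoss()
lr = 1e-2

for step in range(100):
    x = torch.randn(32, 10)              # synthetic mini-batch
    y = torch.randn(32, 1)
    loss = loss_fn(model(x), y)
    model.zero_grad()
    loss.backward()                      # stochastic gradient of the loss
    with torch.no_grad():
        for p in model.parameters():
            p -= lr * p.grad             # theta <- theta - lr * grad
```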
