no code implementations • 28 Oct 2023 • Aleksandar Armacki, Pranay Sharma, Gauri Joshi, Dragana Bajovic, Dusan Jakovetic, Soummya Kar
First, for non-convex costs and component-wise nonlinearities, we establish a convergence rate arbitrarily close to $\mathcal{O}\left(t^{-\frac{1}{4}}\right)$, whose exponent is independent of noise and problem parameters.
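A minimal sketch of the component-wise nonlinear SGD idea (not the paper's exact algorithm or assumptions): component-wise clipping applied to a noisy gradient, heavy-tailed (Student-t) noise, and a decaying step size. The toy cost, step-size schedule, and clipping threshold are our own illustrative choices.

```python
import numpy as np

def clipped_sgd(grad, x0, tau=1.0, steps=2000, seed=0):
    """SGD with a component-wise clipping nonlinearity applied to the
    noisy gradient, so a single heavy-tailed spike cannot blow up an
    update (illustrative sketch, not the paper's exact method)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(steps):
        noise = rng.standard_t(df=1.5, size=x.shape)  # infinite-variance noise
        g = grad(x) + noise
        alpha = 0.5 / np.sqrt(k + 1)                  # decaying step size
        x = x - alpha * np.clip(g, -tau, tau)         # component-wise nonlinearity
    return x

# Toy problem: f(x) = 0.5 * ||x||^2, so grad(x) = x.
x_final = clipped_sgd(lambda x: x, x0=[5.0, -5.0])
```

Because every update component is bounded by `alpha * tau`, the iterates stay finite even though the raw noise has no finite variance.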
no code implementations • 2 Nov 2022 • Dragana Bajovic, Dusan Jakovetic, Soummya Kar
In this work we provide a formal framework for the study of general high probability bounds with SGD, based on the theory of large deviations.
no code implementations • 22 Sep 2022 • Aleksandar Armacki, Dragana Bajovic, Dusan Jakovetic, Soummya Kar
In the proposed setup, the grouping of users (based on the data distributions they sample), as well as the underlying statistical properties of the distributions, are a priori unknown.

no code implementations • 6 Apr 2022 • Dusan Jakovetic, Dragana Bajovic, Anit Kumar Sahu, Soummya Kar, Nemanja Milosevic, Dusan Stamenkovic
We introduce a general framework for nonlinear stochastic gradient descent (SGD) for the scenarios when gradient noise exhibits heavy tails.
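The framework covers a family of nonlinearities applied to the stochastic gradient. The sketch below (names and the toy setup are ours, not the paper's notation) shows three common members — sign, component-wise clipping, and norm clipping — and how each bounds an update even when one gradient component carries a heavy-tailed outlier:

```python
import numpy as np

# Three common choices of nonlinearity from the nonlinear-SGD family
# (illustrative; the paper treats a general class of such maps):
nonlinearities = {
    "sign":      lambda g: np.sign(g),
    "comp_clip": lambda g: np.clip(g, -1.0, 1.0),
    "norm_clip": lambda g: g * min(1.0, 1.0 / (np.linalg.norm(g) + 1e-12)),
}

# A noisy gradient estimate with one heavy-tailed outlier component.
g = np.array([0.3, -0.7, 4000.0])

updates = {name: phi(g) for name, phi in nonlinearities.items()}
for name, u in updates.items():
    print(name, u)
```

Sign and component-wise clipping bound each coordinate; norm clipping bounds the whole update vector — either way the outlier cannot dominate the step.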
no code implementations • 1 Feb 2022 • Aleksandar Armacki, Dragana Bajovic, Dusan Jakovetic, Soummya Kar
We propose a general approach for distance-based clustering, using the gradient of the cost function that measures clustering quality with respect to cluster assignments and cluster center positions.
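A minimal sketch of the "gradient of the clustering cost" idea on the k-means cost (this is our illustration, not the paper's exact method): hard-assign each point to its nearest center, then take a gradient step on the center positions.

```python
import numpy as np

def gradient_kmeans(data, centers, lr=0.1, steps=200):
    """Minimize the k-means cost F(c) = sum_i min_k 0.5*(x_i - c_k)^2
    by gradient descent on the center positions (1-D data for simplicity;
    an illustrative sketch of gradient-based clustering)."""
    data = np.asarray(data, dtype=float)
    centers = np.asarray(centers, dtype=float)
    for _ in range(steps):
        # Hard assignment: each point contributes to its nearest center.
        assign = np.argmin(np.abs(data[:, None] - centers[None, :]), axis=1)
        for k in range(len(centers)):
            members = data[assign == k]
            if len(members) > 0:
                # Gradient of 0.5 * sum_i (c_k - x_i)^2 w.r.t. c_k
                centers[k] -= lr * np.sum(centers[k] - members)
    return centers

centers = gradient_kmeans([0.0, 0.5, 1.0, 10.0, 10.5, 11.0], [2.0, 8.0])
```

On this two-blob toy set the centers converge to the blob means (0.5 and 10.5); unlike Lloyd's alternating updates, the centers move by explicit gradient steps.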
no code implementations • 1 Feb 2022 • Aleksandar Armacki, Dragana Bajovic, Dusan Jakovetic, Soummya Kar
The proposed framework is based on a generalization of convex clustering in which the differences between different users' models are penalized via a sum-of-norms penalty, weighted by a penalty parameter $\lambda$.
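The role of $\lambda$ can be seen in a tiny sketch (our simplification: scalar models, a smoothed stand-in for the non-smooth sum-of-norms penalty, plain gradient descent): small $\lambda$ leaves each user near its own local optimum, large $\lambda$ fuses the models toward a consensus.

```python
import numpy as np

def personalized_models(b, lam, eps=0.1, lr=0.005, steps=10000):
    """Gradient descent on  sum_i 0.5*(x_i - b_i)^2
       + lam * sum_{i<j} sqrt((x_i - x_j)^2 + eps^2).
    The sqrt(. + eps^2) term is a smooth surrogate for the sum-of-norms
    penalty |x_i - x_j| (the smoothing is our simplification)."""
    b = np.asarray(b, dtype=float)
    x = b.copy()
    for _ in range(steps):
        d = x[:, None] - x[None, :]
        w = d / np.sqrt(d**2 + eps**2)   # smoothed subgradient of |x_i - x_j|
        grad = (x - b) + lam * w.sum(axis=1)
        x = x - lr * grad
    return x

b = [0.0, 1.0, 5.0]
local = personalized_models(b, lam=0.1)  # small lambda: stay near local optima
fused = personalized_models(b, lam=5.0)  # large lambda: models nearly merge
```

With small $\lambda$ each model deviates from its local optimum by at most about $2\lambda$; with large $\lambda$ the models collapse to within the smoothing scale of each other, around the average of the local optima.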
no code implementations • 17 Sep 2021 • Stevo Rackovic, Claudia Soares, Dusan Jakovetic, Zoranka Desnica
Our approach is model-based, but in contrast with previous model-based approaches, we use a quadratic rather than a linear approximation of the higher-order rig model.
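The distinction can be sketched as follows (shapes, names, and the corrective-term layout are illustrative, not the paper's): a linear rig approximation maps blendshape weights to mesh offsets via a single matrix, while a quadratic approximation adds pairwise corrective terms.

```python
import numpy as np

def linear_rig(w, B):
    """First-order blendshape approximation: mesh offset = B @ w."""
    return B @ w

def quadratic_rig(w, B, C):
    """Second-order approximation: adds pairwise corrective terms
    w_i * w_j * c_ij, where the dict C maps a weight pair (i, j) to
    its corrective offset (an illustrative sketch)."""
    out = B @ w
    for (i, j), c in C.items():
        out = out + w[i] * w[j] * c
    return out

# 4 mesh coordinates, 3 blendshape weights (toy numbers).
B = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0]])
C = {(0, 1): np.array([0.1, -0.1, 0.0, 0.2])}
w = np.array([1.0, 1.0, 0.0])

lin = linear_rig(w, B)
quad = quadratic_rig(w, B, C)
```

The two outputs differ exactly where both weights of a corrective pair are active, which is what the quadratic terms are meant to capture.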
no code implementations • 17 Feb 2021 • Milos Savic, Milan Lukic, Dragan Danilovic, Zarko Bodroski, Dragana Bajovic, Ivan Mezei, Dejan Vukobratovic, Srdjan Skrbic, Dusan Jakovetic
The number of connected Internet of Things (IoT) devices within cyber-physical infrastructure systems grows at an increasing rate.
Anomaly Detection • Networking and Internet Architecture
no code implementations • 18 Dec 2019 • Dusan Jakovetic, Dragana Bajovic, Joao Xavier, Jose M. F. Moura
The augmented Lagrangian method (ALM) is a classical optimization tool that solves a given "difficult" (constrained) problem by finding solutions of a sequence of "easier" (often unconstrained) sub-problems with respect to the original (primal) variable, wherein constraint satisfaction is controlled via the so-called dual variables.
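The classical ALM loop can be sketched on a problem where the "easier" sub-problem has a closed form (the problem instance here is our toy example): minimize $\tfrac{1}{2}\|x\|^2$ subject to $Ax = b$, alternating a primal minimization with a dual (multiplier) ascent step.

```python
import numpy as np

def alm(A, b, rho=1.0, iters=50):
    """Augmented Lagrangian method for: min 0.5*||x||^2  s.t.  A x = b.
    The inner 'easier' sub-problem (minimize over x with the dual fixed)
    is solved in closed form; dual ascent then updates the multipliers."""
    m, n = A.shape
    lam = np.zeros(m)
    x = np.zeros(n)
    for _ in range(iters):
        # argmin_x 0.5*||x||^2 + lam^T (Ax - b) + (rho/2)*||Ax - b||^2
        x = np.linalg.solve(np.eye(n) + rho * A.T @ A, A.T @ (rho * b - lam))
        lam = lam + rho * (A @ x - b)     # dual (multiplier) update
    return x, lam

A = np.array([[1.0, 1.0]])
b = np.array([1.0])
x, lam = alm(A, b)   # minimum-norm point on the line x1 + x2 = 1
```

For this instance the dual error contracts by a factor $1/(1+2\rho)$ per outer iteration, so a handful of iterations already recovers the solution $x^* = (0.5, 0.5)$ with multiplier $\lambda^* = -0.5$.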
Optimization and Control • Information Theory
no code implementations • 21 Jan 2019 • Ran Xin, Dusan Jakovetic, Usman A. Khan
In this letter, we introduce a distributed Nesterov method, termed $\mathcal{ABN}$, that does not require doubly-stochastic weight matrices.
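A sketch of the AB-style gradient-tracking update this family builds on (this omits the Nesterov momentum that distinguishes $\mathcal{ABN}$ itself, and the graph/weights are our toy choices): consensus on the iterates uses a row-stochastic matrix $A$, while gradient tracking uses a column-stochastic matrix $B$ — no doubly-stochastic matrix is needed.

```python
import numpy as np

def ab_method(b, A, B, alpha=0.02, iters=5000):
    """AB / push-pull style distributed optimization: A is row-stochastic
    (consensus on x), B is column-stochastic (push-sum style gradient
    tracking in y). Local costs f_i(x) = 0.5*(x - b_i)^2, so the network
    minimizer of sum_i f_i is the average of the b_i. (Sketch only; the
    ABN method additionally uses Nesterov momentum.)"""
    b = np.asarray(b, dtype=float)
    x = np.zeros_like(b)
    g = x - b                 # local gradients at x
    y = g.copy()              # gradient tracker, initialized at the gradient
    for _ in range(iters):
        x_new = A @ x - alpha * y
        g_new = x_new - b
        y = B @ y + g_new - g  # track the average gradient
        x, g = x_new, g_new
    return x

# Directed ring with self-loops on 3 nodes.
A = np.array([[0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])   # rows sum to 1 (row-stochastic)
B = A.T                            # columns sum to 1 (column-stochastic)
x = ab_method([1.0, 2.0, 3.0], A, B)
```

All nodes reach consensus on the network minimizer, here the average of the local targets.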