1 code implementation • 10 Aug 2021 • Andrei Patrascu, Paul Irofti
Several decades ago, the Proximal Point Algorithm (PPA) began to attract lasting interest from both the abstract operator theory and numerical optimization communities.
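The PPA iteration replaces a gradient step with an implicit step, x_{k+1} = prox_{λf}(x_k) = argmin_x f(x) + (1/2λ)‖x − x_k‖². A minimal illustrative sketch (not the paper's method), on the toy function f(x) = |x − 3|, whose proximal operator is soft-thresholding in closed form:

```python
# Minimal sketch of the Proximal Point Algorithm (PPA) on f(x) = |x - 3|.
# The prox of the shifted absolute value is soft-thresholding toward 3;
# the function and step size here are illustrative, not from the paper.
def prox_abs_shifted(x, lam, c=3.0):
    """prox_{lam*|.-c|}(x): soft-thresholding of x toward the center c."""
    d = x - c
    return c + max(abs(d) - lam, 0.0) * (1.0 if d > 0 else -1.0)

def ppa(x0, lam=0.5, iters=20):
    x = x0
    for _ in range(iters):
        x = prox_abs_shifted(x, lam)  # implicit step: x_{k+1} = prox_{lam f}(x_k)
    return x

print(ppa(10.0))  # reaches the minimizer x* = 3.0
```

Unlike an explicit (sub)gradient step, the proximal step is stable for any λ > 0, which is one reason PPA underpins so much of operator-splitting theory.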
no code implementations • 30 Mar 2020 • Andrei Patrascu, Ciprian Paduraru, Paul Irofti
Stochastic optimization lies at the core of most statistical learning models.
1 code implementation • 4 Dec 2019 • Andrei Patrascu, Paul Irofti
In large-scale or noisy settings, where only stochastic information about the smooth part of the objective function is available, the extension of proximal gradient schemes to stochastic oracles relies on the proximal tractability of the nonsmooth component and has been thoroughly analyzed in the literature.
no code implementations • 24 Oct 2019 • Paul Irofti, Andrei Patrascu, Andra Baltoiu
In general, anomaly detection is the problem of distinguishing normal data samples, which exhibit well-defined patterns or signatures, from those that do not conform to the expected profiles.
no code implementations • 24 Oct 2019 • Andra Baltoiu, Andrei Patrascu, Paul Irofti
Anomaly detection in networks often boils down to identifying the underlying graph structure on which the abnormal occurrence rests.
no code implementations • 22 Jan 2019 • Andrei Patrascu
Inspired by the scalability of alternating projection methods, we start from the (linear) regularity assumption, typically used in convex feasibility problems to guarantee the linear convergence of stochastic alternating projection methods, and analyze a general weak linear regularity condition that facilitates convergence-rate boosts in stochastic proximal point schemes.
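A stochastic proximal point scheme applies the exact proximal operator of a single randomly sampled component at each iteration, x_{k+1} = prox_{γ f_i}(x_k). A minimal sketch, assuming illustrative quadratic components f_i(x) = 0.5(x − c_i)², whose prox is available in closed form (the data and step size are not from the paper):

```python
import random

def stochastic_proximal_point(cs, step=0.5, iters=2000, seed=1):
    """Stochastic proximal point on f(x) = (1/m) sum_i 0.5*(x - c_i)^2.
    For the sampled component, prox_{step*f_i}(x) = (x + step*c_i)/(1 + step)."""
    rng = random.Random(seed)
    x = 0.0
    for _ in range(iters):
        c = rng.choice(cs)                 # sample one component function
        x = (x + step * c) / (1.0 + step)  # exact proximal step on that component
    return x
```

With a constant step size the iterates hover in a neighborhood of the minimizer (the mean of the centers here); regularity-type conditions of the kind analyzed in the paper govern how fast such schemes contract toward the solution set.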