no code implementations • NeurIPS 2021 • Quoc Tran Dinh, Nhan Pham, Dzung Phan, Lam Nguyen
These new algorithms can handle statistical and system heterogeneity, which are the two main challenges in federated learning, while achieving the best known communication complexity.
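The trade-off the abstract refers to, spending local computation on each client to reduce communication rounds, can be illustrated with a generic FedAvg-style loop. This is not the paper's algorithm, only a minimal sketch on hypothetical one-dimensional quadratic client losses with heterogeneous targets:

```python
def fedavg(client_targets, rounds=50, local_steps=10, lr=0.1):
    """Generic FedAvg sketch: each communication round, every client runs
    a few local gradient steps on its own loss, then the server averages
    the resulting models. With heterogeneous (non-IID) clients, the local
    steps introduce client drift that methods like the paper's must control."""
    w = 0.0  # shared scalar model
    for _ in range(rounds):
        local_models = []
        for c in client_targets:
            x = w
            for _ in range(local_steps):
                x -= lr * (x - c)  # gradient of the local loss 0.5*(x - c)^2
            local_models.append(x)
        w = sum(local_models) / len(local_models)  # server averaging step
    return w

# Three clients with heterogeneous data (hypothetical targets 0, 1, 5);
# for these quadratic losses the averaged model settles at the mean target.
w = fedavg([0.0, 1.0, 5.0])
```

For quadratic losses the drift cancels under averaging and `w` converges to the mean target 2.0; for general nonconvex losses it does not, which is why heterogeneity is a central difficulty.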
no code implementations • NeurIPS 2020 • Quoc Tran Dinh, Deyi Liu, Lam Nguyen
We develop a novel and single-loop variance-reduced algorithm to solve a class of stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective function, which has various applications in different fields such as machine learning and robust optimization.
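To make the minimax setting concrete, here is a plain two-timescale gradient descent-ascent sketch, not the paper's single-loop variance-reduced method, on a hypothetical toy objective that is nonconvex in the min variable and strongly concave in the max variable:

```python
def gda(grad_x, grad_y, x, y, steps=4000, lr_x=0.01, lr_y=0.1):
    """Two-timescale gradient descent-ascent for min_x max_y f(x, y):
    descend in x, ascend in y, with a faster step size for the
    (strongly concave) max player so y tracks its best response."""
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x -= lr_x * gx
        y += lr_y * gy
    return x, y

# Hypothetical toy: f(x, y) = (x^2 - 1)^2 + x*y - 0.5*y^2.
grad_x = lambda x, y: 4 * x * (x**2 - 1) + y
grad_y = lambda x, y: x - y

# Converges to the stationary point near (sqrt(3)/2, sqrt(3)/2).
x, y = gda(grad_x, grad_y, x=0.5, y=0.0)
```

In the stochastic setting the gradients would be noisy mini-batch estimates, which is where the variance-reduction machinery of the paper comes in.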
no code implementations • ICLR 2020 • Matilde Gargiani, Andrea Zanelli, Quoc Tran Dinh, Moritz Diehl, Frank Hutter
Homotopy methods, also known as continuation methods, are a powerful mathematical tool for efficiently solving various problems in numerical analysis, including complex non-convex optimization problems where little or no prior knowledge about the location of the solutions is available.
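The continuation idea can be sketched in a few lines: gradually deform an easy (convex) problem into the hard target problem, warm-starting each subproblem at the previous solution. This is a generic illustration on a hypothetical one-dimensional example, not the specific scheme studied in the paper:

```python
def homotopy_minimize(grad_easy, grad_hard, x0, n_stages=21,
                      steps=200, lr=0.02):
    """Continuation sketch: minimize H(x, t) = (1-t)*easy(x) + t*hard(x)
    by gradient descent for t = 0, ..., 1, warm-starting each stage
    at the previous stage's solution."""
    x = x0
    for i in range(n_stages):
        t = i / (n_stages - 1)
        for _ in range(steps):
            g = (1 - t) * grad_easy(x) + t * grad_hard(x)
            x -= lr * g
    return x

# Hypothetical toy: deform the convex surrogate g(x) = x^2 into the
# nonconvex target f(x) = x^4 - 3x^2 + x.
grad_easy = lambda x: 2 * x
grad_hard = lambda x: 4 * x**3 - 6 * x + 1

# The solution path is tracked to the global minimizer near x = -1.30.
x_star = homotopy_minimize(grad_easy, grad_hard, x0=0.0)
```

The point of the warm start is that each subproblem is only a small perturbation of the last, so the iterate follows a continuous solution path rather than restarting from scratch.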
no code implementations • NeurIPS 2018 • Quoc Tran Dinh
Our algorithms have several new features compared to existing methods.
no code implementations • NeurIPS 2015 • Alp Yurtsever, Quoc Tran Dinh, Volkan Cevher
We propose a new primal-dual algorithmic framework for a prototypical constrained convex optimization template.
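For intuition on the constrained-convex template, here is a classical Arrow-Hurwicz-style primal-dual iteration on a hypothetical toy problem; it illustrates the primal-descent / dual-ascent structure, not the paper's specific framework:

```python
def primal_dual(steps=4000, lr=0.05):
    """Minimal primal-dual sketch for the constrained convex problem
        min x^2  s.t.  x >= 1,
    via the Lagrangian L(x, lam) = x^2 + lam*(1 - x) with lam >= 0:
    gradient descent on x, projected gradient ascent on lam."""
    x, lam = 0.0, 0.0
    for _ in range(steps):
        x -= lr * (2 * x - lam)             # primal descent on L
        lam = max(0.0, lam + lr * (1 - x))  # dual ascent, projected to lam >= 0
    return x, lam

# Converges to the KKT point: x = 1 (active constraint), lam = 2.
x, lam = primal_dual()
```

The dual variable `lam` prices the constraint: it grows while `x >= 1` is violated and settles at the multiplier that makes the constrained minimizer stationary.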
no code implementations • 7 Nov 2013 • Quoc Tran Dinh, Anastasios Kyrillidis, Volkan Cevher
Many scientific and engineering applications feature nonsmooth convex minimization problems over convex sets.