Search Results for author: Quoc Tran Dinh

Found 6 papers, 0 papers with code

FedDR – Randomized Douglas-Rachford Splitting Algorithms for Nonconvex Federated Composite Optimization

no code implementations NeurIPS 2021 Quoc Tran Dinh, Nhan Pham, Dzung Phan, Lam Nguyen

These new algorithms can handle statistical and system heterogeneity, which are the two main challenges in federated learning, while achieving the best known communication complexity.

Federated Learning
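
No code is linked for this paper, but the Douglas-Rachford structure in the title is easy to illustrate. Below is a minimal NumPy sketch of DR splitting across a handful of simulated clients, assuming quadratic local losses so the proximal step has a closed form; the full (rather than randomized) client participation, step sizes, and problem data are simplifications of mine, not the authors' algorithm.

```python
# Hypothetical sketch of Douglas-Rachford splitting across n "clients",
# assuming quadratic local losses f_i(x) = 0.5 * ||A_i x - b_i||^2 so that
# prox_{eta * f_i} has a closed form. Illustrative only, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)
n, d, eta, alpha = 4, 5, 0.5, 1.0
A = [rng.standard_normal((8, d)) for _ in range(n)]
b = [rng.standard_normal(8) for _ in range(n)]

def prox_fi(i, y):
    # prox of eta * f_i at y: argmin_x 0.5*||A_i x - b_i||^2 + ||x - y||^2 / (2*eta)
    return np.linalg.solve(eta * A[i].T @ A[i] + np.eye(d),
                           eta * A[i].T @ b[i] + y)

y = [np.zeros(d) for _ in range(n)]
x = [np.zeros(d) for _ in range(n)]
xbar = np.zeros(d)
for k in range(200):
    for i in range(n):                       # a randomized variant would update only a subset
        y[i] = y[i] + alpha * (xbar - x[i])  # reflect toward the server average
        x[i] = prox_fi(i, y[i])              # local proximal step
    xbar = np.mean([2 * x[i] - y[i] for i in range(n)], axis=0)  # server aggregation
```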

Hybrid Variance-Reduced SGD Algorithms For Minimax Problems with Nonconvex-Linear Function

no code implementations NeurIPS 2020 Quoc Tran Dinh, Deyi Liu, Lam Nguyen

We develop a novel and single-loop variance-reduced algorithm to solve a class of stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective function, which has various applications in different fields such as machine learning and robust optimization.
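
There is no public implementation listed; the sketch below illustrates only the hybrid variance-reduced estimator such methods are built on (a convex combination of a SARAH-style recursive gradient and a fresh stochastic gradient), applied here to a plain least-squares objective rather than the paper's minimax template. The batch sizes, beta, and learning rate are placeholders of mine.

```python
# Toy sketch of a hybrid variance-reduced gradient estimator on least squares.
import numpy as np

rng = np.random.default_rng(0)
N, d = 256, 10
A = rng.standard_normal((N, d))
b = rng.standard_normal(N)

def stoch_grad(x, idx):
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / len(idx)

x_prev = np.zeros(d)
x = np.zeros(d)
v = stoch_grad(x, rng.integers(N, size=32))    # initial estimator from a larger batch
beta, lr = 0.9, 0.05
for t in range(500):
    i = rng.integers(N, size=8)
    j = rng.integers(N, size=8)
    # hybrid estimator: recursive correction blended with a fresh stochastic gradient
    v = beta * (v + stoch_grad(x, i) - stoch_grad(x_prev, i)) \
        + (1 - beta) * stoch_grad(x, j)
    x_prev, x = x, x - lr * v                  # single-loop update, no full-gradient restarts
```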

Transferring Optimality Across Data Distributions via Homotopy Methods

no code implementations ICLR 2020 Matilde Gargiani, Andrea Zanelli, Quoc Tran Dinh, Moritz Diehl, Frank Hutter

Homotopy methods, also known as continuation methods, are a powerful mathematical tool to efficiently solve various problems in numerical analysis, including complex non-convex optimization problems where little or no prior knowledge about the location of the solutions is available.
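
As a concrete toy of the continuation idea, the following sketch deforms an easy convex surrogate into a nonconvex target and warm-starts gradient descent at each step of the path; the specific functions, schedule, and step size are illustrative choices of mine, not the paper's setup.

```python
# Minimal continuation/homotopy loop: blend a convex surrogate g into a
# nonconvex target f, warm-starting gradient descent at each path step.
import numpy as np

def f(x):   # nonconvex target (Rastrigin-like in 1D), global minimum at 0
    return x**2 + 10 * (1 - np.cos(2 * np.pi * x))
def g(x):   # easy convex surrogate
    return x**2
def grad(h, x, eps=1e-6):           # numerical gradient, good enough for the sketch
    return (h(x + eps) - h(x - eps)) / (2 * eps)

x = 3.0                             # plain descent on f from here gets stuck near 3
for t in np.linspace(0.0, 1.0, 21): # path parameter: 0 = surrogate, 1 = target
    F = lambda z, t=t: (1 - t) * g(z) + t * f(z)
    for _ in range(500):            # inner solver, warm-started from the previous x
        x -= 0.002 * grad(F, x)
print(x)                            # near the global minimum at 0 if the path was tracked
```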

A Universal Primal-Dual Convex Optimization Framework

no code implementations NeurIPS 2015 Alp Yurtsever, Quoc Tran Dinh, Volkan Cevher

We propose a new primal-dual algorithmic framework for a prototypical constrained convex optimization template.
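
No code accompanies this paper; to make the primal-dual template concrete, here is a toy sketch for the prototypical constrained problem min f(x) subject to Ax = b, with a quadratic f so the primal subproblem has a closed form. It shows only the generic primal-step/dual-ascent pattern, not the paper's universal (Hölder-smoothness-adaptive) framework; the step size and problem data are arbitrary.

```python
# Toy primal-dual scheme for min f(x) s.t. Ax = b with f(x) = 0.5 * x^T Q x.
import numpy as np

rng = np.random.default_rng(0)
d, m = 6, 3
Q = np.eye(d)                          # strongly convex f makes the primal step explicit
A = rng.standard_normal((m, d))
b = rng.standard_normal(m)

lam = np.zeros(m)
for k in range(1000):
    # primal step: x(lam) = argmin_x f(x) + lam^T (A x - b)
    x = np.linalg.solve(Q, -A.T @ lam)
    # dual gradient ascent; 0.05 is an illustrative step size
    lam = lam + 0.05 * (A @ x - b)

print(np.linalg.norm(A @ x - b))       # feasibility gap shrinks toward 0
```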


An Inexact Proximal Path-Following Algorithm for Constrained Convex Minimization

no code implementations 7 Nov 2013 Quoc Tran Dinh, Anastasios Kyrillidis, Volkan Cevher

Many scientific and engineering applications feature nonsmooth convex minimization problems over convex sets.
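
As a loose illustration of proximal path-following, the sketch below traces a log-barrier path for a box-constrained lasso-type problem, using warm-started, inexact proximal-gradient inner solves; the barrier, schedule, and step sizes are my stand-ins for the paper's self-concordant-barrier machinery and are not taken from it.

```python
# Stylized path-following sketch for
#   min 0.5*||Ax - b||^2 + rho*||x||_1   s.t.  -1 <= x <= 1,
# with a log-barrier for the box and inexact proximal-gradient inner solves.
import numpy as np

rng = np.random.default_rng(0)
m, d, rho = 20, 8, 0.5
A = rng.standard_normal((m, d))
b = rng.standard_normal(m)

def soft(v, thr):                        # prox of thr * ||.||_1 (soft thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - thr, 0.0)

x = np.zeros(d)                          # start at the analytic center of the box
t = 1.0
for stage in range(15):                  # path parameter t grows geometrically
    lr = 0.5 / (t * np.linalg.norm(A, 2) ** 2 + 2.0)
    for _ in range(50):                  # inexact inner solve, warm-started from x
        smooth_grad = t * A.T @ (A @ x - b) + 2 * x / (1 - x**2)  # data + barrier gradient
        x = soft(x - lr * smooth_grad, lr * t * rho)
        x = np.clip(x, -0.99, 0.99)      # stay strictly inside the box
    t *= 2.0
print(x)
```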
