Search Results for author: Deyi Liu

Found 6 papers, 2 papers with code

Hybrid Variance-Reduced SGD Algorithms For Minimax Problems with Nonconvex-Linear Function

no code implementations NeurIPS 2020 Quoc Tran-Dinh, Deyi Liu, Lam M. Nguyen

We develop a novel single-loop variance-reduced algorithm to solve a class of stochastic nonconvex-convex minimax problems involving a nonconvex-linear objective function, which has various applications in different fields such as machine learning and robust optimization.
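For orientation, the hybrid estimator behind this line of work mixes an unbiased SGD gradient with a SARAH-style recursive correction. The following is a minimal NumPy sketch of that estimator on a toy quadratic objective, not the paper's full minimax algorithm; the mixing weight `beta`, the step size, and the toy objective are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic objective: f(x) = E_w[0.5 * ||x - w||^2], w ~ N(0, I).
def stoch_grad(x, w):
    return x - w

def hybrid_grad(x_new, x_old, v_old, beta):
    """Hybrid variance-reduced estimator:
    v = beta * g_sgd + (1 - beta) * (v_old + g(x_new) - g(x_old)),
    blending an unbiased SGD gradient with a SARAH-style recursive term."""
    xi = rng.standard_normal(x_new.shape)    # fresh sample for the SGD term
    zeta = rng.standard_normal(x_new.shape)  # fresh sample for the recursive term
    g_sgd = stoch_grad(x_new, xi)
    return beta * g_sgd + (1 - beta) * (
        v_old + stoch_grad(x_new, zeta) - stoch_grad(x_old, zeta))

x_old = rng.standard_normal(5)
v = stoch_grad(x_old, rng.standard_normal(5))  # initialize with an SGD gradient
x = x_old - 0.1 * v
for _ in range(100):
    v = hybrid_grad(x, x_old, v, beta=0.5)
    x_old, x = x, x - 0.1 * v                  # plain (non-minimax) gradient step
print(x)  # drifts toward the minimizer x* = 0, up to sampling noise
```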

An Optimal Hybrid Variance-Reduced Algorithm for Stochastic Composite Nonconvex Optimization

no code implementations 20 Aug 2020 Deyi Liu, Lam M. Nguyen, Quoc Tran-Dinh

In this note we propose a new variant of the hybrid variance-reduced proximal gradient method in [7] to solve a common stochastic composite nonconvex optimization problem under standard assumptions.
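In the composite setting min_x f(x) + g(x), estimators of this kind are paired with a proximal gradient step x+ = prox_{eta*g}(x - eta*v). Below is a minimal sketch of that step, assuming g is a scaled l1-norm so its prox is soft-thresholding; the step size `eta` and weight `lam` are illustrative, not the paper's tuned choices.

```python
import numpy as np

def soft_threshold(z, t):
    """prox of t * ||.||_1: elementwise soft-thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad_step(x, v, eta, lam):
    """One proximal gradient step x+ = prox_{eta*g}(x - eta*v),
    with g = lam * ||.||_1 and v a (hybrid variance-reduced) gradient estimate."""
    return soft_threshold(x - eta * v, eta * lam)
```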

Robust and Generalizable Visual Representation Learning via Random Convolutions

2 code implementations ICLR 2021 Zhenlin Xu, Deyi Liu, Junlin Yang, Colin Raffel, Marc Niethammer

In this work, we show that the robustness of neural networks can be greatly improved through the use of random convolutions as data augmentation.

Data Augmentation · Domain Generalization +1
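A minimal PyTorch sketch of the random-convolution augmentation idea described above: sample a fresh convolution kernel per call and blend the filtered image with the original. The kernel size, weight scaling, and mixing coefficient here are assumptions for illustration, not the exact RandConv recipe from the paper.

```python
import torch
import torch.nn.functional as F

def random_conv_augment(images, k=3, mix=0.5):
    """Filter a batch of images (N, C, H, W) with a freshly sampled random
    conv kernel, then blend with the original image. Spatial size is preserved."""
    n, c, h, w = images.shape
    # Random kernel; scaling keeps the output variance comparable to the input.
    weight = torch.randn(c, c, k, k) / (k * k * c) ** 0.5
    filtered = F.conv2d(images, weight, padding=k // 2)
    return mix * images + (1 - mix) * filtered

# Usage: augmented = random_conv_augment(torch.rand(8, 3, 32, 32))
```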

Hybrid Variance-Reduced SGD Algorithms For Nonconvex-Concave Minimax Problems

no code implementations NeurIPS 2020 Quoc Tran-Dinh, Deyi Liu, Lam M. Nguyen

This problem class has several computational challenges due to its nonsmoothness, nonconvexity, nonlinearity, and non-separability of the objective functions.

A New Randomized Primal-Dual Algorithm for Convex Optimization with Optimal Last Iterate Rates

no code implementations 3 Mar 2020 Quoc Tran-Dinh, Deyi Liu

We develop a novel unified randomized block-coordinate primal-dual algorithm to solve a class of nonsmooth constrained convex optimization problems, which covers different existing variants and model settings from the literature.
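The paper's algorithm is a randomized block-coordinate scheme with optimal last-iterate rates; as a point of reference only, the sketch below implements the classical deterministic primal-dual template (Chambolle-Pock style) that such methods generalize, applied to min ||x||_1 subject to Ax = b. The step sizes and example problem are assumptions, not the paper's method.

```python
import numpy as np

def primal_dual_l1(A, b, iters=500):
    """Deterministic primal-dual iteration for min_x ||x||_1 s.t. Ax = b.
    (The paper's algorithm is a randomized block-coordinate generalization;
    this is only the base template.)"""
    m, n = A.shape
    L = np.linalg.norm(A, 2)            # spectral norm of A
    tau = sigma = 0.9 / L               # step sizes with tau * sigma * L^2 < 1
    x = x_bar = np.zeros(n)
    y = np.zeros(m)
    for _ in range(iters):
        y = y + sigma * (A @ x_bar - b)              # dual step on <y, Ax - b>
        z = x - tau * (A.T @ y)
        x_new = np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)  # prox of tau*||.||_1
        x_bar = 2 * x_new - x                        # extrapolation (theta = 1)
        x = x_new
    return x
```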

A Newton Frank-Wolfe Method for Constrained Self-Concordant Minimization

1 code implementation 17 Feb 2020 Deyi Liu, Volkan Cevher, Quoc Tran-Dinh

We demonstrate how to scalably solve a class of constrained self-concordant minimization problems using linear minimization oracles (LMOs) over the constraint set.

Experimental Design
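The key primitive is the linear minimization oracle s = argmin over s in C of <g, s>. Below is a minimal sketch of the vanilla Frank-Wolfe template with an LMO over the probability simplex; the paper's Newton Frank-Wolfe instead takes Newton-type steps tailored to self-concordant objectives, which this sketch does not reproduce.

```python
import numpy as np

def lmo_simplex(g):
    """LMO over the probability simplex: the minimizer of <g, s> is the
    vertex at the smallest coordinate of g."""
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def frank_wolfe(grad, x0, iters=200):
    """Vanilla Frank-Wolfe with the classic step size 2 / (k + 2)."""
    x = x0
    for k in range(iters):
        s = lmo_simplex(grad(x))
        x = x + 2.0 / (k + 2.0) * (s - x)
    return x

# Usage: minimize 0.5 * ||x - c||^2 over the simplex, with c inside the simplex.
c = np.array([0.2, 0.5, 0.3])
x = frank_wolfe(lambda x: x - c, np.ones(3) / 3)
print(x)  # approaches c
```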
