no code implementations • 11 Feb 2024 • Jeongyeol Kwon, Dohyun Kwon, Hanbaek Lyu
We study the complexity of finding stationary points with such a $y^*$-aware oracle: we propose a simple first-order method that converges to an $\epsilon$-stationary point using $O(\epsilon^{-6})$ and $O(\epsilon^{-4})$ accesses to first-order $y^*$-aware oracles.
no code implementations • 6 Dec 2023 • Sangwoong Yoon, Dohyun Kwon, Himchan Hwang, Yung-Kyun Noh, Frank C. Park
We present Generalized Contrastive Divergence (GCD), a novel objective function for training an energy-based model (EBM) and a sampler simultaneously.
no code implementations • 4 Sep 2023 • Jeongyeol Kwon, Dohyun Kwon, Stephen Wright, Robert Nowak
When the perturbed lower-level problem uniformly satisfies the small-error proximal error-bound (EB) condition, we propose a first-order algorithm that converges to an $\epsilon$-stationary point of the penalty function, using in total $O(\epsilon^{-3})$ and $O(\epsilon^{-7})$ accesses to first-order (stochastic) gradient oracles when the oracles are deterministic and when they are noisy, respectively.
no code implementations • 4 Jun 2023 • Dohyun Kwon, Hanbaek Lyu
We consider block coordinate descent methods of Gauss-Seidel type with proximal regularization (BCD-PR), a classical method for minimizing general nonconvex objectives under constraints, with a wide range of practical applications.
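To make the BCD-PR update concrete, here is a minimal sketch on a hypothetical two-block quadratic objective (the objective, step size $\eta$, and closed-form block solves below are illustrative assumptions, not the paper's setting): each block is updated in Gauss-Seidel order by minimizing the objective in that block plus a proximal term $\frac{1}{2\eta}\|x_i - x_i^k\|^2$.

```python
# Toy BCD-PR sketch (hypothetical objective, illustration only):
#   minimize F(x, y) = (x - 1)^2 + (y + 2)^2 + 0.5*x*y
# over two scalar blocks x and y. Each block subproblem, augmented with
# the proximal term (1/(2*eta))*(block - previous value)^2, is a
# 1-D quadratic and is solved in closed form.

def bcd_pr(x, y, eta=1.0, n_iters=200):
    for _ in range(n_iters):
        # argmin_x (x - 1)^2 + 0.5*x*y + (1/(2*eta))*(x - x_prev)^2
        x = (2.0 - 0.5 * y + x / eta) / (2.0 + 1.0 / eta)
        # argmin_y (y + 2)^2 + 0.5*x*y + (1/(2*eta))*(y - y_prev)^2
        y = (-4.0 - 0.5 * x + y / eta) / (2.0 + 1.0 / eta)
    return x, y

x, y = bcd_pr(0.0, 0.0)
print(x, y)  # converges to the stationary point (1.6, -2.4) of F
```

Note that the proximal term does not move the fixed point: a stationary point of $F$ is a fixed point of the proximally regularized block updates; the regularization only stabilizes the iteration.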
no code implementations • 26 Jan 2023 • Jeongyeol Kwon, Dohyun Kwon, Stephen Wright, Robert Nowak
Specifically, we show that F2SA converges to an $\epsilon$-stationary solution of the bilevel problem after $\epsilon^{-7/2}$, $\epsilon^{-5/2}$, and $\epsilon^{-3/2}$ iterations (each iteration using $O(1)$ samples) when stochastic noises are in both level objectives, only in the upper-level objective, and not present (deterministic settings), respectively.
1 code implementation • 13 Dec 2022 • Dohyun Kwon, Ying Fan, Kangwook Lee
Specifically, we prove that the Wasserstein distance is upper bounded by the square root of the objective function up to multiplicative constants and a fixed constant offset.
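In symbols, with placeholder notation since the excerpt does not name the quantities ($W$ for the Wasserstein distance, $\mathcal{L}(\theta)$ for the objective function, and constants $C_1 > 0$, $C_2 \ge 0$), the stated bound has the shape:

```latex
$$ W \;\le\; C_1\,\sqrt{\mathcal{L}(\theta)} \;+\; C_2 $$
```

So driving the objective $\mathcal{L}(\theta)$ down forces the Wasserstein distance down as well, up to the fixed offset $C_2$.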
no code implementations • 27 Oct 2021 • Dohyun Kwon, Yeoneung Kim, Guido Montúfar, Insoon Yang
We propose a stable method to train Wasserstein generative adversarial networks.
no code implementations • 8 Jan 2020 • Dohyun Kwon, Joongheon Kim
Millimeter-wave (mmWave) base stations can offer abundant high-capacity channel resources to connected vehicles, substantially improving their quality-of-service (QoS) in terms of downlink throughput.