1 code implementation • 15 Feb 2024 • Zhiwei Tang, Tsung-Hui Chang
In Federated Learning (FL), a framework to train machine learning models across distributed data, well-known algorithms like FedAvg tend to have slow convergence rates, resulting in high communication costs during training.
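The FedAvg algorithm mentioned above can be sketched in a few lines: each client runs local SGD on its own data and the server averages the resulting client weights. This is a minimal toy, assuming a least-squares objective, full client participation, and uniform weighting; the helper names are illustrative, not from the paper.

```python
# Minimal FedAvg sketch (toy least-squares setup, assumed for illustration):
# each client runs a few local SGD steps, then the server averages weights.
import numpy as np

def local_sgd(w, X, y, lr=0.1, steps=10):
    """A few local least-squares SGD steps on one client's data."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, clients):
    """One communication round: broadcast, local training, averaging."""
    local_weights = [local_sgd(w_global.copy(), X, y) for X, y in clients]
    return np.mean(local_weights, axis=0)

rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ w_true + 0.01 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(20):          # each round costs one up/down communication
    w = fedavg_round(w, clients)
```

The per-round communication cost is what motivates faster-converging alternatives: fewer rounds means fewer model exchanges.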
no code implementations • 15 Feb 2024 • Zhiwei Tang, Jiasheng Tang, Hao Luo, Fan Wang, Tsung-Hui Chang
Our experiments demonstrate that ParaTAA can reduce the number of inference steps required by common sequential sampling algorithms such as DDIM and DDPM by a factor of 4 to 14.
1 code implementation • 7 Mar 2023 • Zhiwei Tang, Dmitry Rybin, Tsung-Hui Chang
In this study, we delve into an emerging optimization challenge involving a black-box objective function that can only be gauged via a ranking oracle, a situation frequently encountered in real-world scenarios, especially when the function is evaluated by human judges.
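The ranking-oracle setting can be simulated with a simple elitist random search: the oracle only orders candidate points, never revealing objective values, mimicking comparative human feedback. This is a hedged sketch of the general setting, not the paper's algorithm; `rank_search` and its parameters are illustrative assumptions.

```python
# Sketch of optimization with only a ranking oracle (assumed interface):
# the oracle orders candidates by objective value but never reveals the
# values themselves, mimicking human comparative feedback.
import numpy as np

def ranking_oracle(points, f):
    """Returns indices of `points` sorted from best (lowest f) to worst."""
    return sorted(range(len(points)), key=lambda i: f(points[i]))

def rank_search(f, x0, sigma=0.5, pop=8, iters=400, seed=0):
    """Elitist search driven purely by rankings (illustrative, not the paper's method)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # The current point is kept in the pool, so progress is monotone.
        candidates = [x] + [x + sigma * rng.normal(size=x.shape)
                            for _ in range(pop)]
        best = ranking_oracle(candidates, f)[0]
        x = candidates[best]
        sigma *= 0.99        # slowly shrink the search radius
    return x

x_min = rank_search(lambda v: np.sum((v - 3.0) ** 2), np.zeros(2))
```

Only the *ordering* of candidates is ever consulted, which is what makes such methods applicable when humans, rather than a measurable loss, judge quality.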
no code implementations • 6 Feb 2023 • Zhiwei Tang, Yanmeng Wang, Tsung-Hui Chang
In this paper, we propose a novel noisy perturbation scheme with a general symmetric noise distribution for sign-based compression. The scheme not only allows one to flexibly control the tradeoff between gradient bias and convergence performance, but also provides a unified viewpoint on existing stochastic sign-based methods.
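The bias-control idea can be illustrated with the simplest symmetric noise, uniform on $[-c, c]$: adding it before taking the sign gives $\mathbb{E}[\mathrm{sign}(g+Z)] = g/c$ for $|g| \le c$, so the compressed gradient is unbiased up to a known scale. This is an illustrative special case, not necessarily the paper's exact scheme.

```python
# Sign compression with a symmetric uniform perturbation (illustrative
# special case, assumed for this sketch): for noise Z ~ Uniform[-c, c]
# and |g| <= c, E[sign(g + Z)] = g / c, so averaging many compressed
# samples recovers the gradient up to the known scale c. Larger c means
# lower bias but a higher-variance one-bit estimate.
import numpy as np

def noisy_sign(g, c, rng):
    """Compress gradient g to {-1, +1} after adding symmetric noise."""
    z = rng.uniform(-c, c, size=np.shape(g))
    return np.sign(g + z)

rng = np.random.default_rng(0)
g = np.array([0.3, -0.7])
c = 1.0
# Average many compressed samples to check the scaled unbiasedness.
est = np.mean([noisy_sign(g, c, rng) for _ in range(200_000)], axis=0)
```

Each transmitted coordinate is still a single sign bit; the noise only changes the distribution of that bit.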
no code implementations • 15 Oct 2021 • Zhiwei Tang, Tsung-Hui Chang, Xiaojing Ye, Hongyuan Zha
We study a matrix recovery problem with unknown correspondence: given the observation matrix $M_o=[A,\tilde P B]$, where $\tilde P$ is an unknown permutation matrix, we aim to recover the underlying matrix $M=[A, B]$.
no code implementations • 10 Sep 2017 • Xiaodong Feng, Zhiwei Tang, Sen Wu
Sparse coding (SC) is attracting increasing attention owing to its comprehensive theoretical foundations and its excellent performance in many signal processing applications.