no code implementations • ICML 2020 • Yue Sheng, Edgar Dobriban
To scale up data analysis, distributed and parallel computing approaches are increasingly needed.
no code implementations • 16 Jan 2024 • Linghan Zheng, Hui Liu, Xiaojun Lin, Jiayuan Dong, Yue Sheng, Gang Shi, Zhiwei Liu, Hongwei Chen
In previous studies, code-based models have consistently outperformed text-based models in reasoning-intensive scenarios.
no code implementations • 20 Jan 2022 • Yue Sheng, Alnur Ali
Acceleration and momentum are the de facto standard in modern applications of machine learning and optimization, yet most work on implicit regularization focuses on unaccelerated methods.
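A minimal sketch of the contrast in question, under assumed settings (random underdetermined least squares, step size, and momentum value chosen for illustration; this is not the paper's experiment): run plain gradient descent and Polyak heavy-ball momentum from the zero initialization and compare the iterates they converge to.

```python
import numpy as np

# Underdetermined least squares: more parameters than samples,
# so many interpolating solutions exist.
rng = np.random.default_rng(0)
n, p = 20, 50
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

def grad(w):
    # Gradient of the average squared loss (1/2n)||Xw - y||^2.
    return X.T @ (X @ w - y) / n

def run(steps=5000, lr=0.05, momentum=0.0):
    # momentum = 0 gives plain gradient descent;
    # momentum > 0 gives the heavy-ball (Polyak) iteration.
    w, w_prev = np.zeros(p), np.zeros(p)
    for _ in range(steps):
        w, w_prev = w - lr * grad(w) + momentum * (w - w_prev), w
    return w

w_gd = run()              # unaccelerated gradient descent
w_hb = run(momentum=0.9)  # accelerated (heavy-ball) variant

# Comparing w_gd and w_hb probes whether momentum changes the
# implicit bias, i.e. which interpolating solution is selected.
```

Both runs start at zero and stay in the row space of `X`, so any difference between the limits would be attributable to the acceleration scheme rather than the initialization.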
1 code implementation • 22 Mar 2019 • Edgar Dobriban, Yue Sheng
Here we study a fundamental problem in this area: how can ridge regression be done in a distributed computing environment?
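A hedged sketch of the one-shot distributed setting this question concerns: each of `k` machines fits ridge regression on its local shard of the data, and the local estimates are combined by averaging. (Uniform weights and a shared regularization parameter `lam` are assumptions for illustration; choosing them optimally is the substance of the paper.)

```python
import numpy as np

# Simulated regression data, split evenly across k machines.
rng = np.random.default_rng(1)
n, p, k, lam = 600, 50, 4, 1.0
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + rng.standard_normal(n)

def ridge(Xs, ys, lam):
    # Closed-form ridge estimator on one shard.
    pdim = Xs.shape[1]
    return np.linalg.solve(Xs.T @ Xs + lam * np.eye(pdim), Xs.T @ ys)

# Full-data estimator vs. uniform average of per-shard estimators.
beta_full = ridge(X, y, lam)
shards = zip(np.array_split(X, k), np.array_split(y, k))
beta_avg = np.mean([ridge(Xi, yi, lam) for Xi, yi in shards], axis=0)
```

Comparing the estimation errors of `beta_full` and `beta_avg` gives the efficiency cost of the one-shot approach in this toy setting.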
1 code implementation • 30 Sep 2018 • Edgar Dobriban, Yue Sheng
Here we study the performance loss in estimation, test error, and confidence interval length in high dimensions, where the number of parameters is comparable to the training data size.
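The performance loss from averaging can be illustrated with a small simulation, under assumed settings (Gaussian design, OLS on each machine, dimensions chosen so that p is a nontrivial fraction of the local sample size; none of this is taken from the paper): the averaged estimator's squared error exceeds the full-data estimator's, and the gap grows as p approaches the per-machine sample size.

```python
import numpy as np

# p/n = 0.1 overall, but p is half the per-machine sample size n/k,
# so the high-dimensional effect is pronounced on each shard.
rng = np.random.default_rng(2)
n, p, k = 1000, 100, 5
X = rng.standard_normal((n, p))
beta = np.ones(p) / np.sqrt(p)
y = X @ beta + rng.standard_normal(n)

def ols(A, b):
    # Least-squares estimator on one shard.
    return np.linalg.lstsq(A, b, rcond=None)[0]

beta_full = ols(X, y)
beta_avg = np.mean(
    [ols(Xi, yi) for Xi, yi in zip(np.array_split(X, k), np.array_split(y, k))],
    axis=0,
)

err_full = np.sum((beta_full - beta) ** 2)
err_avg = np.sum((beta_avg - beta) ** 2)
# The ratio err_avg / err_full measures the efficiency lost
# to one-shot averaging in this regime.
```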