Search Results for author: Jiankun Liu

Found 3 papers, 0 papers with code

Improved Learning Rates of a Functional Lasso-type SVM with Sparse Multi-Kernel Representation

no code implementations NeurIPS 2021 Shaogao Lv, Junhui Wang, Jiankun Liu, Yong Liu

In this paper, we provide theoretical results of estimation bounds and excess risk upper bounds for support vector machine (SVM) with sparse multi-kernel representation.

Effective Distributed Learning with Random Features: Improved Bounds and Algorithms

no code implementations ICLR 2021 Yong Liu, Jiankun Liu, Shuqiang Wang

In this paper, we study the statistical properties of distributed kernel ridge regression together with random features (DKRR-RF) and obtain optimal generalization bounds under the basic setting, which substantially relax the restriction on the number of local machines in the existing state-of-the-art bounds.

Generalization Bounds
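A minimal sketch of the DKRR-RF setup described above: each "local machine" fits ridge regression on its own shard in a random Fourier feature space, and the local estimators are averaged. The feature count, bandwidth, regularization, and number of machines below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data
n, d = 200, 3
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Random Fourier features approximating a Gaussian kernel
D, sigma, lam = 100, 1.0, 1e-2   # illustrative choices
W = rng.normal(scale=1.0 / sigma, size=(d, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def phi(X):
    # Random-feature map: k(x, x') ≈ phi(x) @ phi(x')
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Divide-and-conquer: each machine solves ridge regression on its shard,
# then the local coefficient vectors are averaged.
m = 4
local_alphas = []
for idx in np.array_split(np.arange(n), m):
    Z = phi(X[idx])
    local_alphas.append(np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y[idx]))
alpha_bar = np.mean(local_alphas, axis=0)

pred = phi(X) @ alpha_bar
mse = np.mean((pred - y) ** 2)
```

The bounds in the paper concern how large `m` can grow before this averaged estimator loses the optimal learning rate.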

Neural Architecture Optimization with Graph VAE

no code implementations 18 Jun 2020 Jian Li, Yong Liu, Jiankun Liu, Weiping Wang

The encoder and the decoder form a graph VAE, mapping network architectures to continuous representations and back.

Computational Efficiency · Neural Architecture Search
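The encoder/decoder mapping described in the abstract can be illustrated with a toy, untrained numpy forward pass over a flattened adjacency matrix. All sizes and weights here are hypothetical, and the MLP stands in for the graph neural networks a real graph VAE would use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy DAG architecture: upper-triangular adjacency matrix over 5 nodes
n_nodes, hidden, latent_dim = 5, 16, 8
A = np.triu(rng.integers(0, 2, size=(n_nodes, n_nodes)), k=1).astype(float)
x = A.flatten()

# Randomly initialised encoder/decoder weights (untrained; shapes only)
W_enc = rng.normal(scale=0.1, size=(x.size, hidden))
W_mu = rng.normal(scale=0.1, size=(hidden, latent_dim))
W_logvar = rng.normal(scale=0.1, size=(hidden, latent_dim))
W_dec = rng.normal(scale=0.1, size=(latent_dim, x.size))

def encode(x):
    # Architecture -> parameters of a Gaussian in latent space
    h = np.tanh(x @ W_enc)
    return h @ W_mu, h @ W_logvar

def reparameterize(mu, logvar):
    # Sample a continuous representation z ~ N(mu, exp(logvar))
    return mu + np.exp(0.5 * logvar) * rng.normal(size=mu.shape)

def decode(z):
    # Continuous representation -> edge probabilities of an architecture
    logits = z @ W_dec
    return 1.0 / (1.0 + np.exp(-logits))

mu, logvar = encode(x)
z = reparameterize(mu, logvar)
edge_probs = decode(z).reshape(n_nodes, n_nodes)
```

Optimizing in the continuous latent space and decoding back to a discrete architecture is what makes gradient-based architecture search possible in such models.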
