Search Results for author: Yilin Kang

Found 6 papers, 1 paper with code

Sharper Utility Bounds for Differentially Private Models

no code implementations22 Apr 2022 Yilin Kang, Yong Liu, Jian Li, Weiping Wang

In this paper, by introducing the Generalized Bernstein condition, we propose the first $\mathcal{O}\big(\frac{\sqrt{p}}{n\epsilon}\big)$ high-probability excess population risk bound for differentially private algorithms based on the gradient perturbation method, under the assumptions of $G$-Lipschitzness, $L$-smoothness, and the Polyak-{\L}ojasiewicz condition.
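The gradient perturbation idea mentioned in the abstract can be illustrated with a minimal sketch: run gradient descent, clip each gradient to enforce the $G$-Lipschitz bound, and add Gaussian noise calibrated to $(\epsilon, \delta)$-DP. The noise scale below is a standard textbook calibration, not the paper's exact algorithm or analysis.

```python
import numpy as np

def dp_gradient_descent(X, y, epsilon=1.0, delta=1e-5, G=1.0,
                        lr=0.1, T=100, seed=0):
    """Gradient perturbation sketch for DP-ERM on least squares.

    Adds Gaussian noise to each clipped gradient. The sigma below is a
    generic (epsilon, delta) calibration over T iterations, used here only
    to illustrate the mechanism; the paper's calibration may differ.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p)
    sigma = G * np.sqrt(2 * T * np.log(1.25 / delta)) / (n * epsilon)
    for _ in range(T):
        grad = X.T @ (X @ w - y) / n
        norm = np.linalg.norm(grad)
        if norm > G:                      # clip to enforce the G-Lipschitz bound
            grad *= G / norm
        w -= lr * (grad + rng.normal(0.0, sigma, size=p))
    return w
```

Note the privacy/utility trade-off visible in `sigma`: noise shrinks as the sample size `n` or privacy budget `epsilon` grows, which is the regime the $\mathcal{O}\big(\frac{\sqrt{p}}{n\epsilon}\big)$ bound describes.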

Stability and Generalization of Differentially Private Minimax Problems

no code implementations11 Apr 2022 Yilin Kang, Yong Liu, Jian Li, Weiping Wang

To the best of our knowledge, this is the first analysis of the generalization performance of the general minimax paradigm that takes differential privacy into account.

Towards Sharper Utility Bounds for Differentially Private Pairwise Learning

no code implementations7 May 2021 Yilin Kang, Yong Liu, Jian Li, Weiping Wang

Pairwise learning focuses on learning tasks with pairwise loss functions, which depend on pairs of training instances and thus naturally fit modeling relationships between pairs of samples.
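What distinguishes a pairwise loss is that it is evaluated on pairs $(x_i, x_j)$ rather than single examples. A hedged sketch, using an AUC-style pairwise hinge surrogate as one common example (not necessarily the loss studied in the paper):

```python
import numpy as np

def pairwise_hinge_loss(w, X, y):
    """Average hinge loss over all (positive, negative) example pairs.

    An AUC-maximization surrogate chosen purely for illustration: each
    loss term depends on a pair of instances, which is the defining
    feature of pairwise learning described in the abstract above.
    """
    scores = X @ w
    pos, neg = scores[y == 1], scores[y == -1]
    margins = pos[:, None] - neg[None, :]        # one margin per (pos, neg) pair
    return np.mean(np.maximum(0.0, 1.0 - margins))
```

Because the loss ranges over $O(n^2)$ pairs instead of $n$ examples, the sensitivity analysis behind DP utility bounds changes, which is the technical setting the paper addresses.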

Data Heterogeneity Differential Privacy: From Theory to Algorithm

no code implementations20 Feb 2020 Yilin Kang, Jian Li, Yong Liu, Weiping Wang

Traditionally, in the field of differential privacy (DP), random noise is injected equally when training with different data instances.


Input Perturbation: A New Paradigm between Central and Local Differential Privacy

1 code implementation20 Feb 2020 Yilin Kang, Yong Liu, Ben Niu, Xin-Yi Tong, Likun Zhang, Weiping Wang

By adding noise to the original training data and training with the "perturbed data", we achieve ($\epsilon$, $\delta$)-differential privacy on the final model, along with a degree of privacy on the original data.
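The input perturbation paradigm described above can be sketched in a few lines: noise the data once, then train with any non-private procedure. The Gaussian scale below is a generic $(\epsilon, \delta)$ calibration under an assumed per-record sensitivity, not the paper's exact mechanism.

```python
import numpy as np

def input_perturbation(X, epsilon=1.0, delta=1e-5, sensitivity=1.0, seed=0):
    """Input perturbation sketch: noise the training data, then train normally.

    The noise is added once to the data itself, so any downstream training
    procedure can be run on the perturbed copy without further modification.
    The calibration here is an illustrative Gaussian-mechanism choice.
    """
    rng = np.random.default_rng(seed)
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return X + rng.normal(0.0, sigma, size=X.shape)
```

This sits between the central model (a trusted curator noises the output) and the local model (each user noises their own data before submission), which is the positioning the paper's title refers to.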

Weighted Distributed Differential Privacy ERM: Convex and Non-convex

no code implementations23 Oct 2019 Yilin Kang, Yong Liu, Weiping Wang

Through detailed theoretical analysis, we show that in the distributed setting, the noise bound and the excess empirical risk bound can be improved by accounting for the different weights held by the multiple parties.
