Search Results for author: Yuling Jiao

Found 40 papers, 4 papers with code

Latent Schrödinger Bridge Diffusion Model for Generative Learning

no code implementations • 20 Apr 2024 • Yuling Jiao, Lican Kang, Huazhen Lin, Jin Liu, Heng Zuo

Our theoretical contribution is an end-to-end error analysis for learning distributions via the latent Schrödinger bridge diffusion model.

Convergence Analysis of Flow Matching in Latent Space with Transformers

no code implementations • 3 Apr 2024 • Yuling Jiao, Yanming Lai, Yang Wang, Bokai Yan

We present theoretical convergence guarantees for ODE-based generative models, specifically flow matching.

Convergence of Continuous Normalizing Flows for Learning Probability Distributions

no code implementations • 31 Mar 2024 • Yuan Gao, Jian Huang, Yuling Jiao, Shurong Zheng

We establish non-asymptotic error bounds for the distribution estimator based on CNFs, in terms of the Wasserstein-2 distance.

Image Generation, Protein Structure Prediction

Deep Conditional Generative Learning: Model and Error Analysis

1 code implementation • 2 Feb 2024 • Jinyuan Chang, Zhao Ding, Yuling Jiao, Ruoxuan Li, Jerry Zhijian Yang

We introduce an Ordinary Differential Equation (ODE)-based deep generative method for learning a conditional distribution, named the Conditional Föllmer Flow.

Density Estimation

Semi-Supervised Deep Sobolev Regression: Estimation, Variable Selection and Beyond

no code implementations • 9 Jan 2024 • Zhao Ding, Chenguang Duan, Yuling Jiao, Jerry Zhijian Yang

We propose SDORE, a semi-supervised deep Sobolev regressor, for the nonparametric estimation of the underlying regression function and its gradient.

regression, Variable Selection

Neural Network Approximation for Pessimistic Offline Reinforcement Learning

no code implementations • 19 Dec 2023 • Di Wu, Yuling Jiao, Li Shen, Haizhao Yang, Xiliang Lu

In this paper, we establish a non-asymptotic estimation error bound for pessimistic offline RL using general neural network approximation with $\mathcal{C}$-mixing data, in terms of the network structure, the dimension of the datasets, and the concentrability of the data coverage, under mild assumptions.

Offline RL, reinforcement-learning +1

Gaussian Interpolation Flows

no code implementations • 20 Nov 2023 • Yuan Gao, Jian Huang, Yuling Jiao

Gaussian denoising has emerged as a powerful principle for constructing simulation-free continuous normalizing flows for generative modeling.

Denoising

Provable Advantage of Parameterized Quantum Circuit in Function Approximation

no code implementations • 11 Oct 2023 • Zhan Yu, Qiuhao Chen, Yuling Jiao, Yinan Li, Xiliang Lu, Xin Wang, Jerry Zhijian Yang

To achieve this, we utilize techniques from quantum signal processing and linear combinations of unitaries to construct PQCs that implement multivariate polynomials.

Quantum Machine Learning

Non-Asymptotic Bounds for Adversarial Excess Risk under Misspecified Models

no code implementations • 2 Sep 2023 • Changyu Liu, Yuling Jiao, Junhui Wang, Jian Huang

For the quadratic loss in nonparametric regression, we show that the adversarial excess risk bound can be improved over those for a general loss.

Adversarial Attack, regression

Current density impedance imaging with PINNs

no code implementations • 24 Jun 2023 • Chenguang Duan, Yuling Jiao, Xiliang Lu, Jerry Zhijian Yang

In this paper, we introduce CDII-PINNs, a computationally efficient method for solving CDII using PINNs in the framework of Tikhonov regularization.

Differentiable Neural Networks with RePU Activation: with Applications to Score Estimation and Isotonic Regression

no code implementations • 1 May 2023 • Guohao Shen, Yuling Jiao, Yuanyuan Lin, Jian Huang

We establish error bounds for simultaneously approximating $C^s$ smooth functions and their derivatives using RePU-activated deep neural networks.

regression

GAS: A Gaussian Mixture Distribution-Based Adaptive Sampling Method for PINNs

no code implementations • 28 Mar 2023 • Yuling Jiao, Di Li, Xiliang Lu, Jerry Zhijian Yang, Cheng Yuan

With the recent advances of deep learning in scientific computing, the Physics-Informed Neural Networks (PINNs) method has drawn widespread attention for solving partial differential equations (PDEs).

Incremental Learning

Convergence Analysis of the Deep Galerkin Method for Weak Solutions

no code implementations • 5 Feb 2023 • Yuling Jiao, Yanming Lai, Yang Wang, Haizhao Yang, Yunfei Yang

This paper analyzes the convergence rate of a deep Galerkin method for the weak solution (DGMW) of second-order elliptic partial differential equations on $\mathbb{R}^d$ with Dirichlet, Neumann, and Robin boundary conditions, respectively.

Estimation of Non-Crossing Quantile Regression Process with Deep ReQU Neural Networks

no code implementations • 21 Jul 2022 • Guohao Shen, Yuling Jiao, Yuanyuan Lin, Joel L. Horowitz, Jian Huang

We propose a penalized nonparametric approach to estimating the quantile regression process (QRP) in a nonseparable model using rectifier quadratic unit (ReQU) activated deep neural networks and introduce a novel penalty function to enforce non-crossing of quantile regression curves.

regression

Efficient and practical quantum compiler towards multi-qubit systems with deep reinforcement learning

no code implementations • 14 Apr 2022 • Qiuhao Chen, Yuxuan Du, Qi Zhao, Yuling Jiao, Xiliang Lu, Xingyao Wu

We systematically evaluate the performance of our proposal in compiling quantum operators with both inverse-closed and inverse-free universal basis sets.

Q-Learning, reinforcement-learning +1

Approximation bounds for norm constrained neural networks with applications to regression and GANs

no code implementations • 24 Jan 2022 • Yuling Jiao, Yang Wang, Yunfei Yang

This paper studies the approximation capacity of ReLU neural networks with norm constraint on the weights.

regression

Wasserstein Generative Learning of Conditional Distribution

1 code implementation • 19 Dec 2021 • Shiao Liu, Xingyu Zhou, Yuling Jiao, Jian Huang

The proposed approach uses a conditional generator to transform a known distribution to the target conditional distribution.

Density Estimation, Image Generation +2

Just Least Squares: Binary Compressive Sampling with Low Generative Intrinsic Dimension

no code implementations • 29 Nov 2021 • Yuling Jiao, Dingwei Li, Min Liu, Xiliang Lu, Yuanyuan Yang

In this paper, we consider recovering $n$ dimensional signals from $m$ binary measurements corrupted by noise and sign flips under the assumption that the target signals have low generative intrinsic dimension, i.e., the target signals can be approximately generated via an $L$-Lipschitz generator $G: \mathbb{R}^k\rightarrow\mathbb{R}^{n}, k\ll n$.

A Data-Driven Line Search Rule for Support Recovery in High-dimensional Data Analysis

no code implementations • 21 Nov 2021 • Peili Li, Yuling Jiao, Xiliang Lu, Lican Kang

In this work, we consider the algorithm for (nonlinear) regression problems with an $\ell_0$ penalty.

regression

Non-Asymptotic Error Bounds for Bidirectional GANs

no code implementations • NeurIPS 2021 • Shiao Liu, Yunfei Yang, Jian Huang, Yuling Jiao, Yang Wang

Our results are also applicable to the Wasserstein bidirectional GAN if the target distribution is assumed to have a bounded support.

Relative Entropy Gradient Sampler for Unnormalized Distributions

no code implementations • 6 Oct 2021 • Xingdong Feng, Yuan Gao, Jian Huang, Yuling Jiao, Xu Liu

We propose a relative entropy gradient sampler (REGS) for sampling from unnormalized distributions.

Coordinate Descent for MCP/SCAD Penalized Least Squares Converges Linearly

no code implementations • 18 Sep 2021 • Yuling Jiao, Dingwei Li, Min Liu, Xiliang Lu

Recovering sparse signals from observed data is an important topic in signal/imaging processing, statistics and machine learning.

Convergence Analysis of Schrödinger-Föllmer Sampler without Convexity

no code implementations • 10 Jul 2021 • Yuling Jiao, Lican Kang, Yanyan Liu, Youzhou Zhou

Schr\"{o}dinger-F\"{o}llmer sampler (SFS) is a novel and efficient approach for sampling from possibly unnormalized distributions without ergodicity.

Deep Generative Learning via Schrödinger Bridge

no code implementations • 19 Jun 2021 • Gefei Wang, Yuling Jiao, Qian Xu, Yang Wang, Can Yang

At the sample level, we derive our Schrödinger Bridge algorithm by plugging the drift term estimated by a deep score estimator and a deep density ratio estimator into the Euler-Maruyama method.

Image Inpainting

An error analysis of generative adversarial networks for learning distributions

no code implementations • 27 May 2021 • Jian Huang, Yuling Jiao, Zhen Li, Shiao Liu, Yang Wang, Yunfei Yang

This paper studies how well generative adversarial networks (GANs) learn probability distributions from finite samples.

Non-asymptotic Excess Risk Bounds for Classification with Deep Convolutional Neural Networks

no code implementations • 1 May 2021 • Guohao Shen, Yuling Jiao, Yuanyuan Lin, Jian Huang

To establish these results, we derive an upper bound for the covering number for the class of general convolutional neural networks with a bias term in each convolutional layer, and derive new results on the approximation power of CNNs for any uniformly-continuous target functions.

Binary Classification

Sufficient and Disentangled Representation Learning

no code implementations • 1 Jan 2021 • Jian Huang, Yuling Jiao, Xu Liao, Jin Liu, Zhou Yu

We provide strong statistical guarantees for the learned representation by establishing an upper bound on the excess error of the objective function and show that it reaches the nonparametric minimax rate under mild conditions.

Disentanglement

Toward Understanding Supervised Representation Learning with RKHS and GAN

no code implementations • 1 Jan 2021 • Xu Liao, Jin Liu, Tianwen Wen, Yuling Jiao, Jian Huang

At the population level, we formulate the ideal representation learning task as that of finding a nonlinear map that minimizes the sum of losses characterizing conditional independence (with RKHS) and disentanglement (with GAN).

Disentanglement, Image Classification

Generative Learning With Euler Particle Transport

no code implementations • 11 Dec 2020 • Yuan Gao, Jian Huang, Yuling Jiao, Jin Liu, Xiliang Lu, Zhijian Yang

The key task in training is the estimation of the density ratios or differences that determine the residual maps.

Deep Dimension Reduction for Supervised Representation Learning

1 code implementation • 10 Jun 2020 • Jian Huang, Yuling Jiao, Xu Liao, Jin Liu, Zhou Yu

We propose a deep dimension reduction approach to learning representations with these characteristics.

Dimensionality Reduction, Disentanglement

Learning Implicit Generative Models with Theoretical Guarantees

no code implementations • 7 Feb 2020 • Yuan Gao, Jian Huang, Yuling Jiao, Jin Liu

We then solve the McKean-Vlasov equation numerically using the forward Euler iteration, where the forward Euler map depends on the density ratio (density difference) between the distribution at current iteration and the underlying target distribution.

On Newton Screening

no code implementations • 27 Jan 2020 • Jian Huang, Yuling Jiao, Lican Kang, Jin Liu, Yanyan Liu, Xiliang Lu, Yuanyuan Yang

Based on this KKT system, a built-in working set with a relatively small size is first determined using the sum of the primal and dual variables from the previous iteration; the primal variable is then updated by solving a least-squares problem on the working set, and the dual variable is updated via a closed-form expression.

Sparse Learning

A Support Detection and Root Finding Approach for Learning High-dimensional Generalized Linear Models

no code implementations • 16 Jan 2020 • Jian Huang, Yuling Jiao, Lican Kang, Jin Liu, Yanyan Liu, Xiliang Lu

Feature selection is important for modeling high-dimensional data, where the number of variables can be much larger than the sample size.

feature selection

A stochastic alternating minimizing method for sparse phase retrieval

no code implementations • 14 Jun 2019 • Jian-Feng Cai, Yuling Jiao, Xiliang Lu, Juntao You

Sparse phase retrieval plays an important role in many fields of applied science and has thus attracted considerable attention.

Retrieval

Wasserstein-Wasserstein Auto-Encoders

no code implementations • 25 Feb 2019 • Shunkang Zhang, Yuan Gao, Yuling Jiao, Jin Liu, Yang Wang, Can Yang

To address the challenges in learning deep generative models (e.g., the blurriness of variational auto-encoders and the instability of training generative adversarial networks), we propose a novel deep generative model, named Wasserstein-Wasserstein auto-encoders (WWAE).

Deep Generative Learning via Variational Gradient Flow

1 code implementation • 24 Jan 2019 • Yuan Gao, Yuling Jiao, Yang Wang, Yao Wang, Can Yang, Shunkang Zhang

We propose a general framework to learn deep generative models via Variational Gradient Flow (VGrow) on probability spaces.

Binary Classification

SNAP: A semismooth Newton algorithm for pathwise optimization with optimal local convergence rate and oracle properties

no code implementations • 9 Oct 2018 • Jian Huang, Yuling Jiao, Xiliang Lu, Yueyong Shi, Qinglong Yang

We propose a semismooth Newton algorithm for pathwise optimization (SNAP) for the LASSO and Enet in sparse, high-dimensional linear regression.

regression

A Primal Dual Active Set with Continuation Algorithm for the $\ell^0$-Regularized Optimization Problem

no code implementations • 3 Mar 2014 • Yuling Jiao, Bangti Jin, Xiliang Lu

We develop a primal dual active set with continuation algorithm for solving the $\ell^0$-regularized least-squares problem that frequently arises in compressed sensing.

A Unified Primal Dual Active Set Algorithm for Nonconvex Sparse Recovery

no code implementations • 4 Oct 2013 • Jian Huang, Yuling Jiao, Bangti Jin, Jin Liu, Xiliang Lu, Can Yang

In this paper, we consider the problem of recovering a sparse signal based on penalized least squares formulations.
