no code implementations • 16 Jan 2022 • Varun Mannam, Xiaotong Yuan, Scott Howard
Fluorescence lifetime imaging microscopy (FLIM) is an important technique to understand the chemical micro-environment in cells and tissues since it provides additional contrast compared to conventional fluorescence imaging.
1 code implementation • NeurIPS 2021 • Pan Zhou, Hanshu Yan, Xiaotong Yuan, Jiashi Feng, Shuicheng Yan
Specifically, we prove that Lookahead with SGD as its inner-loop optimizer better balances the optimization error and the generalization error, achieving smaller excess risk than vanilla SGD on (strongly) convex problems and on nonconvex problems satisfying the Polyak-{\L}ojasiewicz condition, which has been observed/proven to hold in neural networks.
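The Lookahead scheme analyzed above can be sketched in a few lines: the fast weights take several SGD steps, then the slow weights are interpolated toward them. The toy quadratic objective and all hyperparameter values below are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

def lookahead_sgd(grad, w0, inner_steps=5, lr=0.1, alpha=0.5, outer_steps=50):
    """Lookahead with SGD as the inner-loop optimizer (sketch).

    The fast weights take `inner_steps` SGD steps from the slow weights,
    then the slow weights are pulled toward them:
    slow += alpha * (fast - slow).
    """
    slow = np.array(w0, dtype=float)
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(inner_steps):
            fast -= lr * grad(fast)        # inner SGD step
        slow += alpha * (fast - slow)      # slow-weight interpolation
    return slow

# Toy strongly convex objective f(w) = 0.5 * ||w - 1||^2, so grad(w) = w - 1
w = lookahead_sgd(lambda w: w - 1.0, np.zeros(3))
```

On this strongly convex toy problem each outer step contracts the error, so `w` converges to the minimizer at `1`.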
no code implementations • 29 Sep 2021 • Xiaotong Yuan, Ping Li
We further specialize these generic results to SGD, deriving improved high-probability generalization bounds for convex and non-convex optimization with natural time-decaying learning rates; such bounds have not been attainable with the existing uniform stability results.
1 code implementation • 7 Mar 2021 • Varun Mannam, Yide Zhang, Xiaotong Yuan, Takashi Hato, Pierre C. Dagher, Evan L. Nichols, Cody J. Smith, Kenneth W. Dunn, Scott Howard
By integrating image denoising using the trained deep learning model on the FLIM data, accurate FLIM phasor measurements are obtained.
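The phasor measurement mentioned above is the standard FLIM phasor transform: the decay curve's normalized cosine and sine Fourier components at the laser repetition frequency. A minimal sketch, with an illustrative lifetime and laser frequency (not values from the paper):

```python
import numpy as np

def phasor(decay, t, omega):
    """Standard FLIM phasor transform: normalized cosine/sine Fourier
    components of the decay at angular frequency omega."""
    g = np.sum(decay * np.cos(omega * t)) / np.sum(decay)
    s = np.sum(decay * np.sin(omega * t)) / np.sum(decay)
    return g, s

# A mono-exponential decay with lifetime tau falls on the universal
# semicircle: g = 1/(1+(omega*tau)**2), s = omega*tau/(1+(omega*tau)**2).
tau, omega = 2.5e-9, 2 * np.pi * 80e6   # 2.5 ns lifetime, 80 MHz laser
t = np.linspace(0.0, 50e-9, 50_000)
g, s = phasor(np.exp(-t / tau), t, omega)
```

Denoising the per-pixel decay curves before this transform tightens the (g, s) clusters, which is why the denoised FLIM data yields more accurate phasor measurements.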
1 code implementation • 7 Mar 2021 • Varun Mannam, Yide Zhang, Xiaotong Yuan, Scott Howard
However, using the new approach, a network can be trained to achieve super-resolution images from this small dataset.
no code implementations • 20 Aug 2018 • Feng Zhou, Renlong Hang, Qingshan Liu, Xiaotong Yuan
Specifically, for each pixel, we feed its spectral values across channels into the Spectral LSTM one by one to learn the spectral feature.
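The per-pixel recurrence above can be sketched with a plain LSTM cell: each spectral band contributes one scalar input per time step, and the final hidden state serves as the pixel's spectral feature. The band count, hidden size, and random weights below are illustrative, not the paper's trained network.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One standard LSTM step; gate pre-activations are stacked as
    [input, forget, cell-candidate, output]."""
    z = W @ x + U @ h + b
    H = h.size
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    g = np.tanh(z[2 * H:3 * H])   # cell candidate
    o = sigmoid(z[3 * H:])        # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Illustrative sizes: e.g. 103 spectral bands, 16 hidden units (hypothetical).
rng = np.random.default_rng(0)
channels, hidden = 103, 16
W = rng.normal(0.0, 0.1, (4 * hidden, 1))
U = rng.normal(0.0, 0.1, (4 * hidden, hidden))
b = np.zeros(4 * hidden)

spectrum = rng.random(channels)   # one pixel's value in each spectral band
h = np.zeros(hidden)
c = np.zeros(hidden)
for v in spectrum:                # feed channel values one by one
    h, c = lstm_step(np.array([v]), h, c, W, U, b)
# h is now the learned spectral feature for this pixel
```

Treating the spectrum as a sequence lets the recurrence capture dependencies between neighboring bands that a per-band classifier would miss.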
no code implementations • NeurIPS 2017 • Guangcan Liu, Qingshan Liu, Xiaotong Yuan
To break through the limits of random sampling, this paper introduces a new hypothesis called \emph{isomeric condition}, which is provably weaker than the assumption of uniform sampling and arguably holds even when the missing data is placed irregularly.
no code implementations • NeurIPS 2016 • Xiaotong Yuan, Ping Li, Tong Zhang, Qingshan Liu, Guangcan Liu
We investigate a subclass of exponential family graphical models of which the sufficient statistics are defined by arbitrary additive forms.
no code implementations • NeurIPS 2016 • Xiaotong Yuan, Ping Li, Tong Zhang
In this paper, we bridge this gap by showing, for the first time, that exact recovery of the global sparse minimizer is possible for HTP-style methods under conditions bounding the restricted strong condition number.
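The HTP-style iteration referred to above alternates a gradient step, hard thresholding to the k largest entries, and a least-squares debiasing on the selected support. A minimal sketch for noiseless sparse linear regression; the problem dimensions and random design are illustrative, not from the paper.

```python
import numpy as np

def htp(A, y, k, step=1.0, iters=50):
    """Hard Thresholding Pursuit sketch: gradient step, keep the
    top-k support, then solve least squares on that support."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        w = x + step * A.T @ (y - A @ x)       # full gradient step
        S = np.argsort(np.abs(w))[-k:]         # top-k support
        x = np.zeros(A.shape[1])
        x[S] = np.linalg.lstsq(A[:, S], y, rcond=None)[0]
    return x

# Noiseless recovery under a well-conditioned random design (illustrative).
rng = np.random.default_rng(0)
m, n, k = 50, 100, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)       # near-unit-norm columns
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k) + 3.0
y = A @ x_true
x_hat = htp(A, y, k)
```

Once the correct support is identified, the least-squares step recovers the nonzero values exactly in the noiseless case, which is the kind of exact-recovery behavior the bounds in the paper characterize.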
no code implementations • 10 Jul 2011 • Bao-Gang Hu, Ran He, Xiaotong Yuan
This work presents a systematic study of objective evaluations of abstaining classifications using Information-Theoretic Measures (ITMs).