no code implementations • NeurIPS 2021 • Haiying Wang, Aonan Zhang, Chong Wang
We first prove that, with imbalanced data, the available information about the unknown parameters is tied only to the relatively small number of positive instances, which justifies the use of negative sampling.
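A minimal sketch of the idea, under illustrative assumptions (the paper's estimator and correction terms are not shown here): keep every positive instance, retain each negative independently at a rate rho, and attach inverse-probability weights so the subsample remains representative of the full data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical imbalanced data: 100 positives among 100,000 instances.
y = np.zeros(100_000, dtype=int)
y[:100] = 1

rho = 0.01  # negative-sampling rate (illustrative choice)

# Keep all positives; keep each negative independently with probability rho.
keep = (y == 1) | (rng.random(y.size) < rho)
w = np.where(y == 1, 1.0, 1.0 / rho)[keep]  # inverse-probability weights
y_sub = y[keep]

# The weighted subsample preserves the negative count in expectation.
print(y_sub.size, w[y_sub == 0].sum())
```

The weighted negative count is an unbiased estimate of the original 99,900 negatives, which is what allows downstream estimation to proceed on the much smaller subsample.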
no code implementations • 3 Oct 2021 • Jinke Wang, Xiangyang Zhang, Peiqing Lv, Lubiao Zhou, Haiying Wang
In the Sliver07 evaluation, the proposed method achieved the best segmentation performance on all five standard metrics.
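For reference, the Sliver07 challenge scores segmentations with five standard metrics (VOE, RVD, ASD, RMSD, MSD). The two overlap-based ones are easy to sketch in NumPy; the three surface-distance metrics need a distance transform over the segmentation boundary and are omitted here.

```python
import numpy as np

def voe(seg, ref):
    """Volumetric Overlap Error (%): 100 * (1 - |A ∩ B| / |A ∪ B|)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    return 100.0 * (1.0 - (seg & ref).sum() / (seg | ref).sum())

def rvd(seg, ref):
    """Relative Volume Difference (%): 100 * (|A| - |B|) / |B|."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    return 100.0 * (seg.sum() - ref.sum()) / ref.sum()

# Toy 2x3 masks: segmentation a vs. reference b.
a = np.array([[1, 1, 0], [0, 1, 0]])
b = np.array([[1, 1, 0], [0, 0, 0]])
print(voe(a, b), rvd(a, b))
```

Here the intersection covers 2 voxels and the union 3, so VOE is 33.3%, and the segmentation is 50% larger than the reference, so RVD is 50%.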
no code implementations • 30 May 2021 • Jianning Wu, Zhuqing Jiang, Shiping Wen, Aidong Men, Haiying Wang
For multimodal tasks, a good feature extraction network should extract information as much as possible and ensure that the extracted feature embedding and other modal feature embedding have an excellent mutual understanding.
no code implementations • 24 May 2021 • Ting Pan, Zhuqing Jiang, Jianan Han, Shiping Wen, Aidong Men, Haiying Wang
We propose a two-branch sequence-to-sequence deep model that disentangles the Taylor feature and the residual feature in video frames via a novel recurrent prediction module (TaylorCell) and a residual module.
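The TaylorCell itself is a learned recurrent module, but the underlying intuition can be sketched in a few lines (toy frames, not the paper's architecture): a first-order Taylor expansion predicts the next frame from the current frame plus its temporal derivative, and whatever that expansion misses is the residual the second branch would model.

```python
import numpy as np

# Two consecutive toy frames; values stand in for pixel intensities.
f_prev = np.array([[0.0, 1.0], [2.0, 3.0]])
f_curr = np.array([[0.5, 1.5], [2.5, 3.5]])

# First-order Taylor prediction: f(t+1) ≈ f(t) + df/dt, with df/dt ≈ f(t) - f(t-1).
f_pred = f_curr + (f_curr - f_prev)

# The residual branch would model what the Taylor term cannot explain;
# in this linear toy motion, the residual vanishes.
f_next_true = np.array([[1.0, 2.0], [3.0, 4.0]])
residual = f_next_true - f_pred
print(f_pred, residual)
```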
no code implementations • 11 Mar 2021 • Jinke Wang, Peiqing Lv, Haiying Wang, Changfa Shi
Background and objective: In this paper, a modified U-Net based framework is presented, which leverages Squeeze-and-Excitation (SE) blocks, Atrous Spatial Pyramid Pooling (ASPP), and residual learning for accurate and robust liver CT segmentation; the effectiveness of the proposed method was tested on the two public datasets LiTS17 and SLiver07.
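A minimal NumPy sketch of the Squeeze-and-Excitation idea, with random weights and hypothetical shapes (the paper's network is a full U-Net, not shown here): global-average-pool each channel, pass the pooled vector through a two-layer bottleneck ending in a sigmoid, and rescale the channels by the resulting gates.

```python
import numpy as np

rng = np.random.default_rng(1)

def se_block(x, w1, w2):
    """x: (C, H, W) feature map; w1: (C//r, C), w2: (C, C//r) with reduction r."""
    s = x.mean(axis=(1, 2))                  # squeeze: one statistic per channel
    z = np.maximum(w1 @ s, 0.0)              # excitation: FC -> ReLU bottleneck
    a = 1.0 / (1.0 + np.exp(-(w2 @ z)))      # FC -> sigmoid gates in (0, 1)
    return x * a[:, None, None]              # rescale each channel by its gate

C, r = 8, 2
x = rng.standard_normal((C, 16, 16))
w1 = rng.standard_normal((C // r, C)) * 0.1  # illustrative random weights
w2 = rng.standard_normal((C, C // r)) * 0.1
out = se_block(x, w1, w2)
print(out.shape)
```

Because every gate lies in (0, 1), the block can only attenuate channels, which is what lets the network emphasize informative feature maps.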
no code implementations • 20 Jan 2021 • Zhuqing Jiang, Chang Liu, Ya'nan Wang, Kai Li, Aidong Men, Haiying Wang, Haiyong Luo
With the goal of tuning up the brightness, low-light image enhancement enjoys numerous applications, such as surveillance, remote sensing and computational photography.
no code implementations • 4 Jan 2021 • Ya'nan Wang, Zhuqing Jiang, Chang Liu, Kai Li, Aidong Men, Haiying Wang
This paper proposes a neural network for multi-level low-light image enhancement, which is user-friendly to meet various requirements by selecting different images as brightness reference.
no code implementations • 3 Jan 2021 • Zhuqing Jiang, Haotian Li, Liangjie Liu, Aidong Men, Haiying Wang
The generated reflectance, which the Retinex model assumes to be independent of illumination, is treated as the enhanced brightness.
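A one-dimensional sketch of the Retinex decomposition assumed here (the smoothing window is an illustrative choice, not the paper's estimator): take illumination as a local average of the observed signal, then take reflectance as their ratio, which is approximately invariant to how brightly the scene was lit.

```python
import numpy as np

def retinex_1d(signal, window=5):
    """Decompose signal I into illumination L (moving average) and reflectance R = I / L."""
    kernel = np.ones(window) / window
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")
    illum = np.convolve(padded, kernel, mode="valid")
    reflect = signal / np.maximum(illum, 1e-6)  # avoid division by zero
    return illum, reflect

# The same surface pattern under dim (x0.2) and bright (x1.0) illumination.
base = np.array([1.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0, 2.0])
_, r_dim = retinex_1d(0.2 * base)
_, r_bright = retinex_1d(1.0 * base)
print(np.allclose(r_dim, r_bright))  # -> True
```

The illumination scale cancels in the ratio, which is exactly why the reflectance can serve as an illumination-free enhanced signal.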
no code implementations • 11 Nov 2020 • Haiying Wang, Jae Kwang Kim
Subsampling is a computationally effective approach to extract information from massive data sets when computing resources are limited.
1 code implementation • 4 Oct 2020 • Haim Bar, Haiying Wang
This paper proposes a procedure to execute external source code from a LaTeX document and automatically include the computed outputs in the resulting Portable Document Format (PDF) file.
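The package's actual interface is not shown here; a hypothetical Python sketch of the underlying workflow (run an external program, splice its stdout into the .tex source before compilation) might look like:

```python
import subprocess
import sys

def splice_output(tex_source, placeholder, command):
    """Run an external command and substitute its stdout for a placeholder in the LaTeX source."""
    result = subprocess.run(command, capture_output=True, text=True, check=True)
    return tex_source.replace(placeholder, result.stdout.strip())

# Illustrative placeholder convention; the real package defines its own markup.
tex = r"The answer is \textbf{%%RESULT%%}."
filled = splice_output(tex, "%%RESULT%%", [sys.executable, "-c", "print(6 * 7)"])
print(filled)
```

The filled source would then be compiled as usual, so the PDF always reflects the latest computed values.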
no code implementations • ICML 2020 • HaiYing Wang
We first derive the asymptotic distribution of the maximum likelihood estimator (MLE) of the unknown parameter, which shows that the asymptotic variance converges to zero at the rate of the inverse of the number of events rather than the inverse of the full data sample size.
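In symbols, writing n for the full sample size and n_1 for the number of (rare) events, the abstract's rate claim is that

```latex
\operatorname{Var}\bigl(\hat{\theta}_{\mathrm{MLE}}\bigr)
= O\!\left(\frac{1}{n_1}\right)
\quad\text{rather than}\quad
O\!\left(\frac{1}{n}\right),
\qquad n_1 \ll n ,
```

so the effective information in the data is governed by the event count, not the nominal sample size.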
no code implementations • 21 May 2020 • Jun Yu, HaiYing Wang, Mingyao Ai, Huiming Zhang
We first derive optimal Poisson subsampling probabilities in the context of quasi-likelihood estimation under the A- and L-optimality criteria.
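A hedged sketch of Poisson subsampling with non-uniform inclusion probabilities. The exact A- and L-optimal formulas in the paper depend on the quasi-likelihood setup; the score used below, pi_i proportional to |y_i - mu_i| * ||x_i||, is an illustrative stand-in of the same flavour. Each instance is included independently with probability pi_i, and included instances get weight 1/pi_i.

```python
import numpy as np

rng = np.random.default_rng(2)

n, d = 10_000, 3
X = rng.standard_normal((n, d))
beta_pilot = np.array([0.5, -0.5, 0.2])           # assume a pilot estimate exists
mu = 1.0 / (1.0 + np.exp(-X @ beta_pilot))        # fitted means from the pilot
y = rng.binomial(1, mu)

r = 500                                           # expected subsample size
score = np.abs(y - mu) * np.linalg.norm(X, axis=1)  # illustrative optimality-style score
pi = np.minimum(1.0, r * score / score.sum())     # inclusion probabilities, capped at 1

included = rng.random(n) < pi                     # Poisson sampling: independent Bernoulli draws
weights = 1.0 / pi[included]                      # inverse-probability weights
print(included.sum(), weights.sum())
```

Unlike sampling with replacement, Poisson subsampling makes a single pass over the data with independent inclusion decisions, which is why the subsample size is random with expectation close to r.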
no code implementations • 5 Sep 2018 • Yishu Xue, Haiying Wang, Jun Yan, Elizabeth D. Schifano
The Cox model, which remains the first choice for analyzing time-to-event data even with large datasets, relies on the proportional hazards assumption.
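For reference, the proportional hazards assumption enters through the Cox partial likelihood, where each observed event contributes the hazard of the failing subject relative to everyone still at risk:

```latex
L(\beta) \;=\; \prod_{i :\, \delta_i = 1}
\frac{\exp\!\left(x_i^{\top}\beta\right)}
     {\sum_{j \in R(t_i)} \exp\!\left(x_j^{\top}\beta\right)},
```

where delta_i indicates an observed event and R(t_i) is the risk set at time t_i. Because the baseline hazard cancels in each ratio, the model is valid only when covariate effects scale hazards proportionally over time.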
no code implementations • 6 Apr 2017 • Terence Fusco, Yaxin Bi, Haiying Wang, Fiona Browne
Collecting epidemic disease data for our analysis is a labour-intensive, time-consuming, and expensive process, which results in sparse sample data from which we develop prediction models.
no code implementations • 3 Feb 2017 • HaiYing Wang, Rong Zhu, Ping Ma
In this paper, we propose fast subsampling algorithms to efficiently approximate the maximum likelihood estimate in logistic regression.
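A hedged NumPy sketch of the general two-step subsampling recipe (uniform pilot subsample, informative resampling, weighted MLE). The sampling probabilities below, proportional to |y_i - p_i| * ||x_i||, follow the flavour of such optimality-based schemes rather than the paper's exact formulas.

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_logistic(X, y, w=None, iters=30):
    """Weighted logistic-regression MLE via Newton-Raphson."""
    n, d = X.shape
    w = np.ones(n) if w is None else w
    beta = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (w * (y - p))                       # weighted score
        hess = (X * (w * p * (1.0 - p))[:, None]).T @ X  # weighted information
        beta = beta + np.linalg.solve(hess, grad)
    return beta

# Full (simulated) data set with a known coefficient vector.
n, beta_true = 50_000, np.array([0.5, -0.5])
X = rng.standard_normal((n, 2))
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

# Step 1: pilot estimate from a small uniform subsample.
pilot = rng.choice(n, 1_000, replace=False)
beta_pilot = fit_logistic(X[pilot], y[pilot])

# Step 2: resample with informative probabilities, then solve a weighted MLE.
p_hat = 1.0 / (1.0 + np.exp(-X @ beta_pilot))
score = np.abs(y - p_hat) * np.linalg.norm(X, axis=1)
pi = score / score.sum()
idx = rng.choice(n, 2_000, replace=True, p=pi)
beta_hat = fit_logistic(X[idx], y[idx], w=1.0 / pi[idx])
print(beta_hat)
```

Only 2,000 of the 50,000 rows enter the final fit, yet the inverse-probability weights keep the subsample estimator consistent for the full-data MLE.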