no code implementations • 4 Apr 2024 • Haiyun He, Christina Lee Yu, Ziv Goldfeld
This enables refining our generalization bounds to capture the contraction as a function of the network architecture parameters.
no code implementations • 15 Oct 2022 • Haiyun He, Gholamali Aminian, Yuheng Bu, Miguel Rodrigues, Vincent Y. F. Tan
Our findings offer new insights that the generalization performance of SSL with pseudo-labeling is affected not only by the information between the output hypothesis and input training data but also by the information shared between the labeled and pseudo-labeled data samples.
1 code implementation • 3 Oct 2021 • Haiyun He, Hanshu Yan, Vincent Y. F. Tan
Using information-theoretic principles, we consider the generalization error (gen-error) of iterative semi-supervised learning (SSL) algorithms that iteratively generate pseudo-labels for a large amount of unlabelled data to progressively refine the model parameters.
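The iterative pseudo-labeling loop described above can be sketched in a few lines. This is a minimal illustrative implementation, not the paper's algorithm or analysis: it assumes a plain logistic-regression base learner trained by gradient descent, a hypothetical confidence threshold for accepting pseudo-labels, and synthetic two-cluster Gaussian data; all function names and parameters here are illustrative choices.

```python
import numpy as np

def fit_logreg(X, y, lr=0.1, steps=200):
    # Gradient-descent logistic regression (stand-in base learner, not the paper's model).
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = p - y  # gradient of the logistic loss w.r.t. the logits
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def pseudo_label_ssl(X_lab, y_lab, X_unl, rounds=3, threshold=0.9):
    # Each round: fit on current training set, pseudo-label confident unlabelled
    # points, and retrain on labeled + pseudo-labeled data.
    X_train, y_train = X_lab, y_lab
    w, b = fit_logreg(X_train, y_train)
    for _ in range(rounds):
        p = 1.0 / (1.0 + np.exp(-(X_unl @ w + b)))
        conf = np.maximum(p, 1.0 - p) >= threshold  # keep only confident predictions
        X_train = np.vstack([X_lab, X_unl[conf]])
        y_train = np.concatenate([y_lab, (p[conf] >= 0.5).astype(float)])
        w, b = fit_logreg(X_train, y_train)
    return w, b

# Synthetic demo: few labeled points, many unlabelled, two Gaussian clusters.
rng = np.random.default_rng(0)
X_lab = np.vstack([rng.normal(-2, 1, (5, 2)), rng.normal(2, 1, (5, 2))])
y_lab = np.array([0.0] * 5 + [1.0] * 5)
X_unl = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
w, b = pseudo_label_ssl(X_lab, y_lab, X_unl)
```

The thresholding step is the usual heuristic guard against confirmation bias: low-confidence pseudo-labels are excluded from retraining so that early mistakes are less likely to be reinforced across rounds.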
no code implementations • 29 Sep 2021 • Haiyun He, Hanshu Yan, Vincent Tan
We consider iterative semi-supervised learning (SSL) algorithms that iteratively generate pseudo-labels for a large amount of unlabelled data to progressively refine the model parameters.
no code implementations • 13 Mar 2020 • Haiyun He, Qiaosheng Zhang, Vincent Y. F. Tan
This paper investigates a novel offline change-point detection problem from an information-theoretic perspective.