no code implementations • 1 Jan 2021 • Lei Wang, Runtian Zhai, Di He, LiWei Wang, Li Jian
For certification, we carefully allocate specific robust regions for each test sample.
no code implementations • 1 Jan 2021 • Jin Xu, Xu Tan, Renqian Luo, Kaitao Song, Li Jian, Tao Qin, Tie-Yan Liu
NAS-BERT trains a large supernet on a carefully designed search space containing a variety of architectures, and outputs multiple compressed models with adaptive sizes and latency.
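A minimal sketch of the weight-sharing supernet idea behind this description, under illustrative assumptions (the toy layer sizes, the candidate widths, and the `ElasticLinear`/`ToySupernet` names are placeholders, not the NAS-BERT implementation): one over-parameterized network is trained while sub-architectures are sampled at each step, so that differently sized sub-models can later be extracted from the shared weights.

```python
# Toy weight-sharing supernet with elastic hidden width (illustrative, not NAS-BERT code).
import random
import torch
import torch.nn as nn

class ElasticLinear(nn.Module):
    """Linear layer whose output width can be sliced at sample time."""
    def __init__(self, in_features, max_out):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(max_out, in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(max_out))

    def forward(self, x, out_width):
        return x @ self.weight[:out_width].t() + self.bias[:out_width]

class ToySupernet(nn.Module):
    def __init__(self, in_dim=32, max_hidden=128, num_classes=10):
        super().__init__()
        self.fc1 = ElasticLinear(in_dim, max_hidden)
        # Output weights are also sliced to match the sampled hidden width.
        self.w2 = nn.Parameter(torch.randn(num_classes, max_hidden) * 0.02)
        self.b2 = nn.Parameter(torch.zeros(num_classes))

    def forward(self, x, hidden_width):
        h = torch.relu(self.fc1(x, hidden_width))
        return h @ self.w2[:, :hidden_width].t() + self.b2

# One training step: sample a sub-architecture (here just a hidden width),
# run the shared weights at that width, and update them.
model = ToySupernet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
width = random.choice([32, 64, 96, 128])  # candidate sub-model sizes (placeholder values)
loss = nn.functional.cross_entropy(model(x, width), y)
loss.backward()
opt.step()
```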
no code implementations • 1 Jan 2021 • Pei Yingjun, Hou Xinwen, Li Jian, Lei Wang
We also show that our method achieves better performance than the variational information bottleneck (VIB) and mutual information neural estimation (MINE), two other popular approaches to optimizing the information bottleneck framework in supervised learning.
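For context, a minimal sketch of the VIB baseline mentioned above (the network sizes and the `beta` weight are placeholder values, not taken from the paper): an encoder outputs a Gaussian over the bottleneck variable Z, and the loss trades classification accuracy against a KL penalty that upper-bounds the compression term I(X; Z).

```python
# Minimal variational information bottleneck (VIB) sketch in PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIB(nn.Module):
    def __init__(self, in_dim=784, z_dim=32, num_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, 2 * z_dim))  # mean and log-variance
        self.decoder = nn.Linear(z_dim, num_classes)

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        # KL(q(z|x) || N(0, I)) upper-bounds the compression term I(X; Z).
        kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=-1).mean()
        return self.decoder(z), kl

model = VIB()
x, y = torch.randn(16, 784), torch.randint(0, 10, (16,))
logits, kl = model(x)
beta = 1e-3  # weight of the compression penalty (placeholder value)
loss = F.cross_entropy(logits, y) + beta * kl
loss.backward()
```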
no code implementations • 23 Oct 2018 • Li Jian
Under the frequency domain paradigm, we propose a global inhibition model that mimics this process by suppressing the non-saliency in the input image; we also show that the dynamic process is influenced by a single parameter in the frequency domain.
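A rough sketch of what suppressing non-salient content in the frequency domain can look like, in the generic spectral-residual style of saliency detection (not necessarily the paper's exact model; the `inhibition` strength, kernel size, and function name are illustrative): the log-amplitude spectrum is locally averaged, the averaged part is treated as non-salient background and subtracted, and the result is transformed back to an image-space saliency map.

```python
# Spectral-residual-style saliency sketch (illustrative, not the paper's exact model).
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def frequency_domain_saliency(image, inhibition=1.0, kernel=3):
    """image: 2-D grayscale array; inhibition: strength of non-saliency suppression."""
    spectrum = np.fft.fft2(image)
    log_amp = np.log(np.abs(spectrum) + 1e-8)
    phase = np.angle(spectrum)
    # The locally averaged log-amplitude approximates the "non-salient" background;
    # subtracting a scaled version of it acts as a global inhibition in frequency space.
    residual = log_amp - inhibition * uniform_filter(log_amp, size=kernel)
    recon = np.fft.ifft2(np.exp(residual + 1j * phase))
    saliency = gaussian_filter(np.abs(recon) ** 2, sigma=2.0)
    return saliency / saliency.max()

# Example usage on a random image
saliency_map = frequency_domain_saliency(np.random.rand(128, 128))
```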