no code implementations • 9 Feb 2023 • Jiangshe Zhang, Lizhen Ji, Fei Gao, Mengyao Li
A crucial assumption underlying most current machine learning theory is that the training distribution is identical to the test distribution.
no code implementations • 9 Feb 2023 • Jiangshe Zhang, Lizhen Ji, Meng Wang
In this paper, we propose an information-theoretic, importance-sampling-based approach to clustering (ITISC), which minimizes the worst-case expected distortion under a constraint on the deviation of the data distribution.
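The abstract does not spell out the ITISC updates, but the stated objective (worst-case expected distortion under a bounded distribution deviation) resembles distributionally robust clustering. The sketch below is an illustrative assumption, not the paper's algorithm: it alternates hard assignment, an adversarial reweighting of points via exponential tilting (the closed form for a KL-ball adversary), and weighted center updates. The function name `robust_kmeans` and the temperature `lam` are hypothetical.

```python
import numpy as np

def robust_kmeans(X, k, lam=10.0, iters=50):
    """Illustrative distributionally robust k-means (NOT the exact
    ITISC updates). Larger lam -> closer to standard k-means."""
    centers = X[[0, len(X) - 1]].astype(float)  # simple init: two data points
    assign = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # squared distance of each point to each center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(1)
        dist = d2[np.arange(len(X)), assign]
        # adversarial weights: up-weight high-distortion points
        w = np.exp((dist - dist.max()) / lam)
        w /= w.sum()
        for j in range(k):
            m = assign == j
            if m.any():
                centers[j] = np.average(X[m], axis=0, weights=w[m])
    return centers, assign

# two well-separated Gaussian blobs as toy data
X = np.vstack([np.random.default_rng(1).normal(0.0, 1.0, (50, 2)),
               np.random.default_rng(2).normal(5.0, 1.0, (50, 2))])
centers, assign = robust_kmeans(X, k=2, lam=10.0)
```

The tilting step is the only departure from ordinary k-means: points with large distortion receive exponentially larger weight, so the centers hedge against the worst-case reweighting of the data rather than the empirical average alone.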
1 code implementation • 29 Dec 2020 • Shuang Xu, Lizhen Ji, Zhe Wang, Pengfei Li, Kai Sun, Chunxia Zhang, Jiangshe Zhang
Based on the idea that each local region in the fused image should resemble the sharpest corresponding region among the source images, this paper presents an optimization-based approach that reduces defocus spread effects.