Search Results for author: Jiangfan Han

Found 3 papers, 0 papers with code

Fixing the Teacher-Student Knowledge Discrepancy in Distillation

no code implementations31 Mar 2021 Jiangfan Han, Mengya Gao, Yujie Wang, Quanquan Li, Hongsheng Li, Xiaogang Wang

To solve this problem, in this paper, we propose a novel student-dependent distillation method, knowledge consistent distillation, which makes the teacher's knowledge more consistent with the student and provides the most suitable knowledge to different student networks for distillation.

Image Classification · Knowledge Distillation +2
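For context, the standard distillation objective this paper builds on can be sketched as follows: the student is trained to match the teacher's temperature-softened output distribution. This is the generic Hinton-style KD loss, not the paper's knowledge-consistent variant; the logits and temperature below are made-up illustration values.

```python
# Minimal sketch of the standard knowledge-distillation loss
# (Hinton-style KD), which the paper's method refines: the student
# mimics the teacher's temperature-softened output distribution.
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    z -= z.max()                      # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, T)    # soft targets from the teacher
    q = softmax(student_logits, T)
    # T*T rescaling keeps gradient magnitudes comparable across temperatures.
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

teacher = [5.0, 2.0, 0.5]             # hypothetical teacher logits
student = [4.0, 2.5, 0.3]             # hypothetical student logits
loss = kd_loss(teacher, student)
```

The loss is zero only when the student reproduces the teacher's soft distribution exactly; the paper's point is that a fixed teacher distribution may not be the best target for every student architecture.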

Once a MAN: Towards Multi-Target Attack via Learning Multi-Target Adversarial Network

no code implementations ICCV 2019 Jiangfan Han, Xiaoyi Dong, Ruimao Zhang, Dong-Dong Chen, Weiming Zhang, Nenghai Yu, Ping Luo, Xiaogang Wang

Recently, generation-based methods have received much attention since they directly use feed-forward networks to generate adversarial samples, avoiding the time-consuming iterative attack procedure of optimization-based and gradient-based methods.

Classification · General Classification
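The generation-based idea described above can be sketched in a few lines: a trained generator maps a clean input (plus a target-class code, in the multi-target setting) to a bounded perturbation in a single forward pass, instead of running many gradient iterations per image. The generator here is a stand-in linear layer with random weights, not the paper's MAN architecture, and all shapes are illustrative.

```python
# Sketch of a generation-based multi-target attack: one forward pass of
# a generator G produces the perturbation. G is a toy linear map here,
# NOT the paper's Multi-target Adversarial Network.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 6)) * 0.1   # toy generator weights (hypothetical)

def generate_adversarial(x, target_onehot, eps=0.1):
    """One feed-forward pass: delta = eps * tanh(W @ [x; target])."""
    z = np.concatenate([x, target_onehot])  # condition on the target class
    delta = eps * np.tanh(W @ z)            # tanh enforces |delta| <= eps
    return x + delta, delta

x = np.array([0.2, -0.5, 0.7, 0.1])     # toy clean input
target = np.array([0.0, 1.0])           # one-hot code for the attack target
x_adv, delta = generate_adversarial(x, target)
```

The tanh squashing gives an L-infinity bound of `eps` by construction, which is why a single pass suffices at test time once the generator has been trained.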

Deep Self-Learning From Noisy Labels

no code implementations ICCV 2019 Jiangfan Han, Ping Luo, Xiaogang Wang

Unlike previous works that are constrained by many conditions, making them infeasible in real noisy settings, this work presents a novel deep self-learning framework to train a robust network on real noisy datasets without extra supervision.

Learning with noisy labels · Self-Learning
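The self-learning idea can be illustrated with a toy label-correction step: rather than trusting the given (noisy) labels, the network periodically re-labels each sample from its own representations, for example by distance to per-class prototypes. The nearest-prototype rule and the 2-D features below are an illustrative stand-in, not the paper's exact correction scheme.

```python
# Toy sketch of self-learning label correction: re-assign each sample to
# the class of its nearest feature-space prototype. Features and labels
# are made-up illustration data.
import numpy as np

def correct_labels(features, noisy_labels, n_classes):
    """Return pseudo-labels from nearest class prototypes."""
    feats = np.asarray(features, dtype=float)
    labels = np.asarray(noisy_labels)
    # Prototype per class: mean feature of samples currently labeled c.
    protos = np.stack([feats[labels == c].mean(axis=0)
                       for c in range(n_classes)])
    # Euclidean distance from every sample to every prototype.
    dists = np.linalg.norm(feats[:, None, :] - protos[None, :, :], axis=2)
    return dists.argmin(axis=1)          # corrected pseudo-labels

feats = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
noisy = np.array([0, 1, 1, 1])           # sample 1 carries a wrong label
corrected = correct_labels(feats, noisy, n_classes=2)
```

In practice such correction alternates with network training, so the features and prototypes improve together over iterations.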
