no code implementations • 7 May 2024 • Hamed Hemati, Lorenzo Pellegrini, Xiaotian Duan, Zixuan Zhao, Fangfang Xia, Marc Masana, Benedikt Tscheschner, Eduardo Veas, Yuxiang Zheng, Shiji Zhao, Shao-Yuan Li, Sheng-Jun Huang, Vincenzo Lomonaco, Gido M. van de Ven
Continual learning (CL) provides a framework for training models in ever-evolving environments.
no code implementations • 7 May 2023 • Wenhai Wan, Xinrui Wang, Ming-Kun Xie, Shao-Yuan Li, Sheng-Jun Huang, Songcan Chen
Learning from noisy data has attracted much attention, with most existing methods focusing on closed-set label noise.
1 code implementation • 3 Sep 2022 • Chen-Chen Zong, Zheng-Tao Cao, Hong-Tao Guo, Yun Du, Ming-Kun Xie, Shao-Yuan Li, Sheng-Jun Huang
Deep neural networks trained with the standard cross-entropy loss are prone to memorizing noisy labels, which degrades their performance.
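As a hypothetical illustration of why cross-entropy is fragile under label noise (this is not the paper's method), the sketch below compares cross-entropy with a bounded loss such as mean absolute error on a confidently mislabeled example; cross-entropy is unbounded, so a single wrong label can dominate training:

```python
import numpy as np

def cross_entropy(p, y):
    # CE is unbounded: the loss grows without limit as the predicted
    # probability of the (possibly wrong) label approaches 0
    return -np.log(p[y])

def mae(p, y):
    # MAE against the one-hot label is bounded in [0, 2], limiting
    # the influence any single noisy label can exert
    one_hot = np.zeros_like(p)
    one_hot[y] = 1.0
    return np.abs(p - one_hot).sum()

# Model is confident in class 0, but the (noisy) label says class 1
p = np.array([0.98, 0.01, 0.01])
print(cross_entropy(p, 1))  # large: -log(0.01) ~ 4.61
print(mae(p, 1))            # bounded: 1.98
```

The bounded loss caps the penalty on the mislabeled example, which is one common intuition behind noise-robust losses.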
no code implementations • 11 Jul 2021 • Ye Shi, Shao-Yuan Li, Sheng-Jun Huang
Traditional supervised learning requires ground-truth labels for the training data, which can be difficult to collect in many cases.
no code implementations • 4 Aug 2015 • Shao-Yuan Li, Yuan Jiang, Zhi-Hua Zhou
Multi-label active learning reduces labeling cost by choosing the most valuable instances and querying their labels from an oracle.
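A minimal sketch of the query-selection idea, assuming a simple uncertainty-sampling strategy (the names and scoring rule here are illustrative, not the paper's algorithm): score each instance-label pair by how close its predicted probability is to the 0.5 decision boundary, then query the most uncertain pair.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical pool: predicted label probabilities for 5 unlabeled
# instances over 3 labels (any probabilistic multi-label model works)
probs = rng.random((5, 3))

# Uncertainty per (instance, label) pair: 1 at probability 0.5,
# 0 at probability 0 or 1
uncertainty = 1.0 - 2.0 * np.abs(probs - 0.5)

# Query the single most uncertain instance-label pair from the oracle
i, j = np.unravel_index(np.argmax(uncertainty), uncertainty.shape)
print(f"query label {j} of instance {i}")
```

After the oracle answers, the pair is added to the labeled set, the model is retrained, and the loop repeats until the labeling budget is spent.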