Search Results for author: Sheng-Jun Huang

Found 32 papers, 11 papers with code

Cost-effectively Identifying Causal Effect When Only Response Variable Observable

no code implementations ICML 2020 Tian-Zuo Wang, Xi-Zhu Wu, Sheng-Jun Huang, Zhi-Hua Zhou

In many real tasks, we care about how to make decisions, not merely how to predict an event, e.g., how to increase the revenue next month instead of just knowing that it will drop.

Decision Making

Counterfactual Reasoning for Multi-Label Image Classification via Patching-Based Training

no code implementations 9 Apr 2024 Ming-Kun Xie, Jia-Hao Xiao, Pei Peng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang

In this paper, we provide a causal inference framework to show that the correlative features caused by the target object and its co-occurring objects can be regarded as a mediator, which has both positive and negative impacts on model predictions.

Causal Inference counterfactual +3

Bidirectional Uncertainty-Based Active Learning for Open Set Annotation

no code implementations 23 Feb 2024 Chen-Chen Zong, Ye-Wen Wang, Kun-Peng Ning, Haibo Ye, Sheng-Jun Huang

In this paper, we attempt to query examples that are both likely from known classes and highly informative, and propose a Bidirectional Uncertainty-based Active Learning (BUAL) framework.

Active Learning
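
To make the "likely known yet highly informative" criterion above concrete, here is a rough scoring sketch: a simplified illustration that combines known-class confidence with predictive entropy, with assumed names and score combination, not the exact BUAL procedure.

```python
import numpy as np

def query_scores(probs_known, eps=1e-12):
    """probs_known: (n, k) predicted probabilities over the k known classes.
    Score is high for samples that look like a known class (high max probability)
    yet remain ambiguous among known classes (high entropy)."""
    confidence = probs_known.max(axis=1)                               # likely from a known class
    entropy = -(probs_known * np.log(probs_known + eps)).sum(axis=1)   # informativeness
    return confidence * entropy

# query the top-b unlabeled examples:
# batch = np.argsort(-query_scores(probs_unlabeled))[:b]
```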

Empowering Language Models with Active Inquiry for Deeper Understanding

no code implementations 6 Feb 2024 Jing-Cheng Pang, Heng-Bo Fan, Pengyuan Wang, Jia-Hao Xiao, Nan Tang, Si-Hang Yang, Chengxing Jia, Sheng-Jun Huang, Yang Yu

The rise of large language models (LLMs) has revolutionized the way that we interact with artificial intelligence systems through natural language.

Active Learning Language Modelling +1

Dirichlet-Based Prediction Calibration for Learning with Noisy Labels

1 code implementation 13 Jan 2024 Chen-Chen Zong, Ye-Wen Wang, Ming-Kun Xie, Sheng-Jun Huang

Learning with noisy labels can significantly hinder the generalization performance of deep neural networks (DNNs).

Learning with noisy labels Translation

Improving Lens Flare Removal with General Purpose Pipeline and Multiple Light Sources Recovery

1 code implementation 31 Aug 2023 Yuyan Zhou, Dong Liang, Songcan Chen, Sheng-Jun Huang, Shuo Yang, Chongyi Li

In this paper, we propose a solution to improve the performance of lens flare removal by revisiting the ISP, remodeling the principle of automatic exposure in the synthesis pipeline, and designing a more reliable light source recovery strategy.

Flare Removal Tone Mapping

Multi-Label Knowledge Distillation

1 code implementation ICCV 2023 Penghui Yang, Ming-Kun Xie, Chen-Chen Zong, Lei Feng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang

Existing knowledge distillation methods typically work by imparting the knowledge of output logits or intermediate feature maps from the teacher network to the student network, which is very successful in multi-class single-label learning.

Binary Classification Knowledge Distillation +1
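
For reference, logit-level distillation in the multi-label setting can be sketched as a per-label binary distillation term added to the usual BCE loss; a minimal illustration with assumed temperature and weighting, not the paper's full method.

```python
import torch
import torch.nn.functional as F

def multilabel_kd_loss(student_logits, teacher_logits, targets, T=2.0, alpha=0.5):
    """Per-label BCE against the ground truth plus a distillation term that
    matches the teacher's per-label sigmoid probabilities."""
    hard = F.binary_cross_entropy_with_logits(student_logits, targets.float())
    soft_teacher = torch.sigmoid(teacher_logits / T)
    soft_student = torch.sigmoid(student_logits / T)
    distill = F.binary_cross_entropy(soft_student, soft_teacher)
    return (1 - alpha) * hard + alpha * (T ** 2) * distill
```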

ALL-E: Aesthetics-guided Low-light Image Enhancement

no code implementations 28 Apr 2023 Ling Li, Dong Liang, Yuanhang Gao, Sheng-Jun Huang, Songcan Chen

In this paper, we propose a new paradigm, i.e., aesthetics-guided low-light image enhancement (ALL-E), which introduces aesthetic preferences to LLE and motivates training in a reinforcement learning framework with an aesthetic reward.

Low-Light Image Enhancement valid

Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks

no code implementations 3 Mar 2023 Ye Li, Song-Can Chen, Sheng-Jun Huang

Physics-informed neural networks (PINNs) have been shown to be effective at solving forward and inverse differential equation problems, but they still suffer from training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
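
For context, a physics-informed loss for a toy problem can be sketched as below: a hypothetical ODE with plain Adam training, i.e., standard explicit updates rather than the implicit stochastic gradient descent studied in the paper; the network size and problem are arbitrary.

```python
import torch

# Toy PINN for the ODE u'(x) = cos(x) with u(0) = 0 (illustrative problem only).
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    x = (2 * torch.pi * torch.rand(128, 1)).requires_grad_(True)  # collocation points
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]    # du/dx via autograd
    loss = (du - torch.cos(x)).pow(2).mean() \
         + net(torch.zeros(1, 1)).pow(2).mean()                   # residual + boundary terms
    opt.zero_grad(); loss.backward(); opt.step()
```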

MUS-CDB: Mixed Uncertainty Sampling with Class Distribution Balancing for Active Annotation in Aerial Object Detection

1 code implementation 6 Dec 2022 Dong Liang, Jing-Wei Zhang, Ying-Peng Tang, Sheng-Jun Huang

However, existing active learning methods mainly assume class-balanced settings and image-based querying for generic object detection tasks, which makes them less applicable to aerial object detection scenarios with long-tailed class distributions and dense small objects.

Active Object Detection Informativeness +3
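
A greatly simplified way to mix uncertainty with class-distribution balancing is sketched below: an illustrative heuristic that weights each candidate object's uncertainty by the inverse frequency of its predicted class in the labeled pool; the weighting and names are assumptions, not the MUS-CDB algorithm itself.

```python
import numpy as np

def balanced_uncertainty(uncertainty, pred_class, labeled_class_counts):
    """uncertainty: (n,) per-object uncertainty scores;
    pred_class: (n,) predicted class ids;
    labeled_class_counts: (num_classes,) counts of already-labeled objects per class."""
    inv_freq = 1.0 / (labeled_class_counts + 1.0)   # rarer classes get larger weights
    return uncertainty * inv_freq[pred_class]        # favor uncertain, under-represented classes

# query = np.argsort(-balanced_uncertainty(u, c, counts))[:budget]
```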

Noise-Robust Bidirectional Learning with Dynamic Sample Reweighting

1 code implementation 3 Sep 2022 Chen-Chen Zong, Zheng-Tao Cao, Hong-Tao Guo, Yun Du, Ming-Kun Xie, Shao-Yuan Li, Sheng-Jun Huang

Deep neural networks trained with standard cross-entropy loss are more prone to memorize noisy labels, which degrades their performance.
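
One common way to instantiate dynamic sample reweighting is sketched below: a generic small-loss-style heuristic that down-weights high-loss examples, which are more likely to carry noisy labels. It is shown for illustration only and is not necessarily the exact scheme used in this paper.

```python
import torch
import torch.nn.functional as F

def reweighted_ce(logits, targets, temperature=1.0):
    """Cross-entropy where each sample's weight decays with its own loss,
    so likely-noisy (high-loss) samples contribute less to the update."""
    per_sample = F.cross_entropy(logits, targets, reduction="none")
    weights = torch.softmax(-per_sample / temperature, dim=0).detach()  # small-loss samples dominate
    return (weights * per_sample).sum()
```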

Meta Objective Guided Disambiguation for Partial Label Learning

no code implementations 26 Aug 2022 Bo-Shi Zou, Ming-Kun Xie, Sheng-Jun Huang

In this paper, we propose a novel framework for partial label learning with meta objective guided disambiguation (MoGD), which aims to recover the ground-truth label from the candidate label set by solving a meta objective on a small validation set.

Partial Label Learning valid +1

A Deep Model for Partial Multi-Label Image Classification with Curriculum Based Disambiguation

no code implementations 6 Jul 2022 Feng Sun, Ming-Kun Xie, Sheng-Jun Huang

In this paper, we study the partial multi-label (PML) image classification problem, where each image is annotated with a candidate label set consisting of multiple relevant labels and other noisy labels.

Multi-Label Image Classification

Can Adversarial Training Be Manipulated By Non-Robust Features?

1 code implementation 31 Jan 2022 Lue Tao, Lei Feng, Hongxin Wei, JinFeng Yi, Sheng-Jun Huang, Songcan Chen

Under this threat, we show that adversarial training using a conventional defense budget $\epsilon$ provably fails to provide test robustness in a simple statistical setting, where the non-robust features of the training data can be reinforced by $\epsilon$-bounded perturbation.
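
For context, the $\epsilon$-bounded adversarial training referred to here is typically implemented with a projected-gradient inner maximization; below is a minimal PGD-style sketch with illustrative hyperparameters (standard PGD adversarial training under an L-infinity budget, not the paper's specific threat model; input clipping to [0, 1] is omitted for brevity).

```python
import torch
import torch.nn.functional as F

def pgd_adversarial_loss(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Craft an L-infinity perturbation bounded by eps, then return the adversarial loss."""
    delta = torch.empty_like(x).uniform_(-eps, eps).requires_grad_(True)
    for _ in range(steps):
        loss = F.cross_entropy(model(x + delta), y)
        grad = torch.autograd.grad(loss, delta)[0]
        delta = (delta + alpha * grad.sign()).clamp(-eps, eps).detach().requires_grad_(True)
    return F.cross_entropy(model(x + delta), y)   # backpropagate this to train the model
```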

Active Learning for Open-set Annotation

1 code implementation CVPR 2022 Kun-Peng Ning, Xun Zhao, Yu Li, Sheng-Jun Huang

To tackle this open-set annotation (OSA) problem, we propose a new active learning framework called LfOSA, which boosts the classification performance with an effective sampling strategy to precisely detect examples from known classes for annotation.

Active Learning
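
As a rough illustration of querying examples that are likely from known classes, the sketch below ranks unlabeled samples by their maximum activation over the known classes; this is a simplified confidence-based stand-in, not the actual LfOSA sampling strategy.

```python
import numpy as np

def select_known_candidates(logits_known, budget):
    """logits_known: (n, k) model activations over the k known classes for unlabeled data.
    High maximum activation suggests the sample belongs to a known class."""
    max_act = logits_known.max(axis=1)
    return np.argsort(-max_act)[:budget]   # indices to send to annotators
```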

Learning from Crowds with Sparse and Imbalanced Annotations

no code implementations 11 Jul 2021 Ye Shi, Shao-Yuan Li, Sheng-Jun Huang

Traditional supervised learning requires ground truth labels for the training data, whose collection can be difficult in many cases.

Image Classification

CCMN: A General Framework for Learning with Class-Conditional Multi-Label Noise

no code implementations 16 May 2021 Ming-Kun Xie, Sheng-Jun Huang

Class-conditional noise commonly exists in machine learning tasks, where the class label is corrupted with a probability depending on its ground-truth.

Multi-Label Learning
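
Class-conditional noise of this kind is usually described by a transition matrix T with T[i, j] = P(observed label = j | true label = i). A tiny sketch of corrupting labels with such a matrix is shown below (single-label case for simplicity; the matrix values are made up).

```python
import numpy as np

def corrupt_labels(y, T, rng=np.random.default_rng(0)):
    """y: (n,) ground-truth class ids; T: (c, c) row-stochastic noise transition matrix.
    Each true label i is replaced by j with probability T[i, j]."""
    return np.array([rng.choice(len(T), p=T[label]) for label in y])

T = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # e.g., class 0 flips to class 1 with probability 0.1
noisy = corrupt_labels(np.array([0, 0, 1, 1]), T)
```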

Co-Imitation Learning without Expert Demonstration

no code implementations 27 Mar 2021 Kun-Peng Ning, Hu Xu, Kun Zhu, Sheng-Jun Huang

Imitation learning is a primary approach to improving the efficiency of reinforcement learning by exploiting expert demonstrations.

Imitation Learning

Improving Model Robustness by Adaptively Correcting Perturbation Levels with Active Queries

no code implementations 27 Mar 2021 Kun-Peng Ning, Lue Tao, Songcan Chen, Sheng-Jun Huang

Recently, much research has been devoted to improving model robustness by training with noise perturbations.

Active Learning

Better Safe Than Sorry: Preventing Delusive Adversaries with Adversarial Training

2 code implementations NeurIPS 2021 Lue Tao, Lei Feng, JinFeng Yi, Sheng-Jun Huang, Songcan Chen

Delusive attacks aim to substantially deteriorate the test accuracy of the learning model by slightly perturbing the features of correctly labeled training examples.

Reinforcement Learning with Supervision from Noisy Demonstrations

no code implementations 14 Jun 2020 Kun-Peng Ning, Sheng-Jun Huang

In this paper, we propose a novel framework to adaptively learn the policy by jointly interacting with the environment and exploiting the expert demonstrations.

Reinforcement Learning (RL)

ALiPy: Active Learning in Python

3 code implementations 12 Jan 2019 Ying-Peng Tang, Guo-Xiang Li, Sheng-Jun Huang

Supervised machine learning methods usually require a large set of labeled examples for model training.

Active Learning
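
A pool-based active learning loop of the kind ALiPy automates looks roughly like the sketch below; this is a generic scikit-learn illustration for orientation, not ALiPy's own API (see the repository for the actual toolbox interfaces).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def uncertainty_sampling_loop(X, y, labeled_idx, budget, rounds):
    """Repeatedly train on the labeled pool and query the least confident unlabeled samples."""
    labeled = set(labeled_idx)
    clf = None
    for _ in range(rounds):
        idx = np.array(sorted(labeled))
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        unlabeled = np.array([i for i in range(len(X)) if i not in labeled])
        confidence = clf.predict_proba(X[unlabeled]).max(axis=1)
        query = unlabeled[np.argsort(confidence)[:budget]]   # least confident first
        labeled.update(query.tolist())                       # the oracle supplies y[query]
    return clf, sorted(labeled)
```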

Recent Advances in Open Set Recognition: A Survey

no code implementations 21 Nov 2018 Chuanxing Geng, Sheng-Jun Huang, Songcan Chen

A more realistic scenario is open set recognition (OSR), where incomplete knowledge of the world exists at training time, and unknown classes can be submitted to an algorithm during testing, requiring the classifiers to not only accurately classify the seen classes, but also effectively deal with the unseen ones.

General Classification Open Set Learning
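
A common baseline in this literature is to reject inputs whose maximum softmax probability falls below a threshold; a minimal sketch follows (threshold and shapes are illustrative, and this is only one of the many OSR approaches the survey covers).

```python
import numpy as np

def open_set_predict(probs, threshold=0.5, unknown_label=-1):
    """probs: (n, k) softmax probabilities over the k seen classes.
    Predict a seen class when confident enough, otherwise flag the input as unknown."""
    pred = probs.argmax(axis=1)
    pred[probs.max(axis=1) < threshold] = unknown_label
    return pred
```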

Active Feature Acquisition with Supervised Matrix Completion

no code implementations 15 Feb 2018 Sheng-Jun Huang, Miao Xu, Ming-Kun Xie, Masashi Sugiyama, Gang Niu, Songcan Chen

Feature missing is a serious problem in many applications, which may lead to low quality of training data and further significantly degrade the learning performance.

Matrix Completion
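
For orientation, a bare-bones low-rank imputation of a partially observed feature matrix is sketched below: an iterative rank-r SVD fill-in used as a generic stand-in for the supervised matrix completion studied here; the rank and iteration counts are arbitrary.

```python
import numpy as np

def svd_impute(X, mask, rank=5, iters=50):
    """X: feature matrix with arbitrary values where mask == 0 (missing entries);
    mask: boolean/0-1 matrix of observed entries. Returns a completed matrix."""
    Z = np.where(mask, X, 0.0)                           # initialize missing entries with 0
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        Z_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]     # rank-r approximation
        Z = np.where(mask, X, Z_low)                     # keep observed entries, refill missing ones
    return Z
```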

Cost-Effective Training of Deep CNNs with Active Model Adaptation

no code implementations 15 Feb 2018 Sheng-Jun Huang, Jia-Wei Zhao, Zhao-Yang Liu

Deep convolutional neural networks have achieved great success in various applications.

WoCE: a framework for clustering ensemble by exploiting the wisdom of Crowds theory

no code implementations 20 Dec 2016 Muhammad Yousefnezhad, Sheng-Jun Huang, Daoqiang Zhang

We employ the four conditions in the WOC theory, i.e., diversity, independence, decentralization, and aggregation, to guide both the construction of individual clustering results and their final combination for the clustering ensemble.

Clustering Clustering Ensemble

Fast Multi-Instance Multi-Label Learning

no code implementations 8 Oct 2013 Sheng-Jun Huang, Zhi-Hua Zhou

Although the MIML problem is complicated, MIMLfast is able to achieve excellent performance by exploiting label relations in a shared space and discovering sub-concepts for complicated labels.

Multi-Label Learning

Active Learning by Querying Informative and Representative Examples

no code implementations NeurIPS 2010 Sheng-Jun Huang, Rong Jin, Zhi-Hua Zhou

Most active learning approaches select either informative or representative unlabeled instances to query their labels.

Active Learning Informativeness
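
A crude way to combine the two criteria is sketched below: a simplified score that multiplies an uncertainty term by a density/representativeness term, for illustration only and not the min-max formulation proposed in the paper.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def informative_and_representative(probs, X_unlabeled, gamma=0.1):
    """probs: (n, k) class probabilities for the unlabeled pool; X_unlabeled: (n, d) features.
    Uncertainty = 1 - max probability; representativeness = average similarity to the pool."""
    uncertainty = 1.0 - probs.max(axis=1)
    density = rbf_kernel(X_unlabeled, gamma=gamma).mean(axis=1)
    return uncertainty * density   # query np.argmax (or the top-b) of this score
```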
