Search Results for author: Yu-Feng Li

Found 17 papers, 4 papers with code

Investigating the Limitation of CLIP Models: The Worst-Performing Categories

no code implementations 5 Oct 2023 Jie-Jing Shao, Jiang-Xin Shi, Xiao-Wen Yang, Lan-Zhe Guo, Yu-Feng Li

Contrastive Language-Image Pre-training (CLIP) provides a foundation model by integrating natural language into visual concepts, enabling zero-shot recognition on downstream tasks.
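The zero-shot recognition the snippet describes reduces to nearest-neighbor search in a shared embedding space: the image is assigned the class whose text-prompt embedding has the highest cosine similarity. A minimal sketch of that idea, using random stand-in vectors rather than real CLIP embeddings (`zero_shot_classify` is a hypothetical helper, not part of any CLIP release):

```python
import numpy as np

def zero_shot_classify(image_emb, text_embs):
    """Pick the class whose text embedding is most similar to the image.

    image_emb: (d,) embedding of one image
    text_embs: (C, d) embeddings of C class-name prompts
    Returns the index of the best-matching class.
    """
    # L2-normalize so the dot product equals cosine similarity
    img = image_emb / np.linalg.norm(image_emb)
    txt = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    sims = txt @ img  # (C,) cosine similarities
    return int(np.argmax(sims))

# Toy stand-in embeddings: the image is constructed to lie near class 1's prompt.
rng = np.random.default_rng(0)
texts = rng.normal(size=(3, 8))
image = texts[1] + 0.05 * rng.normal(size=8)
print(zero_shot_classify(image, texts))  # → 1
```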

Prompt Engineering Zero-Shot Learning

Parameter-Efficient Long-Tailed Recognition

1 code implementation 18 Sep 2023 Jiang-Xin Shi, Tong Wei, Zhi Zhou, Xin-Yan Han, Jie-Jing Shao, Yu-Feng Li

In this paper, we propose PEL, a fine-tuning method that can effectively adapt pre-trained models to long-tailed recognition tasks in fewer than 20 epochs without the need for extra data.

Ranked #1 on Long-tail Learning on CIFAR-100-LT (ρ=10) (using extra training data)

Fine-Grained Image Classification Long-tail learning with class descriptors

A Survey on Extreme Multi-label Learning

4 code implementations 8 Oct 2022 Tong Wei, Zhen Mao, Jiang-Xin Shi, Yu-Feng Li, Min-Ling Zhang

Multi-label learning has attracted significant attention from both academia and industry in recent decades.

Multi-Label Learning

LAMDA-SSL: Semi-Supervised Learning in Python

1 code implementation 9 Aug 2022 Lin-Han Jia, Lan-Zhe Guo, Zhi Zhou, Yu-Feng Li

The second part demonstrates the usage of LAMDA-SSL in detail through abundant examples.

Robust Deep Semi-Supervised Learning: A Brief Introduction

no code implementations 12 Feb 2022 Lan-Zhe Guo, Zhi Zhou, Yu-Feng Li

Semi-supervised learning (SSL) is the branch of machine learning that aims to improve learning performance by leveraging unlabeled data when labels are insufficient.

STEP: Out-of-Distribution Detection in the Presence of Limited In-Distribution Labeled Data

no code implementations NeurIPS 2021 Zhi Zhou, Lan-Zhe Guo, Zhanzhan Cheng, Yu-Feng Li, ShiLiang Pu

However, in many real-world applications, it is desirable to have SSL algorithms that not only classify the samples drawn from the same distribution of labeled data but also detect out-of-distribution (OOD) samples drawn from an unknown distribution.

Out-of-Distribution Detection

Prototypical Classifier for Robust Class-Imbalanced Learning

no code implementations 22 Oct 2021 Tong Wei, Jiang-Xin Shi, Yu-Feng Li, Min-Ling Zhang

Deep neural networks have been shown to be very powerful methods for many supervised learning tasks.

Learning with noisy labels

Dash: Semi-Supervised Learning with Dynamic Thresholding

no code implementations 1 Sep 2021 Yi Xu, Lei Shang, Jinxing Ye, Qi Qian, Yu-Feng Li, Baigui Sun, Hao Li, Rong Jin

In this work, we develop a simple yet powerful framework whose key idea is to select a subset of the unlabeled data when running existing SSL methods, so that only unlabeled examples whose pseudo labels are related to the labeled data are used to train models.
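The subset selection above is driven by a threshold on the pseudo-label loss that tightens as training proceeds (the "dynamic thresholding" in the title). A minimal sketch of that selection rule, assuming illustrative values for the initial threshold `rho0` and decay rate `gamma` (the paper's own schedule and constants may differ):

```python
import numpy as np

def dash_select(losses, step, rho0=1.0, gamma=1.27):
    """Keep unlabeled examples whose pseudo-label loss falls under a
    threshold that shrinks geometrically with the training step.

    losses: per-example loss of the pseudo-labels (e.g. cross-entropy)
    step:   current training step (1-based)
    rho0, gamma: hypothetical initial threshold and decay rate
    Returns a boolean mask over the unlabeled examples.
    """
    rho_t = rho0 * gamma ** (1 - step)  # threshold decays each step
    return losses < rho_t               # keep only confident examples

losses = np.array([0.05, 0.4, 0.9, 1.5])
print(dash_select(losses, step=1))  # early on, most examples pass
print(dash_select(losses, step=5))  # later, only the most confident survive
```

As the threshold tightens, examples whose pseudo labels disagree with the labeled-data distribution are progressively excluded from training.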

Semi-Supervised Image Classification

Robust Long-Tailed Learning under Label Noise

no code implementations 26 Aug 2021 Tong Wei, Jiang-Xin Shi, Wei-Wei Tu, Yu-Feng Li

To overcome this limitation, we establish a new prototypical noise detection method by designing a distance-based metric that is resistant to label noise.
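One simple instance of the distance-based idea in the snippet: build a prototype (mean feature) per class and flag a sample as noisy when its feature lies closer to another class's prototype than to the prototype of its given label. This is a minimal sketch under that assumption, not the paper's actual detection rule:

```python
import numpy as np

def flag_noisy(features, given_labels, num_classes):
    """Flag samples whose nearest class prototype disagrees with
    their given (possibly noisy) label.

    features: (n, d) feature array; given_labels: (n,) labels.
    Returns a boolean mask, True = suspected label noise.
    """
    # Class prototypes: mean feature of samples sharing a label
    protos = np.stack([features[given_labels == c].mean(axis=0)
                       for c in range(num_classes)])
    # Distance of every sample to every prototype
    dists = np.linalg.norm(features[:, None, :] - protos[None], axis=2)
    nearest = dists.argmin(axis=1)
    return nearest != given_labels

# Toy 2-D example: two clusters, with the last point mislabeled.
feats = np.array([[0., 0.], [0.1, 0.], [5., 5.], [5.1, 5.], [0., 0.1]])
labels = np.array([0, 0, 1, 1, 1])   # last label is wrong
print(flag_noisy(feats, labels, 2))  # only the last sample is flagged
```

Because prototypes average many samples, a minority of mislabeled points shifts them only slightly, which is what makes a prototype-distance metric comparatively resistant to label noise.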

Image Classification

NGC: A Unified Framework for Learning with Open-World Noisy Data

no code implementations ICCV 2021 Zhi-Fan Wu, Tong Wei, Jianwen Jiang, Chaojie Mao, Mingqian Tang, Yu-Feng Li

The existence of noisy data is prevalent in both the training and testing phases of machine learning systems, which inevitably leads to the degradation of model performance.

Image Classification

Improving Tail Label Prediction for Extreme Multi-label Learning

no code implementations 1 Jan 2021 Tong Wei, Wei-Wei Tu, Yu-Feng Li

Extreme multi-label learning (XML) works to annotate objects with relevant labels from an extremely large label set.

Multi-Label Learning

Weakly Supervised Learning Meets Ride-Sharing User Experience Enhancement

no code implementations 19 Jan 2020 Lan-Zhe Guo, Feng Kuang, Zhang-Xun Liu, Yu-Feng Li, Nan Ma, Xiao-Hu Qie

For example, in user experience enhancement from Didi, one of the largest online ride-sharing platforms, the ride comment data contains severe label noise (due to the subjective factors of passengers) and severe label distribution bias (due to the sampling bias).

Weakly-supervised Learning

Reliable Weakly Supervised Learning: Maximize Gain and Maintain Safeness

no code implementations 22 Apr 2019 Lan-Zhe Guo, Yu-Feng Li, Ming Li, Jin-Feng Yi, Bo-Wen Zhou, Zhi-Hua Zhou

We guide the optimization of label quality with a small amount of validation data, ensuring the safeness of performance while maximizing the performance gain.

Weakly-supervised Learning

Convex and Scalable Weakly Labeled SVMs

no code implementations 6 Mar 2013 Yu-Feng Li, Ivor W. Tsang, James T. Kwok, Zhi-Hua Zhou

In this paper, we study the problem of learning from weakly labeled data, where labels of the training examples are incomplete.

Clustering Information Retrieval +1

Nyström Method vs Random Fourier Features: A Theoretical and Empirical Comparison

no code implementations NeurIPS 2012 Tianbao Yang, Yu-Feng Li, Mehrdad Mahdavi, Rong Jin, Zhi-Hua Zhou

Both random Fourier features and the Nyström method have been successfully applied to efficient kernel learning.
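For context on one of the two methods being compared: random Fourier features map inputs to a finite-dimensional space where plain dot products approximate a Gaussian kernel. A standard textbook construction of this mapping (dimension `D` and bandwidth `sigma` are illustrative choices, not values from the paper):

```python
import numpy as np

def rff_features(X, D=2000, sigma=1.0, seed=0):
    """Map X to D random Fourier features so that z(x) @ z(y)
    approximates the Gaussian kernel exp(-||x-y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))  # random frequencies
    b = rng.uniform(0, 2 * np.pi, size=D)           # random phases
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

x = np.array([[0.3, -0.2]])
y = np.array([[0.1, 0.4]])
exact = np.exp(-np.sum((x - y) ** 2) / 2)        # true Gaussian kernel value
approx = float(rff_features(x) @ rff_features(y).T)
print(round(exact, 3), round(approx, 3))          # close for large D
```

The Nyström method instead approximates the kernel matrix from a sampled subset of columns; the paper's comparison concerns when each approximation is preferable.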
