Multi-Label Learning
81 papers with code • 1 benchmark • 7 datasets
Multi-label learning (MLL) generalizes binary and multi-class classification: it deals with tagging a data instance with several possible class labels simultaneously [1]. Each assigned label conveys a specific semantic relationship with the multi-label data instance [2, 3]. Multi-label learning continues to attract substantial research interest due to its practical application in many real-world problems, such as recommender systems [4], image annotation [5], and text classification [6].
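A common way to make the setting concrete is the binary relevance reduction: each instance carries a *set* of labels, encoded as a binary indicator vector, and the multi-label problem is decomposed into one independent binary task per label. The sketch below is illustrative only; the label names and helper functions are made up for the example and do not come from any specific library.

```python
# Minimal sketch of multi-label encoding and the binary relevance
# reduction. Labels and data are hypothetical examples.

def to_indicator(label_sets, classes):
    """Encode each instance's label set as a 0/1 indicator vector."""
    return [[1 if c in s else 0 for c in classes] for s in label_sets]

def binary_relevance_tasks(indicator, n_labels):
    """Split the indicator matrix into one binary target per label."""
    return [[row[j] for row in indicator] for j in range(n_labels)]

classes = ["beach", "sunset", "people"]          # label vocabulary
y = [{"beach", "sunset"}, {"people"}, {"beach", "sunset", "people"}]

Y = to_indicator(y, classes)
# Y == [[1, 1, 0], [0, 0, 1], [1, 1, 1]]
tasks = binary_relevance_tasks(Y, len(classes))
# tasks[0] is the binary target for "beach": [1, 0, 1]
```

Any off-the-shelf binary classifier can then be trained on each per-label task, at the cost of ignoring label correlations, which is exactly the limitation many of the papers below address.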
References:
[1] Kumar S, Rastogi R (2022) Low rank label subspace transformation for multi-label learning with missing labels. Inf Sci 596:53–72
[2] Zhang M-L, Zhou Z-H (2013) A review on multi-label learning algorithms. IEEE Trans Knowl Data Eng 26(8):1819–1837
[3] Gibaja E, Ventura S (2015) A tutorial on multilabel learning. ACM Comput Surveys (CSUR) 47(3):1–38
[4] Bogaert M, Lootens J, Van den Poel D, Ballings M (2019) Evaluating multi-label classifiers and recommender systems in the financial service sector. Eur J Oper Res 279(2):620–634
[5] Jing L, Shen C, Yang L, Yu J, Ng MK (2017) Multi-label classification by semi-supervised singular value decomposition. IEEE Trans Image Process 26(10):4612–4625
[6] Chen Z, Ren J (2021) Multi-label text classification with latent word-wise label information. Appl Intell 51(2):966–979
Latest papers
ProPML: Probability Partial Multi-label Learning
Partial Multi-label Learning (PML) is a type of weakly supervised learning where each training instance corresponds to a set of candidate labels, among which only some are true.
MIML library: a Modular and Flexible Library for Multi-instance Multi-label Learning
MIML library is a Java software tool to develop, test, and compare classification algorithms for multi-instance multi-label (MIML) learning.
Vision-Language Pseudo-Labels for Single-Positive Multi-Label Learning
In general multi-label learning, a model learns to predict multiple labels or categories for a single input image.
Neural Collapse in Multi-label Learning with Pick-all-label Loss
We study deep neural networks for the multi-label classification (MLab) task through the lens of neural collapse (NC).
Multi-Label Feature Selection Using Adaptive and Transformed Relevance
Multi-label learning has emerged as a crucial paradigm in data analysis, addressing scenarios where instances are associated with multiple class labels simultaneously.
Multi-Label Noise Transition Matrix Estimation with Label Correlations: Theory and Algorithm
However, estimating multi-label noise transition matrices remains a challenging task, as most existing estimators in noisy multi-class learning rely on anchor points and accurate fitting of noisy class posteriors, which is hard to satisfy in noisy multi-label learning.
Multi-Label Knowledge Distillation
Existing knowledge distillation methods typically work by imparting the knowledge of output logits or intermediate feature maps from the teacher network to the student network, which is very successful in multi-class single-label learning.
When Measures are Unreliable: Imperceptible Adversarial Perturbations toward Top-$k$ Multi-Label Learning
However, existing adversarial attacks toward multi-label learning only pursue the traditional visual imperceptibility but ignore the new perceptible problem coming from measures such as Precision@$k$ and mAP@$k$.
Semantic-Aware Dual Contrastive Learning for Multi-label Image Classification
Specifically, we leverage semantic-aware representation learning to extract category-related local discriminative features and construct category prototypes.
Minimal Learning Machine for Multi-Label Learning
The minimal learning machine, a distance-based supervised method, constructs a predictive model from data by learning a mapping between input and output distance matrices.