Multi-Label Learning
82 papers with code • 1 benchmark • 8 datasets
Multi-label learning (MLL) generalizes binary and multi-class classification: it deals with tagging a data instance with several class labels simultaneously [1]. Each assigned label conveys a specific semantic relationship to the multi-label instance [2, 3]. Multi-label learning continues to attract substantial research interest because of its practical applications in many real-world problems, such as recommender systems [4], image annotation [5], and text classification [6].
References:
1. Kumar S, Rastogi R (2022) Low rank label subspace transformation for multi-label learning with missing labels. Inf Sci 596:53–72
2. Zhang M-L, Zhou Z-H (2013) A review on multi-label learning algorithms. IEEE Trans Knowl Data Eng 26(8):1819–1837
3. Gibaja E, Ventura S (2015) A tutorial on multilabel learning. ACM Comput Surv 47(3):1–38
4. Bogaert M, Lootens J, Van den Poel D, Ballings M (2019) Evaluating multi-label classifiers and recommender systems in the financial service sector. Eur J Oper Res 279(2):620–634
5. Jing L, Shen C, Yang L, Yu J, Ng MK (2017) Multi-label classification by semi-supervised singular value decomposition. IEEE Trans Image Process 26(10):4612–4625
6. Chen Z, Ren J (2021) Multi-label text classification with latent word-wise label information. Appl Intell 51(2):966–979
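To make the problem setting above concrete, here is a minimal sketch of binary relevance, the simplest multi-label reduction: train one independent binary classifier per label, then predict the set of labels whose classifiers fire. The tiny nearest-centroid "classifier", the toy data, and the label names (`urban`, `water`) are illustrative stand-ins, not part of any cited method.

```python
# Binary relevance sketch: one independent binary decision per label.
# The nearest-centroid rule below is a stand-in for any binary learner.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def fit_binary_relevance(X, Y, labels):
    """X[i] is a feature vector; Y[i] is the set of labels for X[i]."""
    models = {}
    for label in labels:
        pos = [x for x, y in zip(X, Y) if label in y]
        neg = [x for x, y in zip(X, Y) if label not in y]
        # Store a positive and a negative centroid per label.
        models[label] = (centroid(pos), centroid(neg))
    return models

def predict(models, x):
    # A label is assigned iff x is closer to its positive centroid.
    return {l for l, (cp, cn) in models.items() if dist2(x, cp) < dist2(x, cn)}

# Toy data: 2-D instances tagged with subsets of {"urban", "water"};
# note one instance legitimately carries both labels at once.
X = [[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1], [0.0, 1.0]]
Y = [{"urban"}, {"urban"}, {"water"}, {"water"}, {"urban", "water"}]
models = fit_binary_relevance(X, Y, {"urban", "water"})
print(predict(models, [0.05, 0.1]))  # → {'urban'}
```

Binary relevance ignores correlations between labels; much of the work listed on this page (label-correlation, label-distribution, and partial-label methods) exists precisely to move beyond this independence assumption.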
Latest papers with no code
Exploiting Multi-Label Correlation in Label Distribution Learning
Label Distribution Learning (LDL) is a novel machine learning paradigm that assigns a label distribution to each instance.
Pseudo Labels for Single Positive Multi-Label Learning
Then, we treat the teacher model's predictions on the training data as ground-truth labels to train a student network on fully-labeled images.
Hierarchical Multi-Instance Multi-Label Learning for Detecting Propaganda Techniques
Since the introduction of the SemEval 2020 Task 11 (Martino et al., 2020a), several approaches have been proposed in the literature for classifying propaganda based on the rhetorical techniques used to influence readers.
A Decentralized Spike-based Learning Framework for Sequential Capture in Discrete Perimeter Defense Problem
The output of MLC-SEFRON contains the labels of segments that a defender has to visit in order to protect the perimeter.
Understanding Label Bias in Single Positive Multi-Label Learning
Annotating data for multi-label classification is prohibitively expensive because every category of interest must be confirmed to be present or absent.
Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning
In this trend, similarity functions and Estimated Mutual Information (EMI) are used as learning and objective functions.
Deep Partial Multi-Label Learning with Graph Disambiguation
In partial multi-label learning (PML), each data example is equipped with a candidate label set, which consists of multiple ground-truth labels and other false-positive labels.
Graph based Label Enhancement for Multi-instance Multi-label learning
Multi-instance multi-label (MIML) learning is widely applied in numerous domains, such as image classification, where one image contains multiple instances associated with multiple logical labels simultaneously.
Learning Reliable Representations for Incomplete Multi-View Partial Multi-Label Classification
The application of multi-view contrastive learning has further facilitated this process; however, existing multi-view contrastive learning methods crudely separate so-called negative pairs, which often drives apart samples belonging to the same or similar categories.
Pushing One Pair of Labels Apart Each Time in Multi-Label Learning: From Single Positive to Full Labels
In Multi-Label Learning (MLL), it is extremely challenging to accurately annotate every appearing object due to expensive costs and limited knowledge.