Multi-Label Text Classification

70 papers with code • 20 benchmarks • 13 datasets

According to Wikipedia: "In machine learning, multi-label classification and the strongly related problem of multi-output classification are variants of the classification problem where multiple labels may be assigned to each instance. Multi-label classification is a generalization of multiclass classification, which is the single-label problem of categorizing instances into precisely one of more than two classes; in the multi-label problem there is no constraint on how many of the classes the instance can be assigned to."
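The practical consequence of this definition is that targets become multi-hot vectors rather than single class indices. A minimal sketch (the label names and helper below are hypothetical, for illustration only):

```python
# Multi-label vs. multi-class: each document may carry any subset of the
# labels, so the target is a multi-hot vector, not one class index.
LABELS = ["sports", "politics", "tech"]  # hypothetical label set

def to_multi_hot(doc_labels, label_set=LABELS):
    """Encode a document's label subset as a multi-hot vector."""
    return [1 if lab in doc_labels else 0 for lab in label_set]

# A multi-class target picks exactly one class; a multi-label target
# may pick zero, one, or several classes at once.
single = to_multi_hot({"tech"})           # one label assigned
several = to_multi_hot({"sports", "tech"})  # two labels assigned
none = to_multi_hot(set())                # no label assigned
```

In a neural model this typically pairs with one sigmoid output per label and a binary cross-entropy loss, rather than a single softmax over classes.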

Libraries

Use these libraries to find Multi-Label Text Classification models and implementations

Most implemented papers

Towards Scalable and Reliable Capsule Networks for Challenging NLP Applications

andyweizhao/capsule_text_classification ACL 2019

Obstacles hindering the development of capsule networks for challenging NLP applications include poor scalability to large output spaces and less reliable routing processes.

Investigating Capsule Networks with Dynamic Routing for Text Classification

andyweizhao/capsule_text_classification EMNLP 2018

In this study, we explore capsule networks with dynamic routing for text classification.

ML-Net: multi-label classification of biomedical texts with deep neural networks

jingcheng-du/ML_Net-1 13 Nov 2018

Due to this nature, the multi-label text classification task is often considered to be more challenging compared to the binary or multi-class text classification problems.

AttentionXML: Label Tree-based Attention-Aware Deep Model for High-Performance Extreme Multi-Label Text Classification

yourh/AttentionXML NeurIPS 2019

We propose a new label tree-based deep learning model for XMTC, called AttentionXML, with two unique features: 1) a multi-label attention mechanism with raw text as input, which captures the part of the text most relevant to each label; and 2) a shallow and wide probabilistic label tree (PLT), which can handle millions of labels, especially "tail labels".

Correlation Networks for Extreme Multi-label Text Classification

XunGuangxu/CorNet Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 2020

This paper develops the Correlation Networks (CorNet) architecture for the extreme multi-label text classification (XMTC) task, where the objective is to tag an input text sequence with the most relevant subset of labels from an extremely large label set.
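The core of this approach is a small correlation module applied on top of the raw per-label predictions, so that confident labels can boost or suppress correlated ones. A rough sketch under assumed shapes and activations (the real architecture's layer sizes and nonlinearities may differ):

```python
import numpy as np

rng = np.random.default_rng(1)
L, k = 5, 2  # hypothetical: L labels, k-dimensional bottleneck

raw_logits = rng.normal(size=(L,))  # per-label scores from a base XMTC model
W1 = rng.normal(size=(k, L))        # learned down-projection
W2 = rng.normal(size=(L, k))        # learned up-projection

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Correlation block: squash the raw predictions, pass them through a
# small bottleneck that mixes information across labels, and add the
# result back to the raw logits (a residual correction).
corrected = raw_logits + W2 @ np.tanh(W1 @ sigmoid(raw_logits))
```

The bottleneck keeps the extra cost tiny even when the label set is extremely large, since the correction operates only on the L-dimensional prediction vector.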

MIMIC-III, a freely accessible critical care database

mit-lcp/mimic-iii-paper Scientific Data 2016

MIMIC-III (‘Medical Information Mart for Intensive Care’) is a large, single-center database comprising information relating to patients admitted to critical care units at a large tertiary care hospital.

Comprehensive Evaluation of Deep Learning Architectures for Prediction of DNA/RNA Sequence Binding Specificities

MedChaabane/deepRAM 29 Jan 2019

For this purpose, we present deepRAM, an end-to-end deep learning tool that provides an implementation of novel and previously proposed architectures; its fully automatic model selection procedure allows us to perform a fair and unbiased comparison of deep learning architectures.

Taming Pretrained Transformers for Extreme Multi-label Text Classification

OctoberChang/X-Transformer 7 May 2019

However, naively applying deep transformer models to the XMC problem leads to sub-optimal performance due to the large output space and the label sparsity issue.

MAGNET: Multi-Label Text Classification using Attention-based Graph Neural Network

akash18tripathi/MAGNET-Multi-Label-Text-Classi-cation-using-Attention-based-Graph-Neural-Network 12th International Conference on Agents and Artificial Intelligence (ICAART) 2020

The graph attention network uses a feature matrix and a correlation matrix to capture and explore the crucial dependencies between the labels and generate classifiers for the task.
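A simplified way to see how a correlation matrix can inform label representations is a single graph-propagation step: normalize the label co-occurrence matrix and mix each label's features with those of its correlated neighbors. This is a toy GCN-style step for intuition (MAGNET itself uses attention-weighted propagation, and the matrices below are made up):

```python
import numpy as np

# Hypothetical setup: L labels with d-dimensional feature vectors
# (e.g. label embeddings) and a co-occurrence matrix from training data.
L, d = 4, 3
X = np.eye(L, d)                    # toy label feature matrix (L, d)
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], float)  # toy label correlation matrix (L, L)

# One propagation step: row-normalize the correlation matrix so each
# label averages its own features with those of correlated labels.
D_inv = np.diag(1.0 / A.sum(axis=1))
H = D_inv @ A @ X                   # (L, d) correlation-aware label features
```

The resulting rows of `H` act as label-aware classifier weights, so labels that frequently co-occur end up with related representations.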

Multi-Label Text Classification using Attention-based Graph Neural Network

adrinta/MAGNET 22 Mar 2020

The graph attention network uses a feature matrix and a correlation matrix to capture and explore the crucial dependencies between the labels and generate classifiers for the task.