MAGNET: Multi-Label Text Classification using Attention-based Graph Neural Network

12th International Conference on Agents and Artificial Intelligence (ICAART 2020)  ·  Ankit Pal, Muru Selvakumar and Malaikannan Sankarasubbu

In Multi-Label Text Classification (MLTC), one sample can belong to more than one class. In most MLTC tasks there are dependencies or correlations among labels, yet existing methods tend to ignore these relationships. In this paper, a graph attention network-based model is proposed to capture the attentive dependency structure among the labels. The graph attention network uses a feature matrix and a correlation matrix to capture and explore the crucial dependencies between the labels and to generate classifiers for the task. The generated classifiers are applied to sentence feature vectors obtained from the text feature extraction network (BiLSTM), enabling end-to-end training. Attention allows the system to assign different weights to neighboring nodes per label, letting it learn the dependencies among labels implicitly. The proposed model is validated on five real-world MLTC datasets, where it achieves similar or better performance compared to previous state-of-the-art models.
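A minimal sketch of the pipeline described above, assuming toy dimensions and a single attention head (the paper's actual model uses a BiLSTM text encoder and pretrained label embeddings; the names, sizes, and random inputs here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention_layer(H, A, W, a):
    """One attention head: re-embed label features H (L x d) into
    per-label classifiers, attending only over neighbors indicated
    by the label correlation matrix A (L x L, nonzero = correlated)."""
    Wh = H @ W                                   # projected label features (L, d)
    d = Wh.shape[1]
    # pairwise attention logits e_ij = LeakyReLU(a^T [Wh_i || Wh_j])
    src = Wh @ a[:d]                             # (L,)
    dst = Wh @ a[d:]                             # (L,)
    e = leaky_relu(src[:, None] + dst[None, :])  # (L, L)
    e = np.where(A > 0, e, -1e9)                 # mask non-neighboring labels
    att = softmax(e, axis=1)                     # per-label attention weights
    return att @ Wh                              # attended label classifiers

# toy setup: 4 labels, feature size 8 (both hypothetical)
n_labels, d = 4, 8
H = rng.normal(size=(n_labels, d))   # label feature matrix (e.g. embeddings of label names)
A = np.ones((n_labels, n_labels))    # correlation matrix (fully connected here)
W = rng.normal(size=(d, d))
a = rng.normal(size=(2 * d,))

classifiers = graph_attention_layer(H, A, W, a)  # (4, 8)

s = rng.normal(size=(d,))            # sentence feature vector (stand-in for BiLSTM output)
logits = classifiers @ s             # one score per label
probs = 1 / (1 + np.exp(-logits))    # independent sigmoid per label (multi-label setting)
```

The key design point is that the attention weights are computed per label pair, so each label's classifier is a learned mixture of its correlated neighbors' features rather than a fixed average.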


Results from the Paper


Task                             Dataset         Model   Metric    Value  Rank
Document Classification          AAPD            MAGNET  F1        69.6   # 2
Multi-Label Text Classification  AAPD            MAGNET  F1        69.6   # 1
Text Classification              RCV1            MAGNET  Micro-F1  88.5   # 1
Multi-Label Text Classification  RCV1-v2         MAGNET  Micro-F1  88.5   # 1
Document Classification          Reuters-21578   MAGNET  F1        89.9   # 1
Multi-Label Text Classification  Reuters-21578   MAGNET  Micro-F1  89.9   # 1
Multi-Label Text Classification  Slashdot        MAGNET  Micro-F1  56.8   # 1
