Multi-Task Attentive Residual Networks for Argument Mining

24 Feb 2021 · Andrea Galassi, Marco Lippi, Paolo Torroni

We explore the use of residual networks and neural attention for multiple argument mining tasks. We propose a residual architecture that exploits attention, multi-task learning, and ensembling, without any assumption on document or argument structure. We present an extensive experimental evaluation on five different corpora of user-generated comments, scientific publications, and persuasive essays. Our results show that our approach is a strong competitor against state-of-the-art architectures that have a higher computational footprint or corpus-specific design, representing an interesting compromise between generality, predictive performance, and reduced model size.
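To illustrate the core building block the abstract refers to, the following is a minimal numpy sketch of a residual self-attention layer: attention is computed over the input representations and its output is added back to the input through a skip connection. This is an illustrative simplification under assumed shapes and single-head attention, not the authors' ResAttArg architecture (function and weight names here are hypothetical).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def residual_self_attention(X, W_q, W_k, W_v):
    """One residual self-attention block (illustrative sketch).

    X: (n_tokens, d) input representations.
    W_q, W_k, W_v: (d, d) projection matrices (hypothetical names).
    The attention output is added back to X -- the residual connection.
    """
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    # Scaled dot-product attention weights over the token sequence.
    weights = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return X + weights @ V  # skip connection: output keeps X's shape

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                      # 5 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) * 0.1 for _ in range(3))
Y = residual_self_attention(X, Wq, Wk, Wv)
print(Y.shape)  # (5, 8): residual blocks preserve the input shape
```

Because the block preserves the input shape, several such layers can be stacked, and the skip connections keep gradients flowing through deep stacks, which is the usual motivation for residual designs.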


Results from the Paper


Task                      Dataset             Model      Metric    Value   Rank
Relation Classification   AbstRCT - Neoplasm  ResAttArg  Macro F1  70.92   #1
Link Prediction           AbstRCT - Neoplasm  ResAttArg  F1        54.43   #1
Link Prediction           CDCP                ResAttArg  F1        29.73   #1
Relation Classification   CDCP                ResAttArg  Macro F1  42.95   #1
Component Classification  CDCP                ResAttArg  Macro F1  78.71   #1
Link Prediction           DRI Corpus          ResAttArg  F1        43.66   #1
Relation Classification   DRI Corpus          ResAttArg  Macro F1  37.72   #1
