A Hierarchical Model for Data-to-Text Generation

20 Dec 2019  ·  Clément Rebuffel, Laure Soulier, Geoffrey Scoutheeten, Patrick Gallinari ·

Transcribing structured data into natural language descriptions has emerged as a challenging task, referred to as "data-to-text". These structures generally group multiple elements together with their attributes. Most approaches rely on encoder-decoder methods from machine translation, which linearize the elements into a sequence; this, however, discards most of the structure contained in the data. In this work, we propose to overcome this limitation with a hierarchical model that encodes the data at both the element level and the structure level. Evaluations on RotoWire show the effectiveness of our model with respect to qualitative and quantitative metrics.
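The two-level encoding described above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's implementation: mean-pooling stands in for the hierarchical Transformer's self-attention at each level, and the function names are invented for illustration. The point is only the hierarchy itself: record embeddings are first fused per element (e.g. per player), and the resulting element vectors are then fused into a structure-level representation for the decoder.

```python
# Minimal sketch of two-level hierarchical encoding (assumption:
# mean-pooling replaces the Transformer self-attention used in the paper).

def mean(vectors):
    """Element-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def encode_element(records):
    """Low level: fuse the (key, value) record embeddings of one
    element (e.g. one player's statistics) into a single vector."""
    return mean(records)

def encode_structure(elements):
    """High level: fuse element representations into one
    structure-level vector, keeping the per-element vectors
    available for attention/copy at decoding time."""
    element_vecs = [encode_element(recs) for recs in elements]
    return element_vecs, mean(element_vecs)

# Toy data: two elements, each holding record embeddings of dimension 2.
data = [
    [[1.0, 2.0], [3.0, 4.0]],   # element 1 (e.g. Player A's records)
    [[5.0, 6.0]],               # element 2 (e.g. Player B's records)
]
element_vecs, structure_vec = encode_structure(data)
print(element_vecs)   # [[2.0, 3.0], [5.0, 6.0]]
print(structure_vec)  # [3.5, 4.5]
```

In the actual model, each pooling step would be a Transformer encoder, and the decoder attends hierarchically over both levels rather than receiving a single pooled vector.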


Datasets

RotoWire
Task: Data-to-Text Generation — Model: Hierarchical Transformer Encoder + conditional copy

RotoWire:                       BLEU 17.50 (rank #2)
RotoWire (Content Ordering):    DLD 18.90% (#1), BLEU 17.50 (#1)
RotoWire (Content Selection):   Precision 39.47% (#1), Recall 51.64% (#2)
RotoWire (Relation Generation): count 21.17 (#6), Precision 89.46% (#4)
