Data-to-Text Generation with Content Selection and Planning

3 Sep 2018 · Ratish Puduppully, Li Dong, Mirella Lapata

Recent advances in data-to-text generation have led to the use of large-scale datasets and neural network models which are trained end-to-end, without explicitly modeling what to say and in what order. In this work, we present a neural network architecture which incorporates content selection and planning without sacrificing end-to-end training. We decompose the generation task into two stages. Given a corpus of data records (paired with descriptive documents), we first generate a content plan highlighting which information should be mentioned and in which order, and then generate the document while taking the content plan into account. Automatic and human evaluation experiments show that our model outperforms strong baselines, improving the state of the art on the recently released RotoWire dataset.
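
The two-stage decomposition described in the abstract can be illustrated with a small sketch. The following is a minimal PyTorch version, not the authors' released model: stage one scores the input records with a pointer-style decoder to produce an ordered content plan, and stage two decodes text conditioned on the planned records. The class names, dimensions, greedy pointer decoding, and the omission of the conditional copy mechanism and training loop are all simplifying assumptions.

```python
# Minimal sketch of a two-stage content-plan-then-generate pipeline.
# Illustrative only: record features, dimensions, and greedy decoding
# are assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn


class ContentPlanner(nn.Module):
    """Stage 1: score records and emit an ordered content plan (pointer-style)."""

    def __init__(self, record_dim: int, hidden: int = 128):
        super().__init__()
        self.encoder = nn.Linear(record_dim, hidden)
        self.rnn = nn.GRUCell(hidden, hidden)
        self.pointer = nn.Linear(hidden, hidden)

    def forward(self, records: torch.Tensor, plan_len: int) -> list:
        # records: (num_records, record_dim)
        enc = torch.tanh(self.encoder(records))        # (num_records, hidden)
        state = enc.mean(dim=0)                        # initial decoder state
        plan = []
        for _ in range(plan_len):
            scores = enc @ self.pointer(state)         # attention over records
            idx = int(scores.argmax())                 # greedy pointer choice
            plan.append(idx)
            state = self.rnn(enc[idx].unsqueeze(0), state.unsqueeze(0)).squeeze(0)
        return plan


class TextGenerator(nn.Module):
    """Stage 2: decode token logits conditioned on the planned records."""

    def __init__(self, record_dim: int, vocab: int, hidden: int = 128):
        super().__init__()
        self.plan_proj = nn.Linear(record_dim, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, planned_records: torch.Tensor) -> torch.Tensor:
        # planned_records: (plan_len, record_dim), in content-plan order
        plan_enc = torch.tanh(self.plan_proj(planned_records)).unsqueeze(0)
        outputs, _ = self.rnn(plan_enc)                # re-encode the plan
        # vocabulary logits per plan step (copy mechanism omitted for brevity)
        return self.out(outputs)


if __name__ == "__main__":
    records = torch.randn(50, 32)                      # 50 records, 32-dim features
    planner = ContentPlanner(record_dim=32)
    plan = planner(records, plan_len=8)
    generator = TextGenerator(record_dim=32, vocab=1000)
    logits = generator(records[plan])
    print(plan, logits.shape)                          # plan indices, (1, 8, 1000)
```

In the paper the two stages are trained jointly rather than run as separate greedy modules, which is what the abstract means by incorporating planning "without sacrificing end-to-end training"; the sketch above only shows the inference-time flow of records through a plan into the generator.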


Datasets

RotoWire
| Task | Dataset | Model | Metric Name | Metric Value | Global Rank |
|---|---|---|---|---|---|
| Data-to-Text Generation | RotoWire | Neural Content Planning + conditional copy | BLEU | 16.50 | #4 |
| Data-to-Text Generation | RotoWire (Content Ordering) | Neural Content Planning + conditional copy | DLD | 18.58% | #2 |
| Data-to-Text Generation | RotoWire (Content Ordering) | Neural Content Planning + conditional copy | BLEU | 16.50 | #2 |
| Data-to-Text Generation | RotoWire (Content Selection) | Neural Content Planning + conditional copy | Precision | 34.18% | #3 |
| Data-to-Text Generation | RotoWire (Content Selection) | Neural Content Planning + conditional copy | Recall | 51.22% | #3 |
| Data-to-Text Generation | RotoWire (Relation Generation) | Neural Content Planning + conditional copy | Count | 34.28 | #3 |
| Data-to-Text Generation | RotoWire (Relation Generation) | Neural Content Planning + conditional copy | Precision | 87.47% | #5 |
