Concept Pointer Network for Abstractive Summarization

IJCNLP 2019  ·  Wang Wenbo, Gao Yang, Huang Heyan, Zhou Yuxiang

A quality abstractive summary should not only copy salient spans from the source text but should also generate new conceptual words to express concrete details. Inspired by the popular pointer-generator sequence-to-sequence model, this paper presents a concept pointer network that improves both aspects of abstractive summarization. The network leverages knowledge-based, context-aware conceptualizations to derive an extended set of candidate concepts, and the model then points to the most appropriate choice using both the concept set and the original source text. This joint approach generates abstractive summaries with higher-level semantic concepts. The model is also optimized to adapt to different data through a novel distantly-supervised training method guided by reference summaries and the test set. Overall, the proposed approach yields statistically significant improvements over several state-of-the-art models on both the DUC-2004 and Gigaword datasets, and a human evaluation of the model's abstractive ability further supports the quality of the summaries it produces.
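The decoding step described above can be viewed as a three-way mixture: generate a word from the fixed vocabulary, copy a token from the source text, or point into the expanded concept set. Below is a minimal sketch in plain Python; the function name, the scalar gates `p_gen` and `p_concept`, and the flat-list representation are illustrative assumptions, not the paper's actual implementation:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def concept_pointer_step(vocab_logits, src_attn, concept_attn,
                         p_gen, p_concept, src_ids, concept_ids):
    """One decoding step of a simplified concept-pointer mixture.

    The final distribution over the vocabulary combines:
      - generating from the fixed vocabulary (weight p_gen),
      - copying a source token (weight (1 - p_gen) * (1 - p_concept)),
      - pointing to an expanded concept (weight (1 - p_gen) * p_concept).
    Both attention lists are assumed to already sum to 1.
    """
    dist = [p_gen * p for p in softmax(vocab_logits)]
    copy_w = (1 - p_gen) * (1 - p_concept)
    concept_w = (1 - p_gen) * p_concept
    for i, a in zip(src_ids, src_attn):          # copy probability mass
        dist[i] += copy_w * a
    for i, a in zip(concept_ids, concept_attn):  # concept pointer mass
        dist[i] += concept_w * a
    return dist
```

Because the three mixture weights sum to one and each component is itself a distribution, the result is a valid probability distribution over the vocabulary.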



Results from the Paper


Task                Dataset   Model               Metric   Value  Global Rank
Text Summarization  GigaWord  Concept pointer+RL  ROUGE-1  38.02  #20
                                                  ROUGE-2  16.97  #34
                                                  ROUGE-L  35.43  #20
Text Summarization  GigaWord  Concept pointer+DS  ROUGE-1  37.01  #25
                                                  ROUGE-2  17.10  #33
                                                  ROUGE-L  34.87  #22
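The ROUGE-1, ROUGE-2, and ROUGE-L scores in the table measure unigram, bigram, and longest-common-subsequence overlap with the reference summary, respectively. As a rough illustration, ROUGE-1 F1 can be computed from clipped unigram counts; this is a simplified sketch, and the official ROUGE toolkit applies stemming and other options not shown here:

```python
from collections import Counter

def rouge_1_f1(candidate, reference):
    """ROUGE-1 F1: clipped unigram overlap between candidate and reference."""
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    # Clip each candidate count by its count in the reference.
    overlap = sum(min(c, ref[w]) for w, c in cand.items())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, a two-word candidate that matches two of a five-word reference's tokens has precision 1.0, recall 0.4, and F1 of 4/7.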
