Paragraph-level Attention-Aware Inference for Multi-Document Neural Abstractive Summarization

15 Sep 2020 · Ye Ma, Lu Zong

Inspired by Google's Neural Machine Translation (NMT), which models the one-to-one alignment in translation tasks with a uniform attention distribution during inference, this study proposes an attention-aware inference algorithm for Neural Abstractive Summarization (NAS) that regulates generated summaries to attend to source contents with optimal coverage. Unlike NMT, NAS is not based on one-to-one transformation...
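The idea of regulating coverage during inference can be illustrated with a minimal sketch: aggregate each candidate summary's cross-attention over source paragraphs, score its deviation from a uniform distribution, and use that score to rerank beam candidates. This is an assumed, simplified scoring (KL divergence to uniform); the paper's exact objective and hyperparameters may differ, and `rerank`/`coverage_score` are hypothetical names.

```python
import numpy as np

def coverage_score(attn, eps=1e-12):
    """Score how evenly a candidate attends to source paragraphs.

    attn: (num_generated_tokens, num_source_paragraphs) attention weights.
    Returns the negative KL divergence between the aggregate attention
    distribution and a uniform distribution; 0 means perfectly even
    coverage, more negative means more skewed. (Illustrative scoring,
    not necessarily the paper's exact formulation.)
    """
    agg = attn.sum(axis=0)
    agg = agg / agg.sum()                       # aggregate attention distribution
    uniform = np.full_like(agg, 1.0 / len(agg))
    kl = np.sum(agg * np.log((agg + eps) / (uniform + eps)))
    return -kl

def rerank(candidates, alpha=0.5):
    """Pick the beam candidate balancing likelihood and coverage.

    candidates: list of (log_prob, attn_matrix) pairs; alpha weights
    the coverage term against the model's log-likelihood.
    """
    scores = [lp + alpha * coverage_score(a) for lp, a in candidates]
    return int(np.argmax(scores))
```

A candidate whose attention collapses onto a single paragraph is penalized relative to one that distributes attention across all source paragraphs, so the reranker prefers summaries with broader coverage at comparable likelihood.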

