Extract, Select and Rewrite: A New Modular Summarization Method

ACL ARR January 2022  ·  Anonymous ·

Prior work on supervised summarization is mainly based on end-to-end models, which suffer from low modularity, unfaithfulness, and low interpretability. To address this, we propose a new three-phase modular abstractive sentence summarization method. We split the summarization problem explicitly into three stages: knowledge extraction, content selection, and rewriting. We use multiple knowledge extractors to obtain relation triples from the text, learn a fine-tuned classifier to select the content to be included in the summary, and use a fine-tuned BART rewriter to turn the selected triples into a natural language summary. Our model shows good modularity, as the modules can be trained separately and on different datasets. Automatic and human evaluations demonstrate that our method is competitive with state-of-the-art methods and more faithful than end-to-end baseline models.
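The three-stage pipeline above can be sketched as a simple composition of swappable modules. This is only an illustrative sketch, not the authors' implementation: the `ModularSummarizer` class, the toy extractor, selector, and rewriter below are all hypothetical stand-ins (in the paper, the extractors produce relation triples from the text, the selector is a fine-tuned classifier, and the rewriter is a fine-tuned BART model).

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

# A relation triple: (subject, relation, object).
Triple = Tuple[str, str, str]


@dataclass
class ModularSummarizer:
    """Hypothetical wrapper illustrating the three-stage design.

    Each stage is an independent, swappable callable, which is what
    allows the modules to be trained separately and on different data.
    """
    extractors: List[Callable[[str], List[Triple]]]   # stage 1: knowledge extraction
    selector: Callable[[List[Triple]], List[Triple]]  # stage 2: content selection
    rewriter: Callable[[List[Triple]], str]           # stage 3: rewriting

    def summarize(self, text: str) -> str:
        # Stage 1: pool relation triples from multiple extractors.
        triples = [t for extract in self.extractors for t in extract(text)]
        # Stage 2: keep only the triples selected for the summary.
        selected = self.selector(triples)
        # Stage 3: rewrite the selected triples into natural language.
        return self.rewriter(selected)


if __name__ == "__main__":
    # Toy stand-ins for the learned components, for demonstration only.
    toy_extractor = lambda text: [("cat", "sat on", "mat"), ("dog", "barked at", "cat")]
    toy_selector = lambda triples: triples[:1]          # keep the first triple
    toy_rewriter = lambda triples: " ".join(triples[0]) # flatten a triple to text

    model = ModularSummarizer([toy_extractor], toy_selector, toy_rewriter)
    print(model.summarize("The cat sat on the mat while the dog barked at it."))
```

Because each stage only depends on the interface (text in, triples out, and so on), any single module can be replaced or retrained without touching the others.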
