
Abstractive Text Summarization

83 papers with code · Natural Language Processing
Subtask of Text Summarization

Abstractive Text Summarization is the task of generating a short, concise summary that captures the salient ideas of the source text. The generated summaries may contain new phrases and sentences that do not appear in the source text.

Source: Generative Adversarial Network for Abstractive Text Summarization


Greatest papers with code

PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization

ICML 2020 huggingface/transformers

Recent work pre-training Transformers with self-supervised objectives on large text corpora has shown great success when fine-tuned on downstream NLP tasks including text summarization.

ABSTRACTIVE TEXT SUMMARIZATION
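PEGASUS's pre-training objective masks whole "gap" sentences and asks the model to generate them, which resembles summarization itself. A minimal sketch of that objective, with one simplifying assumption: importance is scored here by word overlap with the rest of the document, as a stand-in for the ROUGE-based sentence selection used in the paper. All names below are illustrative, not from the PEGASUS codebase.

```python
import re

def gap_sentence_mask(text, ratio=0.3):
    """Simplified gap-sentence generation: mask the highest-scoring
    sentences and use them as the generation target.
    Scoring by word overlap is a stand-in for the paper's
    ROUGE-based selection of 'principal' sentences."""
    sents = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]

    def score(i):
        # Overlap between sentence i and the rest of the document.
        words = set(sents[i].lower().split())
        rest = {w for j, s in enumerate(sents) if j != i
                for w in s.lower().split()}
        return len(words & rest) / max(len(words), 1)

    n_mask = max(1, int(len(sents) * ratio))
    masked = set(sorted(range(len(sents)), key=score, reverse=True)[:n_mask])
    src = ' '.join('[MASK1]' if i in masked else s
                   for i, s in enumerate(sents))
    tgt = ' '.join(sents[i] for i in sorted(masked))
    return src, tgt

doc = ("Pegasus masks whole sentences. The masked sentences become the "
       "target. The rest of the document is the input. This mimics "
       "summarization.")
src, tgt = gap_sentence_mask(doc)
```

The resulting (source, target) pairs can then feed any encoder-decoder trainer; the real model adds token-level MLM and selects sentences with ROUGE against the remainder of the document.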

Pay Less Attention with Lightweight and Dynamic Convolutions

ICLR 2019 pytorch/fairseq

We predict separate convolution kernels based solely on the current time-step in order to determine the importance of context elements.

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION
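The dynamic convolution idea above, predicting a softmax-normalised kernel from the current timestep alone, can be sketched in a few lines of NumPy. This is a single-head, per-feature-shared simplification of fairseq's depthwise, multi-head implementation; the projection `W_k` and the causal left-padding are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def dynamic_conv(X, W_k, k=3):
    """Dynamic convolution (simplified): the k kernel weights for each
    timestep are predicted from that timestep's input alone, then
    softmax-normalised and applied over the causal context window."""
    T, d = X.shape
    pad = np.vstack([np.zeros((k - 1, d)), X])  # left-pad for causality
    out = np.zeros_like(X)
    for t in range(T):
        kernel = softmax(W_k @ X[t])   # (k,) weights from current input only
        window = pad[t:t + k]          # (k, d) current + past context
        out[t] = kernel @ window       # weighted sum of context elements
    return out

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))    # (time, features)
W_k = rng.standard_normal((3, 8))  # kernel-prediction projection
Y = dynamic_conv(X, W_k)
```

Because the kernel depends only on the current timestep rather than on all pairs of positions, the cost is linear in sequence length, which is the "pay less attention" trade-off the paper makes against self-attention.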

Classical Structured Prediction Losses for Sequence to Sequence Learning

NAACL 2018 pytorch/fairseq

There has been much recent work on training neural attention models at the sequence-level using either reinforcement learning-style methods or by optimizing the beam.

ABSTRACTIVE TEXT SUMMARIZATION MACHINE TRANSLATION STRUCTURED PREDICTION
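One of the classical sequence-level objectives the paper revisits is expected risk over a beam of candidates: each hypothesis's task cost (e.g. 1 minus ROUGE) is weighted by the model's probability renormalised over the beam. A minimal sketch, with illustrative inputs:

```python
import numpy as np

def expected_risk(log_probs, costs):
    """Sequence-level expected-risk loss (simplified): renormalise the
    model's candidate scores over the beam, then take the
    probability-weighted average of the per-candidate task costs."""
    p = np.exp(log_probs - log_probs.max())  # stable softmax over the beam
    p /= p.sum()
    return float((p * costs).sum())

log_probs = np.array([-0.1, -1.2, -2.3])  # model scores of beam candidates
costs = np.array([0.2, 0.5, 0.9])         # e.g. 1 - ROUGE per candidate
risk = expected_risk(log_probs, costs)
```

Minimising this pushes probability mass toward low-cost (high-ROUGE) candidates, optimising the evaluation metric directly instead of token-level likelihood.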

ERNIE-GEN: An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation

26 Jan 2020 PaddlePaddle/ERNIE

Current pre-training works in natural language generation pay little attention to the problem of exposure bias on downstream tasks.

Ranked #1 on Text Summarization on GigaWord-10k (using extra training data)

ABSTRACTIVE TEXT SUMMARIZATION DIALOGUE GENERATION GENERATIVE QUESTION ANSWERING QUESTION GENERATION

Get To The Point: Summarization with Pointer-Generator Networks

ACL 2017 abisee/pointer-generator

Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text).

ABSTRACTIVE TEXT SUMMARIZATION
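The pointer-generator's key move is mixing the decoder's vocabulary distribution with a copy distribution built from attention over source tokens, gated by a generation probability. A minimal NumPy sketch under that description; the function and variable names are illustrative, not taken from the abisee/pointer-generator code:

```python
import numpy as np

def pointer_generator_dist(p_vocab, attention, src_ids, p_gen):
    """Final output distribution of a pointer-generator step:
    p_gen * P_vocab plus (1 - p_gen) * attention mass copied onto the
    vocabulary ids of the source tokens (OOV handling omitted)."""
    final = p_gen * p_vocab.copy()
    for a, tok in zip(attention, src_ids):
        final[tok] += (1.0 - p_gen) * a  # scatter-add the copy probability
    return final

p_vocab = np.array([0.1, 0.2, 0.3, 0.4])  # decoder's softmax over vocab
attention = np.array([0.5, 0.5])          # attention over 2 source tokens
src_ids = [0, 3]                          # their vocabulary ids
dist = pointer_generator_dist(p_vocab, attention, src_ids, p_gen=0.8)
# dist still sums to 1, since both mixed distributions do
```

In the full model, source tokens absent from the vocabulary get temporary extended-vocabulary ids, which is how copying handles out-of-vocabulary words; a coverage term additionally penalises repeated attention to discourage repetition.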

Neural Abstractive Text Summarization with Sequence-to-Sequence Models

5 Dec 2018 shibing624/pycorrector

As part of this survey, we also develop an open source library, namely, the Neural Abstractive Text Summarizer (NATS) toolkit, for abstractive text summarization.

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION

UniLMv2: Pseudo-Masked Language Models for Unified Language Model Pre-Training

28 Feb 2020 microsoft/unilm

We propose to pre-train a unified language model for both autoencoding and partially autoregressive language modeling tasks using a novel training procedure, referred to as a pseudo-masked language model (PMLM).

Ranked #3 on Question Generation on SQuAD1.1 (using extra training data)

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING NATURAL LANGUAGE UNDERSTANDING QUESTION GENERATION

Abstractive Summarization of Spoken and Written Instructions with BERT

KDD Converse 2020 nlpyang/PreSumm

Summarization of speech is a difficult problem due to the spontaneity of the flow, disfluencies, and other issues that are not usually encountered in written texts.

ABSTRACTIVE TEXT SUMMARIZATION TRANSFER LEARNING