Text Summarization

90 papers with code · Natural Language Processing

Greatest papers with code

A Neural Attention Model for Abstractive Sentence Summarization

EMNLP 2015 tensorflow/models

Summarization based on text extraction is inherently limited, but generation-style abstractive methods have proven challenging to build.

SENTENCE SUMMARIZATION

PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization

ICML 2020 huggingface/transformers

Recent work pre-training Transformers with self-supervised objectives on large text corpora has shown great success when fine-tuned on downstream NLP tasks including text summarization.

ABSTRACTIVE TEXT SUMMARIZATION
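PEGASUS's self-supervised objective masks whole "gap" sentences and trains the model to generate them, mimicking summarization. A toy sketch of that idea (the real paper scores sentence importance with ROUGE against the rest of the document; the length heuristic below is a hypothetical stand-in):

```python
def gap_sentence_mask(sentences, ratio=0.3):
    """Pick a fraction of sentences as generation targets and hide them in
    the input with [MASK1], PEGASUS-style. Sentence 'importance' here is a
    toy length heuristic, not the paper's ROUGE-based principal selection."""
    n_mask = max(1, int(len(sentences) * ratio))
    ranked = sorted(range(len(sentences)), key=lambda i: -len(sentences[i]))
    masked = set(ranked[:n_mask])
    source = ["[MASK1]" if i in masked else s for i, s in enumerate(sentences)]
    target = [s for i, s in enumerate(sentences) if i in masked]
    return " ".join(source), " ".join(target)
```

Pre-training then reduces to a standard sequence-to-sequence task from `source` to `target`, which is why fine-tuning on real summarization transfers so directly.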

Levenshtein Transformer

NeurIPS 2019 pytorch/fairseq

We further confirm the flexibility of our model by showing that a Levenshtein Transformer trained on machine translation can be used straightforwardly for automatic post-editing.

AUTOMATIC POST-EDITING MACHINE TRANSLATION TEXT SUMMARIZATION
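The Levenshtein Transformer generates text through learned insertion and deletion actions rather than left-to-right decoding. A minimal sketch of the underlying edit calculus — a dynamic program recovering a shortest insert/delete script between token sequences (the model predicts such edits with neural policies; this is only the classical algorithm the architecture is named after):

```python
def edit_ops(src, tgt):
    """Shortest insertion/deletion script turning src tokens into tgt tokens.
    Substitution is modeled as delete + insert, mirroring the two edit
    actions the Levenshtein Transformer learns."""
    m, n = len(src), len(tgt)
    # dp[i][j] = min edits to turn src[:i] into tgt[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if src[i - 1] == tgt[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]
            else:
                dp[i][j] = 1 + min(dp[i - 1][j], dp[i][j - 1])
    # Backtrace to recover the operation sequence.
    ops, i, j = [], m, n
    while i > 0 or j > 0:
        if i > 0 and j > 0 and src[i - 1] == tgt[j - 1] and dp[i][j] == dp[i - 1][j - 1]:
            ops.append(("keep", src[i - 1])); i, j = i - 1, j - 1
        elif i > 0 and dp[i][j] == dp[i - 1][j] + 1:
            ops.append(("delete", src[i - 1])); i -= 1
        else:
            ops.append(("insert", tgt[j - 1])); j -= 1
    return ops[::-1]
```

Framing generation as edits is what makes the model reusable across tasks like post-editing: refining an existing draft is just another edit sequence.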

ERNIE-GEN: An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation

26 Jan 2020 PaddlePaddle/ERNIE

Current pre-training works in natural language generation pay little attention to the problem of exposure bias on downstream tasks.

Ranked #1 on Text Summarization on GigaWord-10k (using extra training data)

ABSTRACTIVE TEXT SUMMARIZATION DIALOGUE GENERATION GENERATIVE QUESTION ANSWERING QUESTION GENERATION

Get To The Point: Summarization with Pointer-Generator Networks

ACL 2017 abisee/pointer-generator

Neural sequence-to-sequence models have provided a viable new approach for abstractive text summarization (meaning they are not restricted to simply selecting and rearranging passages from the original text).

ABSTRACTIVE TEXT SUMMARIZATION
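The pointer-generator's key mechanism is a soft switch `p_gen` that mixes the decoder's vocabulary distribution with a copy distribution given by attention over source tokens. A minimal sketch with illustrative (made-up) probabilities:

```python
def final_distribution(vocab_dist, attention, source_tokens, p_gen):
    """Pointer-generator mixture: p_gen * P_vocab + (1 - p_gen) * P_copy.
    Copying routes attention mass onto source tokens, so out-of-vocabulary
    source words can still receive probability."""
    final = {w: p_gen * p for w, p in vocab_dist.items()}
    for attn, tok in zip(attention, source_tokens):
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * attn
    return final
```

Because both inputs are distributions, the mixture stays normalized, and rare source words (names, numbers) can be reproduced verbatim instead of being dropped or replaced by UNK.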

Neural Abstractive Text Summarization with Sequence-to-Sequence Models

5 Dec 2018 shibing624/pycorrector

As part of this survey, we also develop an open-source library, the Neural Abstractive Text Summarizer (NATS) toolkit, for abstractive text summarization.

ABSTRACTIVE TEXT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION

MASS: Masked Sequence to Sequence Pre-training for Language Generation

7 May 2019 microsoft/MASS

Pre-training and fine-tuning, e.g., BERT, have achieved great success in language understanding by transferring knowledge from rich-resource pre-training tasks to low/zero-resource downstream tasks.

CONVERSATIONAL RESPONSE GENERATION TEXT GENERATION TEXT SUMMARIZATION UNSUPERVISED MACHINE TRANSLATION
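MASS pre-trains an encoder-decoder jointly by masking a contiguous token span on the encoder side and having the decoder reconstruct exactly that span. A toy sketch of the input/target construction (span position and length are hypothetical; the paper samples them):

```python
def mass_mask(tokens, start, length):
    """MASS-style span masking: the encoder input hides a contiguous span
    with [MASK] tokens, and the decoder target is exactly the hidden span."""
    assert 0 <= start and start + length <= len(tokens)
    enc_input = tokens[:start] + ["[MASK]"] * length + tokens[start + length:]
    dec_target = tokens[start:start + length]
    return enc_input, dec_target
```

Unlike BERT (encoder only) or GPT (decoder only), this objective forces the encoder to summarize the unmasked context and the decoder to generate conditioned on it, which suits generation tasks such as summarization and translation.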

Abstractive Summarization of Spoken and Written Instructions with BERT

KDD Converse 2020 nlpyang/PreSumm

Summarization of speech is a difficult problem due to the spontaneity of the flow, disfluencies, and other issues that are not usually encountered in written texts.

ABSTRACTIVE TEXT SUMMARIZATION TRANSFER LEARNING