Unsupervised Sentence Compression

3 papers with code • 0 benchmarks • 0 datasets

Producing a shorter sentence by removing redundant information, preserving the grammaticality and the important content of the original sentence, without supervision. (Source: nlpprogress.com)

Most implemented papers

Unsupervised Sentence Compression using Denoising Auto-Encoders

zphang/usc_dae CoNLL 2018

In sentence compression, the task of shortening sentences while retaining the original meaning, models tend to be trained on large corpora containing pairs of verbose and compressed sentences.
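
The denoising-autoencoder idea can be sketched in a few lines: corrupt each sentence by inserting filler words and lightly shuffling it, then train a sequence-to-sequence model to recover the clean original, so the model learns to treat extra words as noise to be stripped away. The `add_noise` helper, toy vocabulary, and tiny GRU model below are illustrative assumptions, not code from zphang/usc_dae.

```python
# Illustrative sketch of denoising-autoencoder training for sentence
# compression; all names here are hypothetical stand-ins, not taken
# from zphang/usc_dae.
import random
import torch
import torch.nn as nn

def add_noise(tokens, vocab, insert_prob=0.3, shuffle_window=3.0):
    """Corrupt a sentence by inserting random filler words and lightly
    shuffling, so the clean sentence is a compression of the noisy one."""
    noisy = []
    for tok in tokens:
        noisy.append(tok)
        if random.random() < insert_prob:
            noisy.append(random.choice(vocab))
    # local shuffle: each token's position is perturbed by < shuffle_window
    keys = [i + random.uniform(0, shuffle_window) for i in range(len(noisy))]
    return [tok for _, tok in sorted(zip(keys, noisy), key=lambda p: p[0])]

class Seq2Seq(nn.Module):
    """Tiny GRU encoder-decoder standing in for the paper's model."""
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, hidden)
        self.enc = nn.GRU(hidden, hidden, batch_first=True)
        self.dec = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, src_ids, tgt_ids):
        _, h = self.enc(self.emb(src_ids))             # encode noisy input
        bos = torch.zeros_like(tgt_ids[:, :1])         # token 0 as makeshift BOS
        dec_in = torch.cat([bos, tgt_ids[:, :-1]], 1)  # shifted teacher forcing
        dec_out, _ = self.dec(self.emb(dec_in), h)
        return self.out(dec_out)

vocab = ["<bos>", "the", "cat", "sat", "on", "a", "mat", "very", "really", "just"]
stoi = {w: i for i, w in enumerate(vocab)}
model = Seq2Seq(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

clean = ["the", "cat", "sat", "on", "a", "mat"]
noisy = add_noise(clean, vocab[1:])  # never insert the BOS token
src = torch.tensor([[stoi[w] for w in noisy]])
tgt = torch.tensor([[stoi[w] for w in clean]])

logits = model(src, tgt)  # predict the clean sentence from the noisy one
loss = nn.functional.cross_entropy(logits.view(-1, len(vocab)), tgt.view(-1))
opt.zero_grad(); loss.backward(); opt.step()
```

At inference time, feeding an ordinary (uncorrupted) sentence to a model trained this way tends to yield a shorter output, since it has learned to map verbose inputs onto their concise cores.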

SEQ^3: Differentiable Sequence-to-Sequence-to-Sequence Autoencoder for Unsupervised Abstractive Sentence Compression

cbaziotis/seq3 NAACL 2019

The proposed model does not require parallel text-summary pairs, achieving promising results in unsupervised sentence compression on benchmark datasets.
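
The "sequence-to-sequence-to-sequence" structure can be read as a compressor that emits a short latent sentence and a reconstructor that rebuilds the input from it, with Gumbel-softmax keeping the discrete latent differentiable end to end. The sketch below is a heavily simplified assumption of that idea (the actual SEQ^3 model adds, among other things, a language-model prior on the compression); the class name and fixed latent length are illustrative.

```python
# Heavily simplified sketch of a differentiable seq2seq2seq autoencoder;
# names and architecture choices are illustrative, not from cbaziotis/seq3.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Seq3Sketch(nn.Module):
    def __init__(self, vocab_size, hidden=64, latent_len=4, tau=0.5):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, hidden)
        self.compress = nn.GRU(hidden, hidden, batch_first=True)
        self.to_vocab = nn.Linear(hidden, vocab_size)
        self.reconstruct = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)
        self.latent_len, self.tau = latent_len, tau

    def forward(self, src_ids):
        enc, _ = self.compress(self.emb(src_ids))
        # first `latent_len` states stand in for the compressed sentence
        logits_z = self.to_vocab(enc[:, : self.latent_len])
        # Gumbel-softmax: discrete-looking latent words, still differentiable
        z = F.gumbel_softmax(logits_z, tau=self.tau, hard=True)
        z_emb = z @ self.emb.weight                    # soft embedding lookup
        # condition the reconstructor on the latent sentence via its h0
        h0 = z_emb.mean(dim=1, keepdim=True).transpose(0, 1).contiguous()
        bos = torch.zeros_like(src_ids[:, :1])         # token 0 as makeshift BOS
        dec_in = torch.cat([bos, src_ids[:, :-1]], 1)  # teacher forcing
        dec, _ = self.reconstruct(self.emb(dec_in), h0)
        return self.out(dec), z.argmax(-1)             # recon logits, latent ids

model = Seq3Sketch(vocab_size=100)
src = torch.randint(1, 100, (2, 8))  # batch of token ids
logits, latent = model(src)
loss = F.cross_entropy(logits.view(-1, 100), src.view(-1))
loss.backward()  # gradients reach the compressor through the discrete latent
```

Because reconstruction is the only supervision, auxiliary losses (e.g. a language-model prior over the latent sentence) are what push the compression toward fluent, human-readable text.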

Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning

complementizer/rl-sentence-compression ACL 2022

Sentence compression reduces the length of text by removing non-essential content while preserving important facts and grammaticality.
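
One way to read "fine-tuning with reinforcement learning" here is as learning per-token keep/delete decisions from a reward, optimized with a policy gradient because the discrete deletions are non-differentiable. The toy reward below (distance from a target compression ratio) and all names are assumptions for illustration; the paper's actual reward also scores fluency and meaning preservation.

```python
# Minimal REINFORCE sketch for extractive sentence compression;
# hypothetical names, not code from complementizer/rl-sentence-compression.
import torch
import torch.nn as nn

class KeepDeletePolicy(nn.Module):
    """Toy policy network: one keep-probability per input token."""
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        self.keep = nn.Linear(2 * hidden, 1)

    def forward(self, ids):
        h, _ = self.rnn(self.emb(ids))
        return torch.sigmoid(self.keep(h)).squeeze(-1)  # (batch, len)

def reward(mask, target_ratio=0.5):
    """Stand-in reward: prefer compressions near a target length.
    The paper's reward instead combines fluency and similarity terms."""
    ratio = mask.float().mean(dim=1)
    return -(ratio - target_ratio).abs()

policy = KeepDeletePolicy(vocab_size=100)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

ids = torch.randint(1, 100, (4, 10))   # batch of token ids
probs = policy(ids)
dist = torch.distributions.Bernoulli(probs)
mask = dist.sample()                   # 1 = keep token, 0 = delete it
r = reward(mask)
# REINFORCE: raise the log-probability of masks that earned high reward
loss = -(r.detach() * dist.log_prob(mask).sum(dim=1)).mean()
opt.zero_grad(); loss.backward(); opt.step()
```

Once trained, compression is a single forward pass (predict keep probabilities, threshold, drop the deleted tokens), which is where the efficiency claim in the title comes from.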