Paraphrase Generation

68 papers with code • 3 benchmarks • 16 datasets

Paraphrase Generation involves transforming a natural language sentence into a new sentence that has the same semantic meaning but a different syntactic or lexical surface form.
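
As a quick illustration of the task, the sketch below generates paraphrase candidates with an off-the-shelf sequence-to-sequence model via the Hugging Face `transformers` library. The checkpoint name and decoding settings here are illustrative assumptions, not tied to any paper listed below.

```python
# A minimal sketch of paraphrase generation with a pretrained
# seq2seq model. The checkpoint and decoding settings are example
# choices, not a reference implementation of any listed paper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "tuner007/pegasus_paraphrase"  # example public checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

sentence = "The quick brown fox jumps over the lazy dog."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True)

# Beam search with several returned sequences yields lexically
# varied candidates that should preserve the input's meaning.
outputs = model.generate(
    **inputs,
    num_beams=10,
    num_return_sequences=3,
    max_length=60,
)
for candidate in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(candidate)
```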

Latest papers with no code

Neural Language Taskonomy: Which NLP Tasks are the most Predictive of fMRI Brain Activity?

no code yet • NAACL 2022

Several popular Transformer-based language models have been found to be successful for text-driven brain encoding.

Quick Starting Dialog Systems with Paraphrase Generation

no code yet • 6 Apr 2022

Acquiring training data to improve the robustness of dialog systems can be a painstakingly long process.

Entailment Relation Aware Paraphrase Generation

no code yet • 20 Mar 2022

We introduce a new task of entailment relation aware paraphrase generation, which aims at generating a paraphrase that conforms to a given entailment relation (e.g., equivalent, forward entailing, or reverse entailing) with respect to a given input.
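
For intuition, here is a toy illustration of the three relation types; the example sentences are invented for this page, not drawn from the paper.

```python
# Toy illustration of entailment relations between an input sentence
# and a generated paraphrase (example sentences are invented).
examples = {
    # input and paraphrase entail each other
    "equivalent": ("A man is slicing a tomato.",
                   "A tomato is being sliced by a man."),
    # the input entails the (more general) paraphrase
    "forward entailing": ("A man is slicing a tomato.",
                          "A man is cutting some food."),
    # the (more specific) paraphrase entails the input
    "reverse entailing": ("A man is cutting some food.",
                          "A man is slicing a tomato."),
}
for relation, (source, paraphrase) in examples.items():
    print(f"{relation}: {source!r} -> {paraphrase!r}")
```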

IndicNLG Benchmark: Multilingual Datasets for Diverse NLG Tasks in Indic Languages

no code yet • 10 Mar 2022

Natural Language Generation (NLG) for non-English languages is hampered by the scarcity of datasets in these languages.

Novelty Controlled Paraphrase Generation with Retrieval Augmented Conditional Prompt Tuning

no code yet • 1 Feb 2022

Paraphrase generation is a fundamental and long-standing task in natural language processing.

Continual Learning for Seq2Seq Generations with Transformer Calibration

no code yet • ACL ARR January 2022

We model attention in the Transformer as a calibrated unit in a general formulation, where attention calibration helps balance the stability and plasticity of continual learning algorithms by influencing both their forward inference path and their backward optimization path.
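
The abstract does not spell out the calibration mechanism; the minimal PyTorch sketch below shows one plausible reading, in which a learned temperature calibrates the attention distribution and is trained jointly with the network, so it affects both the forward pass and gradient flow. This is an illustrative guess, not the paper's formulation.

```python
import torch
import torch.nn as nn

class CalibratedSelfAttention(nn.Module):
    """Single-head self-attention whose logits pass through a learned
    calibration temperature before the softmax (an illustrative guess,
    not the paper's method)."""

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Learned calibration scalar: it shapes the forward inference
        # path (how peaked attention is) and the backward optimization
        # path (how gradients flow through the softmax).
        self.log_temperature = nn.Parameter(torch.zeros(()))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q, k, v = self.q(x), self.k(x), self.v(x)
        logits = q @ k.transpose(-2, -1) / x.size(-1) ** 0.5
        weights = torch.softmax(logits / self.log_temperature.exp(), dim=-1)
        return weights @ v

x = torch.randn(2, 8, 64)                     # (batch, sequence, features)
print(CalibratedSelfAttention(64)(x).shape)   # torch.Size([2, 8, 64])
```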

TD-ConE: An Information-Theoretic Approach to Assessing Parallel Text Generation Data

no code yet • ACL ARR January 2022

Existing data assessment methods are designed mainly for classification datasets and are of limited use for natural language generation (NLG) datasets.

Sampling from Discrete Energy-Based Models with Quality/Efficiency Trade-offs

no code yet • 10 Dec 2021

We show that we can sample from such EBMs with arbitrary precision at the cost of sampling efficiency.
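
As background, classic rejection sampling already exhibits this kind of trade-off: the sketch below samples from a toy discrete EBM P(x) ∝ exp(−E(x)) through a tractable proposal q(x), where lowering the bound constant M speeds up sampling at the cost of exactness. The toy energies and proposal are assumptions for illustration; the paper's own sampler may differ.

```python
# Generic rejection sampling for a discrete EBM P(x) ∝ exp(-E(x))
# via a tractable proposal q(x). A large bound M keeps sampling
# exact but slow; a too-small M is fast but biases the samples.
# Energies and proposal below are toy values for illustration.
import math
import random

states = ["a", "b", "c"]
energy = {"a": 0.5, "b": 1.0, "c": 2.0}    # toy EBM: P(x) ∝ exp(-E(x))
proposal = {"a": 0.5, "b": 0.3, "c": 0.2}  # toy tractable proposal q(x)

def rejection_sample(M, n):
    """Accept x ~ q with probability exp(-E(x)) / (M * q(x))."""
    samples = []
    while len(samples) < n:
        x = random.choices(states, weights=[proposal[s] for s in states])[0]
        accept_prob = math.exp(-energy[x]) / (M * proposal[x])
        if random.random() < min(1.0, accept_prob):
            samples.append(x)
    return samples

# M = 2.0 satisfies exp(-E(x)) <= M * q(x) for all x, so this is exact.
print(rejection_sample(M=2.0, n=5))
```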

SHCT: A Successively Hierarchical Conditional Transformer for Controllable Paraphrase Generation

no code yet • ACL ARR November 2021

To address the problem of incorporating flexible attributes, we apply a hierarchical structure to SHCT, which enables the framework to successively couple the CVAE latent variables with the encoder layers' hidden states.

Improving Paraphrase Generation models with machine translation generated pre-training

no code yet • ACL ARR November 2021

Paraphrase generation is a fundamental and long-standing problem in natural language processing.