Paraphrase Generation

69 papers with code • 3 benchmarks • 16 datasets

Paraphrase Generation involves transforming a natural language sentence into a new sentence that has the same semantic meaning but a different syntactic or lexical surface form.
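
A minimal sketch of the task with a seq2seq model. The checkpoint name `your-org/t5-paraphrase` and the `paraphrase:` prefix are placeholders for a T5-style model fine-tuned on paraphrase pairs, not a specific published model:

```python
# Paraphrase a sentence with a (hypothetical) fine-tuned T5 checkpoint.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("your-org/t5-paraphrase")  # placeholder name
model = AutoModelForSeq2SeqLM.from_pretrained("your-org/t5-paraphrase")

sentence = "How can I improve my writing skills?"
inputs = tokenizer("paraphrase: " + sentence, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=5, num_return_sequences=3, max_length=64)
for ids in outputs:
    # same meaning, different surface form, e.g. "What can I do to write better?"
    print(tokenizer.decode(ids, skip_special_tokens=True))
```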

Latest papers with no code

Are We Evaluating Paraphrase Generation Accurately?

no code yet • ACL ARR November 2021

The evaluation of paraphrase generation (PG) is a complex task and currently lacks a complete picture of the criteria and metrics.

Power Norm Based Lifelong Learning for Paraphrase Generations

no code yet • ACL ARR November 2021

Seq2seq language generation models are trained on multiple domains in a continual-learning manner, with the data from each domain observed in an online fashion.
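
To make the setup concrete, here is a sketch of the training regime described above (the lifelong-learning setting, not the paper's Power Norm method). The `model`, the HF-style `.loss` output, and the domain loaders are assumptions:

```python
# Continual learning over a stream of domains: each domain's data is
# seen once, online, with no revisiting of earlier domains.
import torch

def train_continually(model, domain_loaders, lr=1e-4):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for domain_name, loader in domain_loaders:   # domains arrive sequentially
        for batch in loader:                     # each batch observed once (online)
            loss = model(**batch).loss           # assumes an HF-style seq2seq model
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
        # catastrophic forgetting of earlier domains is the core risk here
```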

GCPG: A General Framework for Controllable Paraphrase Generation

no code yet • ACL ARR October 2021

Under GCPG, we reconstruct the commonly adopted lexical condition (i.e., keywords) and syntactic conditions (i.e., Part-Of-Speech sequence, Constituent Tree, Masked Template and Sentential Exemplar) and study combinations of the two types.
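
A hypothetical illustration of combining a lexical condition (keywords) with a syntactic condition (a POS-tag sequence) into one encoder input. The separator tokens and format below are assumptions for illustration, not GCPG's actual encoding:

```python
# Build a conditioned encoder input from a source sentence plus
# lexical (keywords) and syntactic (POS sequence) conditions.
def build_conditioned_input(source, keywords, pos_sequence):
    kw = " ".join(keywords)
    pos = " ".join(pos_sequence)
    return f"{source} <kw> {kw} <syn> {pos}"

print(build_conditioned_input(
    "the cat sat on the mat",
    keywords=["cat", "mat"],
    pos_sequence=["DET", "NOUN", "VERB", "ADP", "DET", "NOUN"],
))
```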

Simulated annealing for optimization of graphs and sequences

no code yet • 1 Oct 2021

The key idea is to integrate powerful neural networks into metaheuristics (e.g., simulated annealing, SA) to restrict the search space in discrete optimization.
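
For reference, a generic simulated-annealing loop over discrete objects, sketching the metaheuristic the paper builds on. The `propose` (local edit) and `score` (objective, e.g. a neural scorer) functions are assumed user-supplied:

```python
import math, random

def simulated_annealing(x, propose, score, t0=1.0, cooling=0.95, steps=200):
    cur, cur_s = x, score(x)
    best, best_s = cur, cur_s
    t = t0
    for _ in range(steps):
        cand = propose(cur)                 # local edit: insert/delete/replace
        cand_s = score(cand)
        delta = cand_s - cur_s
        # always accept improvements; accept worse moves with prob exp(delta / t)
        if delta > 0 or random.random() < math.exp(delta / t):
            cur, cur_s = cand, cand_s
            if cur_s > best_s:
                best, best_s = cur, cur_s
        t *= cooling                        # anneal the temperature
    return best
```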

Discovering Latent Network Topology in Contextualized Representations with Randomized Dynamic Programming

no code yet • 29 Sep 2021

We use RDP to analyze the representation space of pretrained language models, discovering a large-scale latent network in a fully unsupervised way.

Learning to Selectively Learn for Weakly-supervised Paraphrase Generation

no code yet • EMNLP 2021

In this work, we go beyond the existing paradigms and propose a novel approach to generate high-quality paraphrases with weak supervision data.

Paraphrase Generation as Unsupervised Machine Translation

no code yet • COLING 2022

Then, based on the paraphrase pairs produced by these UMT models, a unified surrogate model can be trained to serve as the final model for generating paraphrases; it can be used directly at test time in the unsupervised setup, or fine-tuned on labeled datasets in the supervised setup.
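
A schematic of that two-stage pipeline, with the helpers `translate_round_trip` and `train_seq2seq` assumed for illustration:

```python
# Stage 1: UMT models produce pseudo paraphrase pairs from a monolingual corpus.
# Stage 2: a single surrogate seq2seq model is trained on those pairs.
def build_surrogate(umt_models, corpus, train_seq2seq):
    pseudo_pairs = []
    for umt in umt_models:
        for sentence in corpus:
            # round-trip translation (e.g. en -> pivot -> en) yields a paraphrase
            pseudo_pairs.append((sentence, umt.translate_round_trip(sentence)))
    # the surrogate can be used as-is (unsupervised) or fine-tuned on
    # labeled paraphrase pairs (supervised)
    return train_seq2seq(pseudo_pairs)
```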

ConRPG: Paraphrase Generation using Contexts as Regularizer

no code yet • EMNLP 2021

A long-standing issue with paraphrase generation is how to obtain reliable supervision signals.

SPMoE: Generate Multiple Pattern-Aware Outputs with Sparse Pattern Mixture of Experts

no code yet • 17 Aug 2021

Each one-to-one mapping is associated with a conditional generation pattern and is modeled with an expert in SPMoE.
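
A toy sketch of the sparse mixture-of-experts idea: a gate routes each input to a single expert, so each expert can specialize in one generation pattern. This is illustrative only, not the SPMoE architecture (top-1 argmax routing is also non-differentiable as written):

```python
import torch
import torch.nn as nn

class SparseMoE(nn.Module):
    def __init__(self, dim, n_experts):
        super().__init__()
        self.gate = nn.Linear(dim, n_experts)
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_experts))

    def forward(self, h):                         # h: (batch, dim)
        idx = self.gate(h).argmax(dim=-1)         # hard, sparse routing: one expert each
        return torch.stack([self.experts[int(i)](x) for i, x in zip(idx, h)])

moe = SparseMoE(dim=16, n_experts=4)
print(moe(torch.randn(2, 16)).shape)              # torch.Size([2, 16])
```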

Edit Distance Based Curriculum Learning for Paraphrase Generation

no code yet • ACL 2021

Curriculum learning has improved the quality of neural machine translation, but existing difficulty metrics consider only source-side features when determining the difficulty of translation.
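
A sketch of an edit-distance-based curriculum: order paraphrase pairs by the Levenshtein distance between source and target tokens and train on low-distance ("easy") pairs first. The paper's exact difficulty metric may differ:

```python
def levenshtein(a, b):
    # classic single-row dynamic-programming edit distance over token lists
    dp = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, y in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (x != y))  # substitution
    return dp[-1]

pairs = [("the cat sat", "a cat was sitting"), ("hello there", "hi there")]
curriculum = sorted(pairs, key=lambda p: levenshtein(p[0].split(), p[1].split()))
```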