Paraphrase Generation
69 papers with code • 3 benchmarks • 16 datasets
Paraphrase Generation involves transforming a natural language sentence into a new sentence that has the same semantic meaning but a different syntactic or lexical surface form.
Latest papers with no code
Are We Evaluating Paraphrase Generation Accurately?
The evaluation of paraphrase generation (PG) is a complex task and currently lacks a complete picture of the criteria and metrics.
Power Norm Based Lifelong Learning for Paraphrase Generations
Seq2seq language generation models are trained on multiple domains in a continual-learning manner, where the data from each domain is observed in an online fashion.
GCPG: A General Framework for Controllable Paraphrase Generation
Under GCPG, we reconstruct the commonly adopted lexical condition (i.e., Keywords) and syntactic conditions (i.e., Part-Of-Speech sequence, Constituent Tree, Masked Template, and Sentential Exemplar) and study combinations of the two types.
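One common way to feed a lexical condition such as keywords to a sequence-to-sequence paraphraser is to prepend them to the encoder input with special separator tokens. The sketch below illustrates that general idea; the token names (`<kw>`, `<sep>`) are assumptions for illustration, not GCPG's actual special tokens.

```python
def build_conditioned_input(source, keywords, sep="<sep>", kw_tag="<kw>"):
    """Concatenate a lexical condition (keywords) with the source sentence.

    The resulting string would be tokenized and fed to the encoder; the
    decoder is then trained to produce a paraphrase containing the keywords.
    Token names are illustrative, not GCPG's exact scheme.
    """
    condition = f" {kw_tag} ".join(keywords)
    return f"{kw_tag} {condition} {sep} {source}"


# Example: condition a paraphrase of a review sentence on two keywords.
conditioned = build_conditioned_input("the film was great", ["movie", "excellent"])
```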
Simulated annealing for optimization of graphs and sequences
The key idea is to integrate powerful neural networks into metaheuristics (e.g., simulated annealing, SA) to restrict the search space in discrete optimization.
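Simulated annealing over sentences can be sketched as local word-level edits accepted under a cooling temperature schedule. The following is a minimal, self-contained sketch of that search loop, assuming a black-box `score` function (in the paper this would involve neural scorers); the proposal operations and cooling rate are illustrative choices, not the paper's exact configuration.

```python
import math
import random

def edit_proposal(sentence, vocab, rng):
    """Propose a neighboring sentence via a random word-level edit
    (replace, insert, or delete) -- illustrative local move."""
    words = sentence[:]
    op = rng.choice(["replace", "insert", "delete"])
    i = rng.randrange(len(words)) if words else 0
    if op == "replace" and words:
        words[i] = rng.choice(vocab)
    elif op == "insert":
        words.insert(i, rng.choice(vocab))
    elif op == "delete" and len(words) > 1:
        del words[i]
    return words

def simulated_annealing(start, score, vocab, steps=200, t0=1.0, seed=0):
    """Maximize `score` over sentences with a geometric cooling schedule."""
    rng = random.Random(seed)
    cur, cur_s = start, score(start)
    best, best_s = cur, cur_s
    for k in range(steps):
        t = t0 * (0.95 ** k)                 # cooling schedule
        cand = edit_proposal(cur, vocab, rng)
        cand_s = score(cand)
        # Always accept uphill moves; accept downhill moves with
        # Boltzmann probability exp((cand_s - cur_s) / t).
        if cand_s >= cur_s or rng.random() < math.exp((cand_s - cur_s) / max(t, 1e-9)):
            cur, cur_s = cand, cand_s
            if cur_s > best_s:
                best, best_s = cur, cur_s
    return best, best_s
```

A neural network enters this loop by shaping `score` (e.g., semantic similarity plus fluency) and by narrowing `vocab` to plausible substitutions, which is what restricts the discrete search space.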
Discovering Latent Network Topology in Contextualized Representations with Randomized Dynamic Programming
We use RDP to analyze the representation space of pretrained language models, discovering a large-scale latent network in a fully unsupervised way.
Learning to Selectively Learn for Weakly-supervised Paraphrase Generation
In this work, we go beyond the existing paradigms and propose a novel approach to generate high-quality paraphrases with weak supervision data.
Paraphrase Generation as Unsupervised Machine Translation
Then, based on the paraphrase pairs produced by these UMT models, a unified surrogate model can be trained to serve as the final paraphrase-generation model, which can be used directly at test time in the unsupervised setup or fine-tuned on labeled datasets in the supervised setup.
ConRPG: Paraphrase Generation using Contexts as Regularizer
A long-standing issue with paraphrase generation is how to obtain reliable supervision signals.
SPMoE: Generate Multiple Pattern-Aware Outputs with Sparse Pattern Mixture of Experts
Each one-to-one mapping is associated with a conditional generation pattern and is modeled with an expert in SPMoE.
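The core mechanism here is sparse routing: a gate scores each expert and only the top-scoring expert generates, so each expert specializes in one conditional generation pattern. Below is a minimal top-1 routing sketch; the linear gate, shapes, and hard argmax selection are assumptions for illustration, not SPMoE's exact formulation.

```python
import numpy as np

def sparse_moe_route(x, gate_w, experts):
    """Top-1 sparse routing: score experts with a linear gate, then let only
    the highest-scoring expert process the input. In a pattern-aware setup,
    each expert would capture one input-to-output generation pattern.
    """
    logits = gate_w @ x            # one gating score per expert
    k = int(np.argmax(logits))     # hard top-1 selection (sparse dispatch)
    return k, experts[k](x)


# Toy example: two "experts" implementing different transformations.
gate_w = np.array([[1.0, 0.0],
                   [0.0, 1.0]])
experts = [lambda v: v * 0.0, lambda v: v + 1.0]
k, out = sparse_moe_route(np.array([0.0, 1.0]), gate_w, experts)
```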
Edit Distance Based Curriculum Learning for Paraphrase Generation
Curriculum learning has improved the quality of neural machine translation, but existing difficulty metrics consider only source-side features when determining the difficulty of a translation example.
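A curriculum based on edit distance can be sketched by scoring each (source, target) paraphrase pair with their word-level Levenshtein distance and training on low-distance (easy) pairs first. This is a minimal sketch of that idea, not the paper's exact scoring or scheduling.

```python
def levenshtein(a, b):
    """Word-level edit distance between token lists a and b,
    computed row by row with dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, wa in enumerate(a, 1):
        cur = [i]
        for j, wb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (wa != wb)))    # substitution
        prev = cur
    return prev[-1]

def curriculum_order(pairs):
    """Sort (source, target) paraphrase pairs from easy to hard, using
    source-target edit distance as the difficulty score."""
    return sorted(pairs, key=lambda p: levenshtein(p[0].split(), p[1].split()))
```

Note that, unlike source-only difficulty metrics, this score depends on both sides of the pair: a target that diverges heavily from its source counts as harder.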