Paraphrase Generation
68 papers with code • 3 benchmarks • 16 datasets
Paraphrase Generation involves transforming a natural language sentence into a new sentence that has the same semantic meaning but a different syntactic or lexical surface form.
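A minimal sketch of the task in code, assuming a Hugging Face seq2seq checkpoint fine-tuned for paraphrasing (the model name below is one public example; any T5-style paraphrase model follows the same pattern):

```python
# Minimal sketch: paraphrase generation with a seq2seq model.
# "Vamsi/T5_Paraphrase_Paws" is one publicly available checkpoint;
# swap in any T5-style paraphrase model of your choice.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "Vamsi/T5_Paraphrase_Paws"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

sentence = "How can I become a better programmer?"
inputs = tokenizer("paraphrase: " + sentence, return_tensors="pt")

# Sample several candidates so surface forms differ from the input.
outputs = model.generate(
    **inputs,
    max_length=64,
    num_return_sequences=3,
    do_sample=True,
    top_p=0.95,
)
for o in outputs:
    print(tokenizer.decode(o, skip_special_tokens=True))
```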
Latest papers with no code
Neural Language Taskonomy: Which NLP Tasks are the most Predictive of fMRI Brain Activity?
Several popular Transformer-based language models have been found to be successful for text-driven brain encoding.
Quick Starting Dialog Systems with Paraphrase Generation
Acquiring training data to improve the robustness of dialog systems can be a painstakingly long process.
Entailment Relation Aware Paraphrase Generation
We introduce a new task of entailment relation aware paraphrase generation, which aims at generating a paraphrase that conforms to a given entailment relation (e.g., equivalent, forward entailing, or reverse entailing) with respect to a given input.
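One generic way to implement this kind of control, sketched below, is to prepend a relation token to the encoder input during fine-tuning. This is a common control-code recipe, not necessarily the method proposed in the paper, and the tag names are made up for illustration:

```python
# Sketch: conditioning paraphrase generation on a target entailment
# relation by prepending a control token. Generic control-code recipe;
# the tags below are hypothetical.
RELATIONS = {"equivalent": "<equiv>", "forward": "<fwd>", "reverse": "<rev>"}

def build_input(sentence: str, relation: str) -> str:
    """Prefix the source sentence with the desired relation tag."""
    return f"{RELATIONS[relation]} {sentence}"

# During fine-tuning, each training pair (source, paraphrase) is
# labeled with its relation, and the tag becomes part of the input:
print(build_input("All dogs bark loudly.", "forward"))
# -> "<fwd> All dogs bark loudly."
```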
IndicNLG Benchmark: Multilingual Datasets for Diverse NLG Tasks in Indic Languages
Natural Language Generation (NLG) for non-English languages is hampered by the scarcity of datasets in these languages.
Novelty Controlled Paraphrase Generation with Retrieval Augmented Conditional Prompt Tuning
Paraphrase generation is a fundamental and long-standing task in natural language processing.
Continual Learning for Seq2Seq Generations with Transformer Calibration
We model attention in the transformer as a calibrated unit in a general formulation, where attention calibration can balance the stability and plasticity of continual learning algorithms by influencing both the forward inference path and the backward optimization path.
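One plausible reading of this idea, sketched below as an assumption rather than the paper's actual formulation, is a learned per-head temperature that recalibrates the attention logits:

```python
import torch
import torch.nn.functional as F

# Illustrative sketch only (not the paper's formulation): treat attention
# as a unit whose softmax can be recalibrated by a learned per-head
# temperature, one knob for trading stability against plasticity.
class CalibratedAttention(torch.nn.Module):
    def __init__(self, dim: int, num_heads: int):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = torch.nn.Linear(dim, 3 * dim)
        self.out = torch.nn.Linear(dim, dim)
        # Learned calibration temperature per head (init = 1.0,
        # i.e. standard attention).
        self.temperature = torch.nn.Parameter(torch.ones(num_heads))

    def forward(self, x):  # x: (batch, seq, dim)
        b, s, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Reshape to (batch, heads, seq, head_dim).
        q, k, v = (t.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)
                   for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        # Calibration: per-head temperature rescales the logits.
        scores = scores * self.temperature.view(1, -1, 1, 1)
        attn = F.softmax(scores, dim=-1)
        return self.out((attn @ v).transpose(1, 2).reshape(b, s, d))
```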
TD-ConE: An Information-Theoretic Approach to Assessing Parallel Text Generation Data
Existing data assessment methods are mainly designed for classification datasets and are of limited use for natural language generation (NLG) datasets.
Sampling from Discrete Energy-Based Models with Quality/Efficiency Trade-offs
We show that we can sample from such discrete energy-based models (EBMs) with arbitrary precision at the cost of sampling efficiency.
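The general recipe behind such samplers is rejection-style correction of a tractable proposal distribution. The sketch below, with placeholder `energy`, `propose`, and `proposal_logprob` functions that you must supply, shows how an acceptance threshold `beta` trades efficiency for quality:

```python
import math
import random

# Sketch of (quasi-)rejection sampling from an unnormalized discrete
# EBM P(x) ∝ exp(-energy(x)) using a proposal q: accept a draw x from q
# with probability min(1, P(x) / (beta * q(x))). Larger beta means fewer
# accepted samples (lower efficiency) but a closer match to P (higher
# quality), which is the trade-off in the title.

def qrs_sample(energy, propose, proposal_logprob, beta, n_samples):
    samples = []
    while len(samples) < n_samples:
        x = propose()                    # draw from the proposal q
        log_p = -energy(x)               # unnormalized log P(x)
        log_q = proposal_logprob(x)      # log q(x)
        accept = math.exp(min(0.0, log_p - log_q - math.log(beta)))
        if random.random() < accept:
            samples.append(x)
    return samples
```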
SHCT: A Successively Hierarchical Conditional Transformer for Controllable Paraphrase Generation
To address the problem of absorbing flexible attributes, we apply a hierarchical structure to our SHCT, which enables the framework to couple the CVAE latent variables with the encoder layers' hidden states successively.
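As a rough illustration only (the paper's exact architecture is not reproduced here), coupling a CVAE latent variable with one encoder layer's hidden states might look like the following per-layer module:

```python
import torch

# Assumption, not the paper's actual design: each encoder layer gets a
# latent z inferred from its hidden states and merged back in, so
# successive layers can absorb different controllable attributes.
class LayerLatent(torch.nn.Module):
    def __init__(self, dim: int, z_dim: int):
        super().__init__()
        self.to_mu = torch.nn.Linear(dim, z_dim)
        self.to_logvar = torch.nn.Linear(dim, z_dim)
        self.merge = torch.nn.Linear(dim + z_dim, dim)

    def forward(self, h):  # h: (batch, seq, dim) hidden states of one layer
        pooled = h.mean(dim=1)                     # summarize the layer
        mu, logvar = self.to_mu(pooled), self.to_logvar(pooled)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        z = z.unsqueeze(1).expand(-1, h.size(1), -1)
        return self.merge(torch.cat([h, z], dim=-1))  # couple z with h
```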
Improving Paraphrase Generation models with machine translation generated pre-training
Paraphrase generation is a fundamental and long-standing problem in Natural Language Processing.
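A common way to generate machine-translation-derived paraphrase pairs for pre-training is round-trip translation. The sketch below pivots English through German using public Marian checkpoints, though the paper's exact pipeline may differ:

```python
# Sketch: generating paraphrase pre-training pairs by round-trip
# machine translation (English -> German -> English). The Helsinki-NLP
# Marian checkpoints are real public models, but this shows the general
# recipe only, not necessarily the paper's pipeline.
from transformers import MarianMTModel, MarianTokenizer

def translate(text: str, model_name: str) -> str:
    tok = MarianTokenizer.from_pretrained(model_name)
    model = MarianMTModel.from_pretrained(model_name)
    batch = tok([text], return_tensors="pt")
    out = model.generate(**batch)
    return tok.decode(out[0], skip_special_tokens=True)

source = "The weather is lovely today."
pivot = translate(source, "Helsinki-NLP/opus-mt-en-de")
paraphrase = translate(pivot, "Helsinki-NLP/opus-mt-de-en")
print(source, "->", paraphrase)  # (source, paraphrase) becomes a training pair
```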