Paraphrase Generation

68 papers with code • 3 benchmarks • 16 datasets

Paraphrase Generation involves transforming a natural language sentence into a new sentence that has the same semantic meaning but a different syntactic or lexical surface form.
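To make the task definition concrete, here is a deliberately minimal sketch: real paraphrase generators are neural sequence-to-sequence models, but the idea of "same meaning, different surface form" can be illustrated with a hand-made synonym table (the table and function name are hypothetical, for illustration only).

```python
# Toy illustration of paraphrase generation: preserve meaning while
# changing the lexical surface form. Real systems use neural models;
# this sketch only swaps words via a small hypothetical synonym table.
SYNONYMS = {
    "movie": "film",
    "buy": "purchase",
    "quickly": "rapidly",
}

def toy_paraphrase(sentence: str) -> str:
    """Replace known words with synonyms to vary the surface form."""
    return " ".join(SYNONYMS.get(tok, tok) for tok in sentence.split())

print(toy_paraphrase("I will buy the movie quickly"))
```

A lexical-substitution toy like this captures only the "lexical surface form" half of the definition; syntactic rewrites (e.g., passivization) are what the controlled-generation papers below address.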

Most implemented papers

How Large Language Models are Transforming Machine-Paraphrased Plagiarism

jpwahle/emnlp22-transforming 7 Oct 2022

The recent success of large language models for text generation poses a severe threat to academic integrity, as plagiarists can generate realistic paraphrases indistinguishable from original work.

Learning Semantic Sentence Embeddings using Sequential Pair-wise Discriminator

dev-chauhan/PQG-pytorch COLING 2018

One way to ensure this is by adding constraints for true paraphrase embeddings to be close and unrelated paraphrase candidate sentence embeddings to be far.
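The constraint described above is a margin-based ranking objective: embeddings of true paraphrase pairs should be closer than embeddings of unrelated candidates. A minimal sketch of such a loss, assuming cosine similarity over plain list vectors (function names are mine, not the paper's):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def pairwise_margin_loss(anchor, positive, negative, margin=0.5):
    """Hinge loss: the true paraphrase (positive) should be closer to
    the anchor than an unrelated candidate (negative) by >= margin."""
    return max(0.0, margin - cosine(anchor, positive) + cosine(anchor, negative))
```

With a well-separated triple the loss is zero; when the unrelated candidate is closer than the true paraphrase, the loss grows and pushes the embeddings apart during training.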

Paraphrase Generation with Latent Bag of Words

FranxYao/Deep-Generative-Models-for-Natural-Language-Processing NeurIPS 2019

Inspired by variational autoencoders with discrete latent structures, in this work, we propose a latent bag of words (BOW) model for paraphrase generation.

Neural Syntactic Preordering for Controlled Paraphrase Generation

tagoyal/sow-reap-paraphrasing ACL 2020

Paraphrasing natural language sentences is a multifaceted process: it might involve replacing individual words or short phrases, local rearrangement of content, or high-level restructuring like topicalization or passivization.

Syntax-guided Controlled Generation of Paraphrases

malllabiisc/SGCP TACL 2020

In these methods, syntactic guidance is sourced from a separate exemplar sentence.

ChatGPT to Replace Crowdsourcing of Paraphrases for Intent Classification: Higher Diversity and Comparable Model Robustness

kinit-sk/crowd-vs-gpt-intent-class 22 May 2023

The emergence of generative large language models (LLMs) raises the question: what will be their impact on crowdsourcing?

SLPL SHROOM at SemEval-2024 Task 06: A comprehensive study on models' ability to detect hallucination

sharif-slpl/se-2024-task-06-shroom 7 Apr 2024

Language models, particularly generative models, are susceptible to hallucinations, generating outputs that contradict factual knowledge or the source text.

Neural Paraphrase Generation with Stacked Residual LSTM Networks

pushpendughosh/Stock-market-forecasting COLING 2016

To the best of our knowledge, this work is the first to explore deep learning models for paraphrase generation.

A Deep Generative Framework for Paraphrase Generation

arvind385801/paraphrasegen 15 Sep 2017

In this paper, we address the problem of generating paraphrases automatically.