Text Style Transfer

80 papers with code • 2 benchmarks • 6 datasets

Text Style Transfer is the task of controlling certain attributes of generated text, such as sentiment or formality, while preserving the underlying content. State-of-the-art methods fall into two main groups depending on whether parallel data is available. Methods for parallel data are typically supervised and use a neural sequence-to-sequence model with an encoder-decoder architecture. Methods for non-parallel data are usually unsupervised and rely on Disentanglement, Prototype Editing, or Pseudo-Parallel Corpus Construction.
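
For the supervised, parallel-data setting, a minimal sketch is fine-tuning a pretrained encoder-decoder on (source-style, target-style) sentence pairs. The snippet below is illustrative only and is not taken from any of the listed papers; the T5 checkpoint, the "transfer style:" task prefix, and the toy informal-to-formal pairs are all assumptions for the example.

```python
# Illustrative sketch: supervised style transfer on parallel data with a
# pretrained encoder-decoder (T5). The sentence pairs are invented toy data.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Hypothetical parallel corpus: (informal source, formal target) pairs.
pairs = [
    ("gonna be late, sorry", "I apologize; I will be arriving late."),
    ("this movie was awful", "The film did not meet my expectations."),
]
sources = ["transfer style: " + s for s, _ in pairs]  # T5-style task prefix
targets = [t for _, t in pairs]

inputs = tokenizer(sources, padding=True, return_tensors="pt")
labels = tokenizer(targets, padding=True, return_tensors="pt").input_ids
labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
for _ in range(3):  # a few toy updates; real training iterates over a dataset
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Inference: rewrite a new sentence in the target style.
model.eval()
query = tokenizer("transfer style: the food was kinda bad", return_tensors="pt")
out = model.generate(**query, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```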

The most popular benchmark for this task is the Yelp Review Dataset. Models are typically evaluated on sentiment accuracy (whether the target style is achieved), BLEU against reference rewrites (content preservation), and perplexity (PPL, fluency).
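
The sketch below shows one common way these three metrics are computed; it is an assumed recipe, not the exact evaluation protocol of any listed paper. The choice of sacrebleu, an off-the-shelf sentiment classifier, and GPT-2 as the perplexity model are all assumptions for illustration.

```python
# Illustrative evaluation sketch: BLEU (content), sentiment accuracy (style),
# and perplexity under a pretrained language model (fluency).
import math
import sacrebleu
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

outputs = ["the food was wonderful", "service was excellent"]     # model outputs
references = ["the food was great", "the service was excellent"]  # human rewrites
target_label = "POSITIVE"  # intended style of the transfer

# Content preservation: corpus BLEU against the references.
bleu = sacrebleu.corpus_bleu(outputs, [references]).score

# Style accuracy: fraction of outputs the classifier assigns to the target style.
clf = pipeline("sentiment-analysis")
preds = clf(outputs)
acc = sum(p["label"] == target_label for p in preds) / len(preds)

# Fluency: perplexity of each output under a pretrained language model.
lm_tok = AutoTokenizer.from_pretrained("gpt2")
lm = AutoModelForCausalLM.from_pretrained("gpt2").eval()
ppls = []
with torch.no_grad():
    for text in outputs:
        ids = lm_tok(text, return_tensors="pt").input_ids
        loss = lm(ids, labels=ids).loss
        ppls.append(math.exp(loss.item()))

print(f"BLEU={bleu:.1f}  style accuracy={acc:.2f}  mean PPL={sum(ppls)/len(ppls):.1f}")
```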

Most implemented papers

Style Transfer from Non-Parallel Text by Cross-Alignment

shentianxiao/language-style-transfer NeurIPS 2017

We demonstrate the effectiveness of this cross-alignment method on three tasks: sentiment modification, decipherment of word substitution ciphers, and recovery of word order.

A Probabilistic Formulation of Unsupervised Text Style Transfer

cindyxinyiwang/deep-latent-sequence-model ICLR 2020

Across all style transfer tasks, our approach yields substantial gains over state-of-the-art non-generative baselines, including the state-of-the-art unsupervised machine translation techniques that our approach generalizes.

Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation

fastnlp/style-transformer ACL 2019

Disentangling the content and style in the latent space is prevalent in unpaired text style transfer.

Style Transfer Through Back-Translation

shrimai/Style-Transfer-Through-Back-Translation ACL 2018

We first learn a latent representation of the input sentence which is grounded in a language translation model in order to better preserve the meaning of the sentence while reducing stylistic properties.

Disentangled Representation Learning for Non-Parallel Text Style Transfer

vineetjohn/linguistic-style-transfer ACL 2019

This paper tackles the problem of disentangling the latent variables of style and content in language models.

Multiple-Attribute Text Style Transfer

martiansideofthemoon/style-transfer-paraphrase 1 Nov 2018

The dominant approach to unsupervised "style transfer" in text is based on the idea of learning a latent representation, which is independent of the attributes specifying its "style".

IMaT: Unsupervised Text Attribute Transfer via Iterative Matching and Translation

zhijing-jin/IMaT IJCNLP 2019

Text attribute transfer aims to automatically rewrite sentences such that they possess certain linguistic attributes, while simultaneously preserving their semantic content.

Educating Text Autoencoders: Latent Representation Guidance via Denoising

shentianxiao/text-autoencoders ICML 2020

We prove that this simple modification guides the latent space geometry of the resulting model by encouraging the encoder to map similar texts to similar latent representations.

IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation

gsarti/it5 7 Mar 2022

The T5 model and its unified text-to-text paradigm contributed in advancing the state-of-the-art for many natural language processing tasks.