Text Generation

246 papers with code · Natural Language Processing

Text generation is the task of generating text with the goal of appearing indistinguishable from human-written text.

(Image credit: Adversarial Ranking for Language Generation)
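To make the task concrete, the sketch below generates text autoregressively: a model assigns a probability to the next token given the tokens produced so far, and output is built by sampling one step at a time. The toy bigram model and corpus are illustrative assumptions, not taken from any paper on this page.

```python
# Minimal sketch: autoregressive text generation with a toy bigram model.
# The corpus and probabilities are illustrative assumptions only.
import numpy as np

corpus = "the cat sat on the mat . the dog sat on the rug .".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# Estimate P(next word | current word) from the toy corpus, with add-one smoothing.
counts = np.ones((len(vocab), len(vocab)))
for prev, nxt in zip(corpus, corpus[1:]):
    counts[idx[prev], idx[nxt]] += 1
probs = counts / counts.sum(axis=1, keepdims=True)

# Generate by repeatedly sampling the next word from the model's distribution.
rng = np.random.default_rng(0)
word = "the"
out = [word]
for _ in range(10):
    word = vocab[rng.choice(len(vocab), p=probs[idx[word]])]
    out.append(word)
print(" ".join(out))
```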

Leaderboards

Greatest papers with code

MaskGAN: Better Text Generation via Filling in the ______

23 Jan 2018 · tensorflow/models

Additionally, these models are typically trained via maximum likelihood and teacher forcing.
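For context on that training setup: under maximum likelihood with teacher forcing, the network is fed the ground-truth prefix at every step and trained to predict the next token. A minimal PyTorch sketch of one such training step, assuming a tiny placeholder LSTM and a random batch rather than the MaskGAN architecture itself:

```python
# Sketch of one maximum-likelihood / teacher-forcing training step.
# The tiny LSTM language model and the random batch are placeholder assumptions.
import torch
import torch.nn as nn

vocab_size, emb_dim, hidden_dim = 100, 32, 64
embed = nn.Embedding(vocab_size, emb_dim)
lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
head = nn.Linear(hidden_dim, vocab_size)
params = list(embed.parameters()) + list(lstm.parameters()) + list(head.parameters())
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, vocab_size, (8, 21))   # placeholder batch of token ids
inputs, targets = tokens[:, :-1], tokens[:, 1:]  # teacher forcing: ground-truth prefix in, next token out

hidden_states, _ = lstm(embed(inputs))
logits = head(hidden_states)                     # (batch, seq_len, vocab)
loss = loss_fn(logits.reshape(-1, vocab_size), targets.reshape(-1))
opt.zero_grad()
loss.backward()
opt.step()
```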

MULTIVARIATE TIME SERIES IMPUTATION · TEXT GENERATION

Plug and Play Language Models: A Simple Approach to Controlled Text Generation

ICLR 2020 · huggingface/transformers

Large transformer-based language models (LMs) trained on huge text corpora have shown unparalleled generation capabilities.

LANGUAGE MODELLING · TEXT GENERATION

BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension

ACL 2020 · huggingface/transformers

We evaluate a number of noising approaches, finding the best performance by both randomly shuffling the order of the original sentences and using a novel in-filling scheme, where spans of text are replaced with a single mask token.
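A rough sketch of those two corruptions applied to plain strings; the period-based sentence splitter, the span sampling, and the "<mask>" string are simplifying assumptions rather than BART's exact preprocessing (which, for instance, draws span lengths from a Poisson distribution):

```python
# Sketch of BART-style corruption: shuffle sentence order, then replace random
# spans of tokens with a single mask token. Span sampling and the "<mask>"
# string are simplifying assumptions, not BART's exact recipe.
import random

def corrupt(text, mask="<mask>", n_spans=2, max_span=3, seed=0):
    rng = random.Random(seed)
    # 1) Randomly shuffle the order of the original sentences.
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    rng.shuffle(sentences)
    tokens = (" . ".join(sentences) + " .").split()
    # 2) Text infilling: replace each sampled span with one mask token
    #    (a length-0 span simply inserts a mask).
    for _ in range(n_spans):
        start = rng.randrange(len(tokens))
        length = rng.randint(0, max_span)
        tokens[start:start + length] = [mask]
    return " ".join(tokens)

print(corrupt("The cat sat on the mat. The dog barked. The sun was warm."))
```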

#9 best model for Question Answering on SQuAD1.1 dev (F1 metric)

DENOISING · MACHINE TRANSLATION · NATURAL LANGUAGE INFERENCE · QUESTION ANSWERING · TEXT GENERATION

HuggingFace's Transformers: State-of-the-art Natural Language Processing

9 Oct 2019 · huggingface/transformers

In this paper, we present HuggingFace's Transformers library, a library for state-of-the-art NLP. It makes these developments available to the community by gathering state-of-the-art general-purpose pretrained models under a unified API, together with an ecosystem of libraries, examples, tutorials, and scripts targeting many downstream NLP tasks.
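A hedged usage sketch of that unified API for the text generation task this page covers; the "gpt2" checkpoint and the generation settings are arbitrary illustrative choices:

```python
# Sketch of text generation through the transformers pipeline API.
# The "gpt2" checkpoint and generation settings are arbitrary illustrative
# choices; any compatible causal LM checkpoint could be substituted.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
samples = generator(
    "Text generation is the task of",
    max_length=40,          # total length in tokens, including the prompt
    num_return_sequences=2,
    do_sample=True,
)
for s in samples:
    print(s["generated_text"])
```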

TEXT GENERATION · TRANSFER LEARNING

CTRL: A Conditional Transformer Language Model for Controllable Generation

Preprint 2019 · huggingface/transformers

Large-scale language models show promising text generation capabilities, but users cannot easily control particular aspects of the generated text.
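CTRL addresses this by prepending a control code (the paper uses codes such as Wikipedia, Reviews, or Horror) to the prompt so the model conditions on it. The sketch below only shows that prompt construction; the generation function is a hypothetical stand-in, not the released model:

```python
# Sketch of CTRL-style controllable generation: a control code is prepended
# to the prompt before decoding. generate_fn is a hypothetical stand-in for
# a trained language model's sampling routine, not the released CTRL model.
CONTROL_CODES = {"wiki": "Wikipedia", "reviews": "Reviews", "horror": "Horror"}

def controlled_prompt(style: str, prompt: str) -> str:
    """Prepend the chosen control code so the LM conditions on it."""
    return f"{CONTROL_CODES[style]} {prompt}"

def generate_fn(conditioned_prompt: str) -> str:
    # Placeholder: in practice this would call a trained conditional LM.
    return conditioned_prompt + " ..."

print(generate_fn(controlled_prompt("reviews", "The laptop arrived quickly and")))
```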

LANGUAGE MODELLING · TEXT GENERATION

Language Models are Unsupervised Multitask Learners

Preprint 2019 · huggingface/transformers

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets.
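The paper's alternative to task-specific supervision is to phrase a task as a text prompt and let the language model continue it, for instance inducing summarization with a trailing "TL;DR:" cue. A hedged sketch using the small public GPT-2 checkpoint in transformers, with the article text and sampling settings as placeholder assumptions:

```python
# Sketch of zero-shot task transfer via prompting: summarization is cued with
# "TL;DR:" as described in the GPT-2 paper. The small "gpt2" checkpoint,
# article text, and sampling settings are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
article = (
    "Researchers released a language model trained on a large web corpus. "
    "Without task-specific fine-tuning, it can be prompted to perform "
    "summarization, translation, and question answering."
)
out = generator(article + "\nTL;DR:", max_new_tokens=30, do_sample=True)
print(out[0]["generated_text"])
```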

SOTA for Language Modelling on Text8 (using extra training data)

COMMON SENSE REASONING · DOCUMENT SUMMARIZATION · LANGUAGE MODELLING · MACHINE TRANSLATION · QUESTION ANSWERING · READING COMPREHENSION · TEXT GENERATION

Generating Sequences With Recurrent Neural Networks

4 Aug 2013 · karpathy/char-rnn

This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time.
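A compact sketch of that one-step-at-a-time decoding loop with a character-level LSTM; the untrained network and the seed character are placeholders, so the output is noise rather than the paper's samples, but the loop structure is the point:

```python
# Sketch of character-level sampling with an LSTM, one character at a time.
# The untrained network is a placeholder, so its output is random noise.
import torch
import torch.nn as nn

chars = list("abcdefghijklmnopqrstuvwxyz ")
V = len(chars)
embed, lstm, head = nn.Embedding(V, 16), nn.LSTM(16, 32), nn.Linear(32, V)

x = torch.tensor([[chars.index("t")]])           # seed character, shape (1, 1)
state = None
generated = ["t"]
with torch.no_grad():
    for _ in range(20):
        out, state = lstm(embed(x), state)        # feed the previous character
        probs = torch.softmax(head(out[-1, 0]), dim=-1)
        nxt = torch.multinomial(probs, 1).item()  # sample the next character
        generated.append(chars[nxt])
        x = torch.tensor([[nxt]])
print("".join(generated))
```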

TEXT GENERATION

Neural Assistant: Joint Action Prediction, Response Generation, and Latent Knowledge Reasoning

31 Oct 2019 · tensorflow/tensor2tensor

In this paper, we develop Neural Assistant: a single neural network model that takes conversation history and an external knowledge source as input and jointly produces both a text response and an action to be taken by the system as output.
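A very rough sketch of that joint output structure: one shared encoding of conversation history plus knowledge, with separate heads for the text response and the system action. The mean-pooled encoder, sizes, and action set are invented placeholders, not the Neural Assistant architecture:

```python
# Rough sketch of a model with one shared input encoding and two outputs:
# text-response logits and system-action logits. Encoder, sizes, and action
# set are invented placeholders, not the paper's actual architecture.
import torch
import torch.nn as nn

class JointResponderSketch(nn.Module):
    def __init__(self, vocab=1000, n_actions=5, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.response_head = nn.Linear(dim, vocab)    # next-token logits
        self.action_head = nn.Linear(dim, n_actions)  # e.g. "book", "search", ...

    def forward(self, history_ids, knowledge_ids):
        # Encode both inputs by mean-pooling embeddings (a toy stand-in for a
        # real encoder) and fuse them by addition.
        fused = self.embed(history_ids).mean(1) + self.embed(knowledge_ids).mean(1)
        return self.response_head(fused), self.action_head(fused)

model = JointResponderSketch()
hist = torch.randint(0, 1000, (2, 12))   # placeholder conversation history
know = torch.randint(0, 1000, (2, 30))   # placeholder knowledge snippets
response_logits, action_logits = model(hist, know)
print(response_logits.shape, action_logits.shape)  # (2, 1000) (2, 5)
```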

TEXT GENERATION

fairseq: A Fast, Extensible Toolkit for Sequence Modeling

NAACL 2019 · facebookresearch/fairseq-py

fairseq is an open-source sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling, and other text generation tasks.
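A hedged usage sketch via the toolkit's torch.hub entry points; the model name and tokenizer/BPE arguments follow patterns from the fairseq README but vary across releases, so treat the specific strings as assumptions to verify:

```python
# Sketch of loading a pretrained fairseq translation model via torch.hub.
# The model name and tokenizer/bpe arguments follow the fairseq README, but
# exact strings vary by release -- verify before running (the download is large).
import torch

en2de = torch.hub.load(
    "pytorch/fairseq",
    "transformer.wmt19.en-de.single_model",
    tokenizer="moses",
    bpe="fastbpe",
)
print(en2de.translate("Machine learning is fun!", beam=5))
```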

LANGUAGE MODELLING · TEXT GENERATION

Mixture Models for Diverse Machine Translation: Tricks of the Trade

20 Feb 2019 · pytorch/fairseq

Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

MACHINE TRANSLATION · TEXT GENERATION