Conditional Text Generation

27 papers with code • 1 benchmark • 4 datasets

The task of generating text according to some pre-specified conditioning (e.g., a topic, a sentiment, or a lexical constraint).
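
A minimal illustration of such conditioning is prepending a control prefix (here, a topic) to the prompt of an off-the-shelf language model; the model choice and prompt format below are arbitrary sketches, not tied to any paper on this page.

```python
from transformers import pipeline

# Illustrative only: condition generation on a topic by prefixing the prompt.
# Any causal LM works here; gpt2 is chosen purely for availability.
generator = pipeline("text-generation", model="gpt2")

prompt = "topic: space travel\ntext:"
out = generator(prompt, max_new_tokens=40, do_sample=True)
print(out[0]["generated_text"])
```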

Most implemented papers

Pragmatically Informative Text Generation

sIncerass/prag_generation NAACL 2019

We improve the informativeness of models for conditional text generation using techniques from computational pragmatics.
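
A common recipe from computational pragmatics is to rerank candidate outputs by how well a listener model could recover the input from them (RSA-style reasoning). The sketch below uses invented log-probabilities and a hypothetical interpolation weight; it illustrates the general idea, not the paper's exact formulation.

```python
# RSA-style pragmatic reranking sketch with invented log-probabilities.
speaker = {                          # log P_S0(text | input): base fluency
    "a bird": -1.0,                  # generic but likely
    "a red bird on a branch": -2.5,  # specific but less likely a priori
}
listener = {                         # log P_L(input | text): recoverability
    "a bird": -2.0,
    "a red bird on a branch": -0.3,
}
LAM = 0.6  # weight on informativeness (hypothetical value)

def pragmatic_score(text):
    return (1 - LAM) * speaker[text] + LAM * listener[text]

best = max(speaker, key=pragmatic_score)
print(best)  # "a red bird on a branch": the informative candidate wins
```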

Unifying Vision-and-Language Tasks via Text Generation

j-min/VL-T5 4 Feb 2021

On 7 popular vision-and-language benchmarks, including visual question answering, referring expression comprehension, and visual commonsense reasoning, most of which have previously been modeled as discriminative tasks, our generative approach (with a single unified architecture) reaches performance comparable to that of recent task-specific state-of-the-art vision-and-language models.
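
To make the "single unified architecture" concrete: each task is cast as text-to-text by formatting its input with a task prefix and its label as a target string. The templates below are illustrative assumptions, not VL-T5's verbatim prompts (the real model also consumes visual region features).

```python
# Hypothetical task-as-text formatting in the spirit of VL-T5; the actual
# model pairs such strings with region features from an object detector.
def to_text2text(task: str, **f) -> tuple[str, str]:
    if task == "vqa":
        return f"vqa: {f['question']}", f["answer"]
    if task == "grounding":  # referring expression comprehension
        return f"visual grounding: {f['expression']}", f["region_token"]
    raise ValueError(f"unknown task: {task}")

src, tgt = to_text2text("vqa", question="What color is the cat?", answer="black")
print(src, "->", tgt)  # vqa: What color is the cat? -> black
```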

Extract, Denoise and Enforce: Evaluating and Improving Concept Preservation for Text-to-Text Generation

morningmoni/ede EMNLP 2021

In this paper, we present a systematic analysis of whether current seq2seq models, especially pre-trained language models, are good enough at preserving important input concepts, and of the extent to which explicitly guiding generation with those concepts as lexical constraints is beneficial.
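
One off-the-shelf way to enforce such lexical constraints at decoding time is HuggingFace's constrained beam search via force_words_ids; this is a generic mechanism shown for illustration, not necessarily the paper's own enforcement method.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

text = "summarize: The quick brown fox jumped over the lazy dog near the river."
inputs = tok(text, return_tensors="pt")

# Force the concept "fox" to appear in the output; constrained beam search
# requires num_beams > 1.
force_words_ids = tok(["fox"], add_special_tokens=False).input_ids
out = model.generate(**inputs, force_words_ids=force_words_ids,
                     num_beams=4, max_new_tokens=30)
print(tok.decode(out[0], skip_special_tokens=True))
```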

BanglaNLG and BanglaT5: Benchmarks and Resources for Evaluating Low-Resource Natural Language Generation in Bangla

csebuetnlp/banglanlg 23 May 2022

This work presents BanglaNLG, a comprehensive benchmark for evaluating natural language generation (NLG) models in Bangla, a widely spoken yet low-resource language.

The Dialog Must Go On: Improving Visual Dialog via Generative Self-Training

gicheonkang/gst-visdial CVPR 2023

As a result, GST scales the amount of training data by an order of magnitude over VisDial (from 1.2M to 12.9M QA pairs).

GENIUS: Sketch-based Language Model Pre-training via Extreme and Selective Masking for Text Generation and Augmentation

beyondguo/genius 18 Nov 2022

We introduce GENIUS: a conditional text generation model using sketches as input, which can fill in the missing contexts for a given sketch (key information consisting of textual spans, phrases, or words, concatenated by mask tokens).
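
Usage reduces to a plain fill-in-the-sketch call, assuming the released checkpoint (the repo points to a Hub model such as beyondguo/genius-large, a BART-based model whose mask token is <mask>) works with the standard text2text-generation pipeline; check the repo for the exact model id.

```python
from transformers import pipeline

# Assumed model id from the repo's README; verify before use.
genius = pipeline("text2text-generation", model="beyondguo/genius-large")

# A sketch: key spans concatenated by mask tokens for the model to fill in.
sketch = "<mask> machine learning <mask> deployed in production <mask> monitoring <mask>"
print(genius(sketch, do_sample=True, max_length=100)[0]["generated_text"])
```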

Generating Text through Adversarial Training using Skip-Thought Vectors

afrozas/skip-thought-gan NAACL 2019

Attempts have been made to utilize GANs with word embeddings for text generation.

Encoder-Agnostic Adaptation for Conditional Language Generation

harvardnlp/encoder-agnostic-adaptation 19 Aug 2019

Large pretrained language models have changed the way researchers approach discriminative natural language understanding tasks, leading to the dominance of approaches that adapt a pretrained model for arbitrary downstream tasks.

Pre-train and Plug-in: Flexible Conditional Text Generation with Variational Auto-Encoders

WHUIR/PPVAE ACL 2020

Conditional Text Generation has drawn much attention as a topic within Natural Language Generation (NLG), as it offers humans a way to control the properties of the generated content.

ToTTo: A Controlled Table-To-Text Generation Dataset

google-research-datasets/ToTTo EMNLP 2020

We present ToTTo, an open-domain English table-to-text dataset with over 120,000 training examples that proposes a controlled generation task: given a Wikipedia table and a set of highlighted table cells, produce a one-sentence description.
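
The released data is JSONL; the field names below follow the dataset's documented format, though this particular record is abridged and the linearization scheme is just one common choice, not prescribed by the dataset.

```python
# Abridged ToTTo-style record; cell dicts in the real data also carry
# row_span/column_span fields. The record content is for illustration.
example = {
    "table_page_title": "List of Governors of South Carolina",
    "table_section_title": "Governors under the Constitution of 1868",
    "table": [
        [{"value": "#", "is_header": True},
         {"value": "Governor", "is_header": True},
         {"value": "Took office", "is_header": True}],
        [{"value": "76", "is_header": False},
         {"value": "Daniel Henry Chamberlain", "is_header": False},
         {"value": "December 1, 1874", "is_header": False}],
    ],
    "highlighted_cells": [[1, 1], [1, 2]],  # (row, col) indices
}

def linearize(ex):
    """Flatten titles plus highlighted cells into a seq2seq source string."""
    cells = [ex["table"][r][c]["value"] for r, c in ex["highlighted_cells"]]
    return " | ".join([ex["table_page_title"],
                       ex["table_section_title"],
                       " ; ".join(cells)])

print(linearize(example))
# The reference target is a one-sentence description of the highlighted cells.
```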