A Well-Composed Text is Half Done! Semantic Composition Sampling for Diverse Conditional Generation

ACL ARR November 2021  ·  Anonymous

We propose Composition Sampling, a simple but effective method for generating higher-quality diverse outputs in conditional generation tasks than previous stochastic decoding strategies. It builds on recently proposed planning-based neural generation models (FROST; Narayan et al., 2021) that are trained to first create a composition of the output, in the form of an entity chain, and then continue generating conditioned on the entity chain and the input. Our approach avoids text degeneration by first sampling a composition (an entity chain) and then using beam search to generate the best possible text grounded in that composition. Experiments on CNN/DailyMail and XSum, using a range of automatic metrics and human evaluation, demonstrate that Composition Sampling is currently the best available decoding strategy for generating diverse, meaningful summaries. It further outperforms state-of-the-art approaches for question generation in terms of BLEU.
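
The two-step decoding recipe is simple enough to sketch. Below is a minimal, hypothetical implementation using Hugging Face Transformers: it assumes a FROST-style seq2seq checkpoint whose targets look like "[ENTITYCHAIN] e1 | e2 ... [SUMMARY] text". The checkpoint name, the plan/summary markers, and the sampling hyperparameters are illustrative assumptions, not details from this page.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical FROST-style checkpoint (placeholder name, not a real model id)
# trained to emit "[ENTITYCHAIN] e1 | e2 ... [SUMMARY] text" as its target.
MODEL_NAME = "your-frost-style-checkpoint"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def composition_sample(document, top_p=0.95, num_beams=4):
    inputs = tokenizer(document, return_tensors="pt", truncation=True)

    # Step 1: stochastically sample a composition (the entity-chain plan),
    # e.g. with nucleus sampling, so that output diversity comes from the plan.
    sampled = model.generate(**inputs, do_sample=True, top_p=top_p,
                             max_new_tokens=64)
    sampled_text = tokenizer.decode(sampled[0], skip_special_tokens=True)
    plan = sampled_text.split("[SUMMARY]")[0].strip()  # keep only the plan

    # Step 2: decode the final text with beam search, forcing the decoder to
    # start from the sampled plan so the output stays grounded in it.
    prefix_ids = tokenizer(plan + " [SUMMARY]",
                           add_special_tokens=False).input_ids
    decoder_input_ids = torch.tensor(
        [[model.config.decoder_start_token_id] + prefix_ids])
    out = model.generate(**inputs, decoder_input_ids=decoder_input_ids,
                         num_beams=num_beams, do_sample=False,
                         max_new_tokens=128)
    text = tokenizer.decode(out[0], skip_special_tokens=True)
    return text.split("[SUMMARY]", 1)[-1].strip()  # return the summary only
```

Note that all diversity comes from step 1: given a fixed plan, beam search in step 2 is deterministic, which is what keeps the sampled outputs from degenerating.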


Datasets

CNN/DailyMail  ·  XSum

