Dialogue Generation

229 papers with code • 13 benchmarks • 31 datasets

Dialogue generation is the task of producing a natural language response to a natural language input, a core problem within natural language processing. Such systems are usually intended for conversing with humans, for instance in back-and-forth dialogue with a conversational agent such as a chatbot. Example benchmarks for this task (see also Natural Language Understanding) include FusedChat and the Ubuntu Dialogue Corpus (UDC). Models can be evaluated with automatic metrics such as BLEU, ROUGE, and METEOR, although these correlate only weakly with human judgement; newer metrics such as UnSupervised and Reference-free (USR) and the Metric for automatic Unreferenced dialog evaluation (MaUde) aim to address this weakness.
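
As a minimal illustration of the reference-based scoring mentioned above, the sketch below computes sentence-level BLEU between a generated response and a reference response using NLTK. The example responses and the choice of smoothing are illustrative assumptions, not part of any of the benchmarks listed here.

    from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

    # Illustrative (made-up) reference and generated responses, tokenized by whitespace.
    reference = "i am doing well thanks for asking".split()
    hypothesis = "i am doing fine thank you".split()

    # Smoothing avoids zero scores when higher-order n-grams do not match,
    # which is common for short dialogue responses.
    smooth = SmoothingFunction().method1
    score = sentence_bleu([reference], hypothesis, smoothing_function=smooth)
    print(f"BLEU: {score:.3f}")

Metrics like ROUGE and METEOR are applied in the same response-versus-reference fashion, which is precisely why they struggle on open-domain dialogue, where many very different responses can be equally valid.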

Libraries

Use these libraries to find Dialogue Generation models and implementations

Most implemented papers

ERNIE-GEN: An Enhanced Multi-Flow Pre-training and Fine-tuning Framework for Natural Language Generation

PaddlePaddle/ERNIE 26 Jan 2020

Current pre-training works in natural language generation pay little attention to the problem of exposure bias on downstream tasks.

PanGu-$\alpha$: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation

mindspore-ai/models 26 Apr 2021

To enhance the generalization ability of PanGu-$\alpha$, we collect 1.1TB of high-quality Chinese data from a wide range of domains to pretrain the model.

Relevance of Unsupervised Metrics in Task-Oriented Dialogue for Evaluating Natural Language Generation

Maluuba/nlg-eval ICLR 2018

However, previous work in dialogue response generation has shown that these metrics do not correlate strongly with human judgment in the non-task-oriented dialogue setting.
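
To make the correlation claim concrete, the sketch below shows how agreement between an automatic metric and human judgement is commonly quantified, using Pearson and Spearman correlation from SciPy. The per-response metric scores and human ratings are made-up illustrative numbers, not results from the paper.

    from scipy.stats import pearsonr, spearmanr

    # Hypothetical per-response scores: an automatic metric (e.g. BLEU)
    # and human quality ratings for the same generated responses.
    metric_scores = [0.12, 0.05, 0.31, 0.08, 0.22, 0.02]
    human_ratings = [4.0, 3.5, 2.0, 4.5, 3.0, 1.5]

    # Correlation between the metric and human judgement; values near zero
    # indicate the weak correlation reported for open-domain dialogue.
    r, _ = pearsonr(metric_scores, human_ratings)
    rho, _ = spearmanr(metric_scores, human_ratings)
    print("Pearson r: %.3f  Spearman rho: %.3f" % (r, rho))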

End-to-end Adversarial Learning for Generative Conversational Agents

oswaldoludwig/Seq2seq-Chatbot-for-Keras 28 Nov 2017

This paper presents a new adversarial learning method for generative conversational agents (GCA), as well as a new GCA model.

DP-GAN: Diversity-Promoting Generative Adversarial Network for Generating Informative and Diversified Text

lancopku/DPGAN 5 Feb 2018

Existing text generation methods tend to produce repeated and "boring" expressions.

Personalized Dialogue Generation with Diversified Traits

songhaoyu/BoB 28 Jan 2019

In this paper, we investigate the problem of incorporating explicit personality traits in dialogue generation to deliver personalized dialogues.

Rethinking Action Spaces for Reinforcement Learning in End-to-end Dialog Agents with Latent Variable Models

snakeztc/NeuralDialog-LaRL NAACL 2019

Defining action spaces for conversational agents and optimizing their decision-making process with reinforcement learning is an enduring challenge.

Text Generation from Knowledge Graphs with Graph Transformers

rikdz/GraphWriter NAACL 2019

Generating texts which express complex ideas spanning multiple sentences requires a structured representation of their content (document plan), but these representations are prohibitively expensive to manually produce.

PLATO: Pre-trained Dialogue Generation Model with Discrete Latent Variable

PaddlePaddle/Research ACL 2020

Pre-training models have been proved effective for a wide range of natural language processing tasks.

PLATO-XL: Exploring the Large-scale Pre-training of Dialogue Generation

PaddlePaddle/Knover 20 Sep 2021

To explore the limit of dialogue generation pre-training, we present the models of PLATO-XL with up to 11 billion parameters, trained on both Chinese and English social media conversations.