Dynamic Knowledge Graph-based Dialogue Generation with Improved Adversarial Meta-Learning

19 Apr 2020 · Hongcai Xu, Junpeng Bao, Gaojie Zhang

Knowledge graph-based dialogue systems can generate more informative responses and support sophisticated reasoning mechanisms. However, existing models neither account for the sparseness and incompleteness of a knowledge graph (KG) nor apply to dynamic KGs. This paper proposes KDAD, a dynamic knowledge graph-based dialogue generation method with improved adversarial meta-learning. KDAD formulates dynamic knowledge triples as an adversarial-attack problem and incorporates the objective of quickly adapting to dynamic knowledge-aware dialogue generation. We train a KG-based dialogue model with improved adversarial meta-learning (ADML) using minimal training samples. The model learns an initialization of its parameters that adapts to previously unseen knowledge, so training can be completed quickly from only a few knowledge triples. We show that our model significantly outperforms other baselines and demonstrate that it adapts quickly and well to dynamic knowledge graph-based dialogue generation.
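The abstract does not spell out the training procedure, but the two ingredients it names (an adversarial treatment of knowledge triples and meta-learning for fast adaptation) can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy, not the paper's implementation: `KnowledgeDialogueModel`, `adversarial_triple`, and `meta_train_step` are hypothetical names, an FGSM-style perturbation stands in for the adversarial formulation of dynamic triples, and a MAML-style inner/outer loop stands in for the "improved ADML" objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class KnowledgeDialogueModel(nn.Module):
    """Toy stand-in for a KG-grounded dialogue model: scores a response token
    from a dialogue-context vector and a knowledge-triple embedding."""

    def __init__(self, dim=32, vocab=100):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)
        self.out = nn.Linear(dim, vocab)

    def forward(self, context, triple_emb):
        h = torch.tanh(self.proj(torch.cat([context, triple_emb], dim=-1)))
        return self.out(h)  # logits over a toy response vocabulary


def adversarial_triple(model, context, triple_emb, target, eps=0.05):
    """FGSM-style perturbation of the triple embedding; a rough proxy for
    treating dynamic knowledge triples as an adversarial attack."""
    triple_emb = triple_emb.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(context, triple_emb), target)
    grad, = torch.autograd.grad(loss, triple_emb)
    return (triple_emb + eps * grad.sign()).detach()


def meta_train_step(model, meta_opt, tasks, inner_lr=0.01):
    """One MAML-style outer step: adapt on each task's support set (with
    perturbed triples), evaluate the adapted weights on its query set, and
    update the shared initialization from the summed query losses."""
    meta_loss = 0.0
    for (ctx_s, trip_s, tgt_s), (ctx_q, trip_q, tgt_q) in tasks:
        trip_s = adversarial_triple(model, ctx_s, trip_s, tgt_s)

        # Inner loop: one gradient step on the support set.
        loss_s = F.cross_entropy(model(ctx_s, trip_s), tgt_s)
        names, params = zip(*model.named_parameters())
        grads = torch.autograd.grad(loss_s, params, create_graph=True)
        fast_weights = {n: p - inner_lr * g for n, p, g in zip(names, params, grads)}

        # Outer objective: query-set loss of the adapted (functional) weights.
        logits_q = torch.func.functional_call(model, fast_weights, (ctx_q, trip_q))
        meta_loss = meta_loss + F.cross_entropy(logits_q, tgt_q)

    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
    return meta_loss.item()
```

Under these assumptions, each "task" would correspond to a small set of dialogue turns grounded in a handful of (possibly newly added) knowledge triples, so the learned initialization can be fine-tuned on a few triples when the KG changes.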
