Knowledge Graph is in Rescue: Task Oriented Dialogue System for Response Generation without NLU and DM

ACL ARR November 2021 · Anonymous

Natural language understanding (NLU) and dialogue management (DM) are the standard prerequisites for response generation in a task-oriented dialogue system. In the existing literature, NLU and DM have been tackled as two independent tasks, each requiring separate labeled data. Besides this additional data requirement, NLU and DM also introduce errors into the pipelined processing of dialogue. Direct generation of responses from user utterances, without any intermediate NLU and DM modules, is therefore a worthy goal. To accomplish this, the model should be able to (implicitly) understand the intent of the user, fetch the appropriate data from the knowledge base, (implicitly) decide on an action by looking at the fetched data and the conversation history, and finally generate the response. In this work, we build an end-to-end dialogue generation system that requires neither NLU and DM components nor their associated labels in the data. We propose an effective technique based on pre-trained GPT-2 and a Graph Convolution Network (GCN) that takes the conversation history and a knowledge graph as input and produces appropriate responses. Experiments on three benchmark datasets, viz. MultiWOZ, InCar Assist, and CamRest, show that our proposed model achieves performance comparable to state-of-the-art systems without using NLU and DM labels. Human evaluation of the outputs also confirms that our proposed model generates highly fluent and contextually relevant responses.
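The abstract describes encoding the knowledge graph with a GCN before response generation. As a rough illustration, the sketch below implements a single GCN propagation layer over a toy knowledge graph; the entity layout, dimensions, and the eventual fusion with GPT-2 are illustrative assumptions, not the authors' exact architecture.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])             # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt   # symmetric normalization
    return np.maximum(0.0, A_norm @ H @ W)     # ReLU non-linearity

# Toy knowledge graph: 4 entities (e.g. a restaurant linked to its
# area, food type, and phone number) -- a hypothetical example.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
H = rng.standard_normal((4, 8))   # initial node embeddings
W = rng.standard_normal((8, 8))   # learnable weight matrix

node_states = gcn_layer(A, H, W)
print(node_states.shape)          # one vector per KG entity
```

In the full system, such entity vectors would be attended to (or concatenated with) the GPT-2 representation of the conversation history before decoding the response.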


Datasets

MultiWOZ · InCar Assist · CamRest


Methods

GPT-2 · Graph Convolution Network (GCN)