We present a new neural architecture for wide-coverage Natural Language Understanding in Spoken Dialogue Systems.
Plato has been designed to be easy to understand and debug, and is agnostic to the underlying learning frameworks that train each component.
Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance, but they use independent attention weights for the two tasks.
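To make the idea of independent attention weights concrete, here is a minimal numpy sketch (not the paper's actual model): an intent head attends once over the encoder hidden states with its own scoring vector, while the slot head reuses a separate scoring vector at every tagging position. All sizes, weight matrices, and variable names are hypothetical toy choices.

```python
import numpy as np

rng = np.random.default_rng(0)

T, d, n_intents, n_slots = 5, 8, 3, 4   # toy sizes (hypothetical)
H = rng.normal(size=(T, d))             # stand-in for encoder hidden states, one per token

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Independent attention weights: one scoring vector for the intent head,
# a separate one for the slot head.
v_intent = rng.normal(size=d)
v_slot = rng.normal(size=d)

# Intent detection: attend once over all hidden states, classify the context vector.
a_intent = softmax(H @ v_intent)        # (T,) attention distribution over tokens
c_intent = a_intent @ H                 # (d,) context vector
W_intent = rng.normal(size=(d, n_intents))
intent_logits = c_intent @ W_intent     # (n_intents,)

# Slot filling: at each position, attend with the slot head's own weights,
# then tag using the local hidden state plus the attended context.
W_slot = rng.normal(size=(2 * d, n_slots))
slot_logits = []
for t in range(T):
    a_t = softmax(H @ v_slot)           # slot attention, independent of the intent head
    c_t = a_t @ H
    slot_logits.append(np.concatenate([H[t], c_t]) @ W_slot)
slot_logits = np.asarray(slot_logits)   # (T, n_slots), one tag distribution per token

print(intent_logits.shape, slot_logits.shape)
```

In a trained model the hidden states would come from a bidirectional RNN and the weights would be learned jointly; the sketch only illustrates how the two heads score attention separately over the same encoder output.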
This paper summarises the experimental setup and results of the first shared task on end-to-end (E2E) natural language generation (NLG) in spoken dialogue systems.
We present a novel natural language generation system for spoken dialogue systems capable of entraining (adapting) to users' way of speaking, providing contextually appropriate responses.
In this paper, we describe our methodology for creating the query reformulation extension to the dialog corpus, and present an initial set of experiments to establish a baseline for the CQR task.
Natural language generation (NLG) is a critical component of spoken dialogue systems, and it has a significant impact on both usability and perceived quality.
Natural language generation (NLG) is a critical component in spoken dialogue systems.