Multi-domain Dialogue State Tracking
29 papers with code • 6 benchmarks • 2 datasets
Most implemented papers
Scalable Multi-Domain Dialogue State Tracking
We introduce a novel framework for state tracking that is independent of the slot-value set and represents the dialogue state as a distribution over a set of values of interest (a candidate set) derived from the dialogue history or knowledge.
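The candidate-set idea can be sketched in a few lines. This is a toy illustration (all function names and the frequency-based scoring are invented here, not the paper's method): candidates are harvested from the dialogue history, and the state is a probability distribution over them rather than over a fixed ontology.

```python
from collections import Counter

def extract_candidates(dialogue_history):
    """Collect candidate slot values mentioned anywhere in the history.
    (Naive token-level extraction, for illustration only.)"""
    candidates = set()
    for utterance in dialogue_history:
        candidates.update(utterance.lower().split())
    return candidates

def score_candidates(candidates, last_utterance):
    """Turn the candidate set into a distribution; here a toy
    frequency score over the most recent utterance."""
    counts = Counter(w for w in last_utterance.lower().split()
                     if w in candidates)
    total = sum(counts.values()) or 1
    return {c: counts.get(c, 0) / total for c in candidates}

history = ["I want a cheap italian restaurant", "cheap italian, got it"]
state = score_candidates(extract_candidates(history), history[-1])
# `state` is a distribution over history-derived candidates,
# so no fixed slot-value list is ever enumerated.
```

Because the candidate set is rebuilt from the conversation itself, the tracker scales to slots whose value sets are open-ended or unknown in advance.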
Toward Scalable Neural Dialogue State Tracking Model
Despite their high accuracy, current neural dialogue state tracking models suffer from latency that prohibits their efficient deployment in production systems.
Scalable and Accurate Dialogue State Tracking via Hierarchical Sequence Generation
Experiments on both multi-domain and single-domain dialogue state tracking datasets show that our model not only scales easily with an increasing number of pre-defined domains and slots but also reaches state-of-the-art performance.
Find or Classify? Dual Strategy for Slot-Value Predictions on Multi-Domain Dialog State Tracking
Dialog state tracking (DST) is a core component in task-oriented dialog systems.
Non-Autoregressive Dialog State Tracking
Recent efforts in Dialogue State Tracking (DST) for task-oriented dialogues have progressed toward open-vocabulary or generation-based approaches where the models can generate slot value candidates from the dialogue history itself.
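A generation-based (open-vocabulary) tracker produces slot values from the dialogue text itself instead of classifying over a closed value list. The following is a minimal copy-from-history sketch with an invented heuristic (value assumed to follow a trigger keyword), not any specific paper's decoder:

```python
def generate_value_from_history(history, trigger_keywords):
    """Open-vocabulary DST sketch: copy a value span out of the
    dialogue history rather than picking from a fixed ontology.
    Toy heuristic: the value is the token right after a trigger word."""
    tokens = " ".join(history).lower().split()
    for i, tok in enumerate(tokens):
        if tok in trigger_keywords and i + 1 < len(tokens):
            return tokens[i + 1]
    return "none"

history = ["book a table at prezzo for tuesday"]
value = generate_value_from_history(history, {"at"})
# Extracts "prezzo" even though it never appeared in any slot-value list.
```

Real systems replace the keyword heuristic with a learned decoder or pointer network, but the key property is the same: the output vocabulary is the dialogue history, so unseen values remain reachable.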
Zero-Shot Transfer Learning with Synthesized Data for Multi-Domain Dialogue State Tracking
We show that data augmentation through synthesized data can improve the accuracy of zero-shot learning for both the TRADE model and the BERT-based SUMBT model on the MultiWOZ 2.1 dataset.
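Template filling is one simple way to synthesize training data for a domain with no annotated dialogues. The sketch below (template, slot inventory, and function name are all hypothetical) pairs each filled utterance with its gold dialogue state:

```python
import itertools

TEMPLATE = "i am looking for a {price} {food} restaurant"
SLOTS = {"price": ["cheap", "expensive"], "food": ["italian", "thai"]}

def synthesize(template, slots):
    """Generate synthetic (utterance, state) pairs by filling a
    template with every combination of slot values."""
    keys = list(slots)
    pairs = []
    for values in itertools.product(*(slots[k] for k in keys)):
        state = dict(zip(keys, values))
        pairs.append((template.format(**state), state))
    return pairs

data = synthesize(TEMPLATE, SLOTS)
# Four synthetic training examples, each labeled with the exact
# dialogue state used to produce it.
```

Because the state labels are generated alongside the text, the synthetic corpus is annotated for free, which is what makes this attractive for zero-shot transfer to new domains.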
A Simple Language Model for Task-Oriented Dialogue
Task-oriented dialogue is often decomposed into three tasks: understanding user input, deciding actions, and generating a response.
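The three-way decomposition mentioned above (understand input, decide an action, generate a response) can be made concrete with a minimal pipeline. All names and the keyword-matching logic here are invented for illustration; real systems learn each stage:

```python
def understand(user_input, state):
    """NLU stage: update the belief state from the user turn
    (toy keyword matcher standing in for a learned tracker)."""
    text = user_input.lower()
    if "italian" in text:
        state["food"] = "italian"
    if "cheap" in text:
        state["price"] = "cheap"
    return state

def decide(state):
    """Policy stage: choose a system action given the belief state."""
    if "food" in state and "price" in state:
        return ("recommend", state)
    return ("request", "missing slot")

def generate(action):
    """NLG stage: render the chosen action as a response string."""
    kind, payload = action
    if kind == "recommend":
        return f"How about a {payload['price']} {payload['food']} place?"
    return "Which cuisine and price range would you like?"

state = understand("A cheap Italian restaurant, please", {})
reply = generate(decide(state))
```

Models like the one described in this paper collapse these three stages into a single sequence-to-sequence formulation, but the pipeline view remains the standard reference point.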
Parallel Interactive Networks for Multi-Domain Dialogue State Generation
In this study, we argue that the incorporation of these dependencies is crucial for the design of MDST and propose Parallel Interactive Networks (PIN) to model these dependencies.
MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems
In this paper, we propose Minimalist Transfer Learning (MinTL) to simplify the system design process of task-oriented dialogue systems and alleviate the over-dependency on annotated data.
DialoGLUE: A Natural Language Understanding Benchmark for Task-Oriented Dialogue
A long-standing goal of task-oriented dialogue research is the ability to flexibly adapt dialogue models to new domains.