Preview, Attend and Review: Schema-Aware Curriculum Learning for Multi-Domain Dialog State Tracking

1 Jun 2021  ·  Yinpei Dai, Hangyu Li, Yongbin Li, Jian Sun, Fei Huang, Luo Si, Xiaodan Zhu

Existing dialog state tracking (DST) models are trained on dialog data in random order, neglecting the rich structural information in a dataset. In this paper, we propose to use curriculum learning (CL) to better leverage both the curriculum structure and the schema structure of task-oriented dialogs. Specifically, we propose a model-agnostic framework called Schema-aware Curriculum Learning for Dialog State Tracking (SaCLog), which consists of a preview module that pre-trains a DST model with schema information, a curriculum module that optimizes the model with CL, and a review module that augments mispredicted data to reinforce the CL training. We show that our proposed approach improves DST performance over both transformer-based and RNN-based DST models (TripPy and TRADE) and achieves new state-of-the-art results on WOZ2.0 and MultiWOZ2.1.
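The curriculum-plus-review idea in the abstract (train on progressively harder data, then re-queue mispredicted examples) can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: `curriculum_train`, `model_step`, and the slot-count difficulty proxy are all hypothetical names and simplifications (SaCLog uses a model-based difficulty estimator and schema pre-training, which are omitted here).

```python
import random


def difficulty(example):
    # Hypothetical difficulty proxy: number of slots in the turn.
    # SaCLog estimates difficulty with a trained model; this stands in for it.
    return len(example["slots"])


def curriculum_train(model_step, data, num_stages=3, review=True):
    """Train over progressively larger easy-to-hard subsets.

    model_step(example) trains on one example and returns True if the
    model predicted it correctly. Mispredicted examples ("review" data)
    are added back into the next stage's training pool.
    """
    ordered = sorted(data, key=difficulty)          # easy -> hard curriculum
    stage_size = -(-len(ordered) // num_stages)    # ceiling division
    seen, mistakes = [], []
    for stage in range(num_stages):
        # Unlock the next slice of harder examples.
        seen.extend(ordered[stage * stage_size:(stage + 1) * stage_size])
        # Augment the pool with previously mispredicted examples.
        pool = seen + (mistakes if review else [])
        random.shuffle(pool)
        mistakes = [ex for ex in pool if not model_step(ex)]
    return mistakes  # examples still mispredicted after the last stage
```

A trivial usage: with a perfect `model_step` (always returns True), no review examples remain, so `curriculum_train(lambda ex: True, data)` returns an empty list.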


Results from the Paper


 Ranked #1 on Multi-domain Dialogue State Tracking on MULTIWOZ 2.1 (using extra training data)

Task: Multi-domain Dialogue State Tracking
Dataset: MULTIWOZ 2.1
Model: TripPy+SaCLog
Metric: Joint Acc = 60.61
Global Rank: #1
Uses Extra Training Data: Yes
