Slot-Gated Modeling for Joint Slot Filling and Intent Prediction

Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance, but they learn independent attention weights for the two tasks. Since slots and intents are strongly related, this paper proposes a slot gate that focuses on learning the relationship between the intent and slot attention vectors in order to obtain better semantic frame results through global optimization. Experiments show that the proposed model significantly improves sentence-level semantic frame accuracy, with 4.2% and 1.9% relative improvement over the attentional model on the benchmark ATIS and Snips datasets, respectively.
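As a concrete illustration of the mechanism described above, the sketch below computes a scalar slot gate g = Σ v · tanh(c_i^S + W · c^I) from the slot and intent attention vectors and uses it to reweight the slot context before slot tagging. This is a minimal PyTorch sketch under assumed shapes, not the authors' released code; the names (`SlotGate`, `slot_ctx`, `intent_ctx`, `dim`) and the random placeholder tensors are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SlotGate(nn.Module):
    """Gate fusing slot and intent attention vectors (illustrative sketch)."""
    def __init__(self, dim: int):
        super().__init__()
        self.w = nn.Linear(dim, dim, bias=False)  # W: projects intent context c^I
        self.v = nn.Parameter(torch.randn(dim))   # v: trainable gate vector

    def forward(self, slot_ctx: torch.Tensor, intent_ctx: torch.Tensor) -> torch.Tensor:
        # slot_ctx:   (batch, seq_len, dim) -- one slot attention vector per token
        # intent_ctx: (batch, dim)          -- single intent attention vector
        # g_i = sum_k v_k * tanh(c_i^S + W c^I)_k, one scalar gate per time step
        fused = torch.tanh(slot_ctx + self.w(intent_ctx).unsqueeze(1))
        return torch.sum(self.v * fused, dim=-1, keepdim=True)  # (batch, seq_len, 1)

# Usage sketch: reweight the slot context by the gate before slot label prediction.
# All input tensors below are random placeholders standing in for the encoder outputs.
batch, seq_len, dim, n_slot_labels = 2, 10, 64, 72
gate = SlotGate(dim)
slot_ctx = torch.randn(batch, seq_len, dim)   # slot attention vectors (assumed given)
intent_ctx = torch.randn(batch, dim)          # intent attention vector (assumed given)
hidden = torch.randn(batch, seq_len, dim)     # BLSTM hidden states h_i (assumed given)
g = gate(slot_ctx, intent_ctx)
slot_logits = nn.Linear(dim, n_slot_labels)(hidden + g * slot_ctx)
```

The gate lets the intent context modulate how much each slot attention vector contributes to tagging, which is how the model ties the two tasks together for global optimization.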


Datasets

ATIS, SNIPS

Results from the Paper


Task              Dataset  Model                            Metric Name      Metric Value  Global Rank
Intent Detection  SNIPS    Slot-Gated BLSTM with Attention  Intent Accuracy  97.00         # 9
Intent Detection  SNIPS    Slot-Gated BLSTM with Attention  Slot F1 Score    88.80         # 9

Results from Other Papers


Task              Dataset  Model                            Metric Name  Metric Value  Rank
Intent Detection  ATIS     Slot-Gated BLSTM with Attention  Accuracy     94.10         # 15
Intent Detection  ATIS     Slot-Gated BLSTM with Attention  F1           95.20         # 8

Methods


BLSTM, Attention, Slot Gate (proposed)