Spoken Language Understanding

118 papers with code • 5 benchmarks • 14 datasets

Spoken Language Understanding (SLU) is the task of extracting meaning from spoken utterances, typically by identifying the speaker's intent, filling slots, tagging dialogue acts, and tracking dialogue state, either in a pipeline after speech recognition or end-to-end from audio.

Most implemented papers

Dynamic Time-Aware Attention to Speaker Roles and Contexts for Spoken Language Understanding

MiuLab/Time-SLU 30 Sep 2017

Previous contextual models attended only to the content of history utterances, without considering their temporal information or the speakers' roles.
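
As an illustration of the idea, here is a minimal sketch of time-aware attention over dialogue history: content scores are penalized by the temporal distance of each turn, so recent utterances receive more weight. This is a toy dot-product variant with an assumed exponential-style decay hyperparameter, not the paper's exact model.

```python
import numpy as np

def time_aware_attention(history, query, times, query_time, decay=0.5):
    """Attend over encoded history utterances, down-weighting older turns.

    history:    (n, d) encodings of the n history utterances (hypothetical)
    query:      (d,) encoding of the current utterance
    times:      (n,) turn indices of the history utterances
    query_time: turn index of the current utterance
    decay:      strength of the temporal penalty (assumed hyperparameter)
    """
    content = history @ query                    # content-based scores
    age = query_time - np.asarray(times, float)  # temporal distance per turn
    scores = content - decay * age               # older turns score lower
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over history turns
    return weights @ history                     # attended context vector

# toy example: 3 history turns with 4-dim one-hot "encodings"
hist = np.eye(4)[:3]
ctx = time_aware_attention(hist, np.array([0.0, 1.0, 0.0, 0.0]), [0, 1, 2], 3)
```

In this toy run the second turn both matches the query content and is relatively recent, so it dominates the context vector.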

Towards end-to-end spoken language understanding

dmitriy-serdyuk/arxiv2kindle 23 Feb 2018

A spoken language understanding system is traditionally designed as a pipeline of components.
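
The traditional architecture can be sketched as two chained stages, where recognition errors propagate into interpretation; end-to-end SLU instead maps audio directly to the semantic frame. The stage implementations below are hypothetical stubs for illustration only.

```python
def asr(audio: bytes) -> str:
    """Speech recognizer stage: audio -> transcript (stub)."""
    return "book a flight to boston"

def nlu(transcript: str) -> dict:
    """NLU stage: transcript -> intent + slots (stub keyword rules)."""
    intent = "book_flight" if "flight" in transcript else "unknown"
    slots = {"destination": transcript.split("to ")[-1]} if "to " in transcript else {}
    return {"intent": intent, "slots": slots}

def pipeline_slu(audio: bytes) -> dict:
    """Pipeline SLU: any asr() error is inherited by nlu()."""
    return nlu(asr(audio))

result = pipeline_slu(b"...")  # placeholder audio bytes
```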

ISO-Standard Domain-Independent Dialogue Act Tagging for Conversational Agents

ColingPaper2018/DialogueAct-Tagger COLING 2018

Dialogue Act (DA) tagging is crucial for spoken language understanding systems, as it provides a general representation of speakers' intents, not bound to a particular dialogue system.

Discourse-Wizard: Discovering Deep Discourse Structure in your Conversation with RNNs

bothe/dialogue-act-recognition 29 Jun 2018

Spoken language understanding is one of the key factors in a dialogue system, and context in a conversation plays an important role in understanding the current utterance.

Fully Statistical Neural Belief Tracking

nmrksic/neural-belief-tracker ACL 2018

This paper proposes an improvement to the existing data-driven Neural Belief Tracking (NBT) framework for Dialogue State Tracking (DST).
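
To make the DST setting concrete, here is a minimal sketch of one belief-update step for a single slot: the previous distribution over candidate values is interpolated with this turn's evidence. The evidence would normally come from a turn-level scorer such as NBT; here it is a hand-written stand-in, and `alpha` is an assumed interpolation weight.

```python
def update_belief(prev, evidence, alpha=0.7):
    """One belief-update step for a single slot.

    prev, evidence: dicts mapping candidate values to probabilities.
    alpha weights the new turn's evidence against the prior belief.
    """
    values = set(prev) | set(evidence)
    belief = {v: alpha * evidence.get(v, 0.0) + (1 - alpha) * prev.get(v, 0.0)
              for v in values}
    total = sum(belief.values()) or 1.0
    return {v: p / total for v, p in belief.items()}  # renormalize

# turn 1 established "cheap"; turn 2's evidence points to "moderate"
b = update_belief({"cheap": 1.0}, {"moderate": 1.0})
```

With `alpha=0.7` the tracker shifts most of its mass to the new value while retaining some belief in the old one.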

Improving Slot Filling in Spoken Language Understanding with Joint Pointer and Attention

huagong-chung/PtrNet_SLU ACL 2018

We present a generative neural network model for slot filling that combines a sequence-to-sequence (Seq2Seq) model with a pointer network, for the setting where only sentence-level slot annotations are available in the spoken dialogue data.
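
The pointer component can be sketched as follows: each input position is scored against the current decoder state, and the highest-scoring input token is copied out as the slot value. This is a toy dot-product pointer step, not the paper's full Seq2Seq-plus-pointer model.

```python
import numpy as np

def pointer_copy(enc_states, dec_state, tokens):
    """One pointer step: distribute probability over input positions
    and copy the highest-scoring token as the slot value."""
    scores = enc_states @ dec_state         # score each input position
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                    # pointer distribution
    return tokens[int(probs.argmax())], probs

tokens = ["book", "a", "table", "at", "eight"]
enc = np.eye(5)                             # toy one-hot encoder states
value, p = pointer_copy(enc, np.array([0.0, 0.0, 0.0, 0.0, 1.0]), tokens)
```

Copying from the input is what lets the model produce slot values (e.g. times, names) it never saw labeled at the token level.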

FlowQA: Grasping Flow in History for Conversational Machine Comprehension

momohuang/FlowQA ICLR 2019

Conversational machine comprehension requires the understanding of the conversation history, such as previous question/answer pairs, the document context, and the current question.

Unsupervised Transfer Learning for Spoken Language Understanding in Intelligent Agents

sxjscience/GluonNLP-Slot-Filling 13 Nov 2018

Our findings suggest that unsupervised pre-training on a large corpus of unlabeled utterances leads to significantly better SLU performance than training from scratch, and can even outperform conventional supervised transfer.