Spoken Language Understanding
118 papers with code • 5 benchmarks • 14 datasets
Libraries
Use these libraries to find Spoken Language Understanding models and implementations
Datasets
Most implemented papers
Dynamic Time-Aware Attention to Speaker Roles and Contexts for Spoken Language Understanding
Previous models attended only to the content of history utterances, without considering their temporal information or speaker roles.
Towards end-to-end spoken language understanding
A spoken language understanding system is traditionally designed as a pipeline of components.
Spoken SQuAD: A Study of Mitigating the Impact of Speech Recognition Errors on Listening Comprehension
Reading comprehension has been widely studied.
How Time Matters: Learning Time-Decay Attention for Contextual Spoken Language Understanding in Dialogues
Spoken language understanding (SLU) is an essential component in conversational systems.
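The core idea of time-decay attention is to weight each history utterance both by its content relevance to the current utterance and by how recently it occurred. A minimal sketch, assuming precomputed utterance embeddings and a simple exponential-style penalty on time distance (the function names and the `decay` hyperparameter are illustrative, not the paper's exact formulation):

```python
import numpy as np

def time_decay_attention(history, query, deltas, decay=0.5):
    """Blend content relevance with recency when attending over dialogue history.

    history: (n, d) array of history-utterance embeddings (oldest first)
    query:   (d,) embedding of the current utterance
    deltas:  (n,) time distances from the current turn (larger = older)
    decay:   assumed decay rate penalizing older turns (hyperparameter)
    """
    content = history @ query                                    # content-based scores
    scores = content - decay * np.asarray(deltas, dtype=float)   # subtract recency penalty
    weights = np.exp(scores - scores.max())                      # numerically stable softmax
    weights /= weights.sum()
    return weights @ history                                     # (d,) context vector
```

With equal content scores, a smaller time distance yields a larger attention weight, so recent turns dominate the context vector.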
ISO-Standard Domain-Independent Dialogue Act Tagging for Conversational Agents
Dialogue Act (DA) tagging is crucial for spoken language understanding systems, as it provides a general representation of speakers' intents, not bound to a particular dialogue system.
Discourse-Wizard: Discovering Deep Discourse Structure in your Conversation with RNNs
Spoken language understanding is one of the key factors in a dialogue system, and a context in a conversation plays an important role to understand the current utterance.
Fully Statistical Neural Belief Tracking
This paper proposes an improvement to the existing data-driven Neural Belief Tracking (NBT) framework for Dialogue State Tracking (DST).
Improving Slot Filling in Spoken Language Understanding with Joint Pointer and Attention
We present a generative neural network model for slot filling based on a sequence-to-sequence (Seq2Seq) model together with a pointer network, in the situation where only sentence-level slot annotations are available in the spoken dialogue data.
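A pointer network lets the decoder copy slot values directly from the input utterance instead of generating them from a fixed vocabulary. One common way to combine the two, sketched here with plain NumPy (the gating variable `p_gen` and all shapes are illustrative assumptions, not the paper's exact architecture):

```python
import numpy as np

def joint_pointer_distribution(vocab_logits, copy_scores, src_token_ids, p_gen):
    """Mix a Seq2Seq generation distribution with a pointer (copy) distribution.

    vocab_logits:  (V,) decoder logits over the output vocabulary
    copy_scores:   (n,) attention scores over the n source tokens
    src_token_ids: (n,) vocabulary ids of the source tokens
    p_gen:         probability of generating rather than copying (assumed given)
    """
    def softmax(x):
        e = np.exp(x - np.max(x))
        return e / e.sum()

    gen = softmax(np.asarray(vocab_logits, dtype=float))
    copy = softmax(np.asarray(copy_scores, dtype=float))
    mixed = p_gen * gen
    for tid, c in zip(src_token_ids, copy):
        mixed[tid] += (1.0 - p_gen) * c   # scatter copy mass onto the token's vocab id
    return mixed                          # (V,) valid probability distribution
```

Out-of-vocabulary slot values present in the utterance can thus still receive probability mass through the copy term, which is the usual motivation for pointer-style slot filling.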
FlowQA: Grasping Flow in History for Conversational Machine Comprehension
Conversational machine comprehension requires the understanding of the conversation history, such as previous question/answer pairs, the document context, and the current question.
Unsupervised Transfer Learning for Spoken Language Understanding in Intelligent Agents
Our findings suggest that unsupervised pre-training on large corpora of unlabeled utterances leads to significantly better SLU performance than training from scratch, and can even outperform conventional supervised transfer.