Natural Language Understanding
671 papers with code • 6 benchmarks • 69 datasets
Natural Language Understanding is an important field of Natural Language Processing that comprises tasks such as text classification, natural language inference, and story comprehension. Applications enabled by natural language understanding range from question answering to automated reasoning.
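To make one of the listed tasks concrete, here is a minimal sketch of text classification: a toy bag-of-words Naive Bayes classifier in pure Python. The training examples and labels are hypothetical; real NLU systems typically use pretrained models such as BERT.

```python
# Toy text classification (one NLU task): bag-of-words Naive Bayes.
# Data and labels are illustrative only.
import math
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (text, label). Returns per-label word counts."""
    counts = defaultdict(Counter)
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(counts, text):
    """Pick the label maximizing the add-one-smoothed log-likelihood."""
    vocab = {w for c in counts.values() for w in c}
    def score(label):
        c = counts[label]
        total = sum(c.values()) + len(vocab)
        return sum(math.log((c[w] + 1) / total) for w in text.lower().split())
    return max(counts, key=score)

examples = [
    ("what is the weather today", "question"),
    ("book a table for two", "command"),
    ("is it going to rain", "question"),
    ("play some jazz music", "command"),
]
counts = train(examples)
print(classify(counts, "what is it like outside"))  # → question
```

This captures only the surface of the task; benchmarks in this area evaluate far harder phenomena such as inference and story-ending selection.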
Source: Find a Reasonable Ending for Stories: Does Logic Relation Help the Story Cloze Test?
Libraries
Use these libraries to find Natural Language Understanding models and implementations.
Most implemented papers
MASSIVE: A 1M-Example Multilingual Natural Language Understanding Dataset with 51 Typologically-Diverse Languages
We present the MASSIVE dataset--Multilingual Amazon SLU resource package (SLURP) for Slot-filling, Intent classification, and Virtual-assistant Evaluation.
Mind the GAP: A Balanced Corpus of Gendered Ambiguous Pronouns
Coreference resolution is an important task for natural language understanding, and the resolution of ambiguous pronouns is a longstanding challenge.
Collaborative Multi-Agent Dialogue Model Training Via Reinforcement Learning
The agents each have their own objectives and can only interact via the natural language they generate.
BARThez: a Skilled Pretrained French Sequence-to-Sequence Model
We show BARThez to be very competitive with state-of-the-art BERT-based French language models such as CamemBERT and FlauBERT.
I-BERT: Integer-only BERT Quantization
Transformer based models, like BERT and RoBERTa, have achieved state-of-the-art results in many Natural Language Processing tasks.
PanGu-$\alpha$: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation
To enhance the generalization ability of PanGu-$\alpha$, we collect 1.1 TB of high-quality Chinese data from a wide range of domains to pretrain the model.
Neural Semantic Encoders
We present a memory augmented neural network for natural language understanding: Neural Semantic Encoders.
DisSent: Sentence Representation Learning from Explicit Discourse Relations
Learning effective representations of sentences is one of the core missions of natural language understanding.
Visual Re-ranking with Natural Language Understanding for Text Spotting
We propose a post-processing approach to improve scene text recognition accuracy by using occurrence probabilities of words (unigram language model), and the semantic correlation between scene and text.
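A minimal sketch of the unigram-language-model part of this re-ranking idea: given several candidate transcriptions from a text spotter, prefer the one whose words are more probable under a unigram model. The word frequencies below are hypothetical, and the paper's second signal (scene-text semantic correlation) is omitted.

```python
# Sketch: re-rank OCR candidate transcriptions by smoothed unigram
# log-probability. Counts are illustrative, not from the paper.
import math

UNIGRAM_COUNTS = {"coffee": 900, "shop": 800, "coffee": 1, "ship": 400}
TOTAL = sum(UNIGRAM_COUNTS.values())

def log_prob(candidate):
    """Add-one-smoothed unigram log-probability of a transcription."""
    return sum(math.log((UNIGRAM_COUNTS.get(w, 0) + 1) / (TOTAL + 1))
               for w in candidate.split())

def rerank(candidates):
    return max(candidates, key=log_prob)

print(rerank(["coffee ship", "coffee shop"]))  # → coffee shop
```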
Joint Slot Filling and Intent Detection via Capsule Neural Networks
Recognizing words as slots and detecting the intent of an utterance have long been central problems in natural language understanding.
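To show what the two jointly modeled tasks produce, here is a rule-based sketch of intent detection (classify the utterance) and slot filling (extract labeled spans). The intents, slot names, and patterns are hypothetical, chosen for clarity; the paper itself uses capsule neural networks, not rules.

```python
# Toy joint intent detection + slot filling via regex rules.
# Intents, slots, and patterns are illustrative only.
import re

INTENT_PATTERNS = {
    "PlayMusic": re.compile(r"\bplay\b"),
    "GetWeather": re.compile(r"\bweather\b|\brain\b"),
}
SLOT_PATTERNS = {
    "artist": re.compile(r"by ([a-z ]+)$"),
    "city": re.compile(r"in ([a-z ]+)$"),
}

def parse(utterance):
    """Return (intent, {slot_name: value}) for one utterance."""
    text = utterance.lower()
    intent = next((i for i, p in INTENT_PATTERNS.items() if p.search(text)),
                  "Unknown")
    slots = {name: m.group(1) for name, p in SLOT_PATTERNS.items()
             if (m := p.search(text))}
    return intent, slots

print(parse("play something by nina simone"))
# → ('PlayMusic', {'artist': 'nina simone'})
```

Learned models replace the hand-written patterns with sequence taggers and classifiers trained jointly, so that intent and slot predictions can inform each other.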