Natural Language Understanding

671 papers with code • 6 benchmarks • 69 datasets

Natural Language Understanding is a core subfield of Natural Language Processing that encompasses tasks such as text classification, natural language inference, and story comprehension. Applications enabled by natural language understanding range from question answering to automated reasoning.
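To make one of these tasks concrete, here is a toy sketch of intent classification, one common NLU subtask: each intent is scored by keyword overlap with the utterance. The intents and keyword sets are invented for illustration; real systems use trained models rather than keyword matching.

```python
# Toy intent classifier: scores each candidate intent by keyword overlap
# with the utterance. Intents and keywords are invented for illustration.
INTENT_KEYWORDS = {
    "weather_query": {"weather", "rain", "forecast", "temperature"},
    "play_music": {"play", "song", "music", "album"},
    "set_alarm": {"alarm", "wake", "remind"},
}

def classify_intent(utterance: str) -> str:
    tokens = set(utterance.lower().split())
    # Pick the intent whose keyword set overlaps the utterance the most.
    return max(INTENT_KEYWORDS, key=lambda i: len(INTENT_KEYWORDS[i] & tokens))

print(classify_intent("play the new album by my favourite band"))  # play_music
```

A trained classifier replaces the overlap score with a learned scoring function, but the input/output contract (utterance in, intent label out) is the same.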

Source: Find a Reasonable Ending for Stories: Does Logic Relation Help the Story Cloze Test?

Libraries

Use these libraries to find Natural Language Understanding models and implementations

Most implemented papers

MASSIVE: A 1M-Example Multilingual Natural Language Understanding Dataset with 51 Typologically-Diverse Languages

alexa/massive 18 Apr 2022

We present the MASSIVE dataset: the Multilingual Amazon SLU Resource Package (SLURP) for Slot-filling, Intent classification, and Virtual-assistant Evaluation.

Mind the GAP: A Balanced Corpus of Gendered Ambiguous Pronouns

google-research-datasets/gap-coreference TACL 2018

Coreference resolution is an important task for natural language understanding, and the resolution of ambiguous pronouns is a longstanding challenge.

BARThez: a Skilled Pretrained French Sequence-to-Sequence Model

moussaKam/BARThez EMNLP 2021

We show BARThez to be very competitive with state-of-the-art BERT-based French language models such as CamemBERT and FlauBERT.

I-BERT: Integer-only BERT Quantization

huggingface/transformers 5 Jan 2021

Transformer based models, like BERT and RoBERTa, have achieved state-of-the-art results in many Natural Language Processing tasks.
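Integer-only inference schemes such as I-BERT rest on uniform quantization of floating-point tensors. The following is a minimal pure-Python sketch of symmetric uniform quantization, not the paper's implementation:

```python
# Sketch of symmetric uniform quantization, the building block behind
# integer-only inference schemes such as I-BERT. Illustrative only.

def quantize(values, num_bits=8):
    """Map floats to signed integers in [-(2^(b-1)-1), 2^(b-1)-1]."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
    scale = max(abs(v) for v in values) / qmax
    q = [round(v / scale) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from integers and the shared scale."""
    return [x * scale for x in q]

weights = [0.5, -1.27, 0.03, 1.0]
q, scale = quantize(weights)
approx = dequantize(q, scale)
```

Each dequantized value differs from the original by at most one quantization step (`scale`); the paper's contribution is performing the downstream matrix operations entirely in integer arithmetic.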

PanGu-$\alpha$: Large-scale Autoregressive Pretrained Chinese Language Models with Auto-parallel Computation

mindspore-ai/models 26 Apr 2021

To enhance the generalization ability of PanGu-$\alpha$, we collect 1.1 TB of high-quality Chinese data from a wide range of domains to pretrain the model.

Neural Semantic Encoders

tsendeemts/nse EACL 2017

We present a memory augmented neural network for natural language understanding: Neural Semantic Encoders.

DisSent: Sentence Representation Learning from Explicit Discourse Relations

facebookresearch/InferSent 12 Oct 2017

Learning effective representations of sentences is one of the core missions of natural language understanding.

Visual Re-ranking with Natural Language Understanding for Text Spotting

ahmedssabir/Visual-Semantic-Relatedness-with-Word-Embedding 29 Oct 2018

We propose a post-processing approach to improve scene text recognition accuracy by using occurrence probabilities of words (unigram language model), and the semantic correlation between scene and text.
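The re-ranking idea can be sketched as follows: combine the recognizer's confidence for each candidate word with its unigram probability, and pick the candidate with the best combined score. The probabilities, scores, and the `lm_weight` interpolation parameter below are invented for illustration, not taken from the paper.

```python
import math

# Hedged sketch: re-rank text-spotting candidates by interpolating the
# recognizer's confidence with a unigram language-model probability.
# All numbers here are invented for illustration.
unigram_prob = {"exit": 0.004, "exil": 0.00001, "taxi": 0.003}

def rerank(candidates, lm_weight=0.5):
    """candidates: list of (word, recognition_score in (0, 1])."""
    def combined(item):
        word, score = item
        lm = unigram_prob.get(word, 1e-9)   # small floor for unseen words
        return (1 - lm_weight) * math.log(score) + lm_weight * math.log(lm)
    return max(candidates, key=combined)[0]

print(rerank([("exil", 0.6), ("exit", 0.55)]))  # "exit"
```

Even though the recognizer slightly prefers the garbled "exil", the language model's strong preference for "exit" flips the decision, which is exactly the kind of correction the post-processing step aims for.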

Joint Slot Filling and Intent Detection via Capsule Neural Networks

czhang99/Capsule-NLU ACL 2019

Being able to recognize words as slots and detect the intent of an utterance has been a keen issue in natural language understanding.
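The joint task pairs a single utterance-level intent label with per-token slot labels, conventionally written in the BIO scheme. The example utterance, tags, and helper below are invented to illustrate the format, not drawn from the paper:

```python
# Illustration of the joint slot-filling / intent-detection output format:
# per-token BIO slot tags plus one utterance-level intent. Labels invented.
utterance = ["wake", "me", "up", "at", "seven", "am"]
slot_tags = ["O", "O", "O", "O", "B-time", "I-time"]
intent = "set_alarm"

def extract_slots(tokens, tags):
    """Collect (slot_name, text) pairs from BIO-tagged tokens."""
    slots, current, words = [], None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):            # a new slot span begins
            if current:
                slots.append((current, " ".join(words)))
            current, words = tag[2:], [tok]
        elif tag.startswith("I-") and current == tag[2:]:
            words.append(tok)               # continue the current span
        else:                               # "O" or a mismatched I- tag
            if current:
                slots.append((current, " ".join(words)))
            current, words = None, []
    if current:                             # flush a span ending the sentence
        slots.append((current, " ".join(words)))
    return slots

print(extract_slots(utterance, slot_tags))  # [('time', 'seven am')]
```

A joint model predicts both `slot_tags` and `intent` from the raw tokens in one pass, letting the two decisions inform each other.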