
Linguistic Acceptability

14 papers with code · Natural Language Processing

Latest papers with code

Learning to Encode Position for Transformer with Continuous Dynamical Model

13 Mar 2020 xuanqing94/FLOATER

The main reason is that position information among input units is not inherently encoded, i.e., the models are permutation-equivariant; this explains why all existing models pair the input with a sinusoidal encoding/embedding layer.
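
The sinusoidal layer the abstract mentions is easy to reproduce. Below is a minimal sketch of the fixed encoding from "Attention Is All You Need" (FLOATER itself replaces this with a learned continuous dynamical model); the function name and shapes are illustrative:

```python
import numpy as np

def sinusoidal_position_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal encodings: PE[pos, 2i] = sin(pos / 10000^(2i/d)),
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d))."""
    positions = np.arange(max_len)[:, None]                        # (max_len, 1)
    div_terms = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)                    # even dims
    pe[:, 1::2] = np.cos(positions * div_terms)                    # odd dims
    return pe  # added elementwise to the token embeddings

# embeddings = token_embeddings + sinusoidal_position_encoding(512, 768)[:seq_len]
```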

LINGUISTIC ACCEPTABILITY · MACHINE TRANSLATION · SEMANTIC TEXTUAL SIMILARITY · SENTIMENT ANALYSIS

★ 5 · 13 Mar 2020

Masked Language Model Scoring

ACL 2020 awslabs/mlm-scoring

Instead, we evaluate MLMs out of the box via their pseudo-log-likelihood scores (PLLs), which are computed by masking tokens one by one.
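
This scoring procedure is straightforward to sketch with the Hugging Face transformers library; the snippet below is an illustrative re-implementation of the idea, not the exact awslabs/mlm-scoring code:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-cased").eval()

def pseudo_log_likelihood(sentence: str) -> float:
    """Sum log P(token | all other tokens), masking one position at a time."""
    ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    pll = 0.0
    for i in range(1, len(ids) - 1):                  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        pll += torch.log_softmax(logits, dim=-1)[ids[i]].item()
    return pll

# A higher (less negative) PLL suggests a more acceptable sentence:
# pseudo_log_likelihood("The cat sat on the mat.")
```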

DOMAIN ADAPTATION · LANGUAGE MODELLING · LINGUISTIC ACCEPTABILITY

★ 56 · 31 Oct 2019

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

arXiv 2019 google-research/text-to-text-transfer-transformer

Transfer learning, where a model is first pre-trained on a data-rich task before being fine-tuned on a downstream task, has emerged as a powerful technique in natural language processing (NLP).

 Ranked #1 on Semantic Textual Similarity on STS Benchmark (using extra training data)
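
In T5's text-to-text framing even a classification task such as CoLA acceptability becomes string-in, string-out; here is a minimal inference sketch with the transformers library (the "cola sentence:" prefix follows the paper's task-prefix convention, the rest is illustrative):

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def cola_judgment(sentence: str) -> str:
    """Acceptability as text-to-text: the model generates the label string
    ("acceptable" / "unacceptable") instead of a class index."""
    inputs = tokenizer("cola sentence: " + sentence, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=5)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# cola_judgment("The book was by John written.")
```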

LINGUISTIC ACCEPTABILITY · NATURAL LANGUAGE INFERENCE · QUESTION ANSWERING · SEMANTIC TEXTUAL SIMILARITY · SENTIMENT ANALYSIS · TEXT CLASSIFICATION · TRANSFER LEARNING

★ 2,731 · 23 Oct 2019

DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter

NeurIPS 2019 huggingface/transformers

As Transfer Learning from large-scale pre-trained models becomes more prevalent in Natural Language Processing (NLP), operating these large models on the edge and/or under constrained computational training or inference budgets remains challenging.
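
DistilBERT's training pairs the usual masked-LM loss with a soft-target term that matches the teacher's temperature-softened output distribution; below is a minimal sketch of that term, with illustrative names and a placeholder temperature:

```python
import torch
import torch.nn.functional as F

def soft_target_loss(student_logits: torch.Tensor,
                     teacher_logits: torch.Tensor,
                     temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 to keep gradients comparable across T."""
    t = temperature
    log_student = F.log_softmax(student_logits / t, dim=-1)
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * (t * t)

# The paper combines this with the masked-LM loss and a cosine loss on
# hidden states; the mixing weights would be hyperparameters.
```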

LANGUAGE MODELLING · LINGUISTIC ACCEPTABILITY · NATURAL LANGUAGE INFERENCE · QUESTION ANSWERING · SEMANTIC TEXTUAL SIMILARITY · SENTIMENT ANALYSIS · TRANSFER LEARNING

★ 31,991 · 02 Oct 2019

TinyBERT: Distilling BERT for Natural Language Understanding

23 Sep 2019 huawei-noah/Pretrained-Language-Model

To accelerate inference and reduce model size while maintaining accuracy, we first propose a novel Transformer distillation method, a knowledge distillation (KD) approach specially designed for Transformer-based models.
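
The transformer distillation described here fits the student to the teacher's intermediate attention matrices and hidden states rather than only its output logits; a rough per-layer sketch under that reading (the projection bridging the narrower student width to the teacher width corresponds to the paper's learned linear map; all names are illustrative):

```python
import torch
import torch.nn.functional as F

def layer_distillation_loss(student_attn: torch.Tensor,
                            teacher_attn: torch.Tensor,
                            student_hidden: torch.Tensor,
                            teacher_hidden: torch.Tensor,
                            proj: torch.nn.Linear) -> torch.Tensor:
    """One student layer vs. its mapped teacher layer: MSE on attention
    matrices plus MSE on hidden states, after projecting the student's
    hidden states up to the teacher's width."""
    attn_loss = F.mse_loss(student_attn, teacher_attn)
    hidden_loss = F.mse_loss(proj(student_hidden), teacher_hidden)
    return attn_loss + hidden_loss

# proj = torch.nn.Linear(312, 768)  # e.g. TinyBERT width -> BERT-base width
# The total loss sums this over a layer mapping, e.g. every 3rd teacher layer.
```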

LANGUAGE MODELLING · LINGUISTIC ACCEPTABILITY · NATURAL LANGUAGE INFERENCE · NATURAL LANGUAGE UNDERSTANDING · PARAPHRASE IDENTIFICATION · QUESTION ANSWERING · SEMANTIC TEXTUAL SIMILARITY · SENTIMENT ANALYSIS

★ 1,106 · 23 Sep 2019