
Sentence Completion

5 papers with code


Greatest papers with code

Language Models are Few-Shot Learners

28 May 2020 · openai/gpt-3

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.

SOTA for Language Modelling on Penn Treebank (Word Level) (using extra training data)

COMMON SENSE REASONING COREFERENCE RESOLUTION DOMAIN ADAPTATION FEW-SHOT LEARNING LANGUAGE MODELLING NATURAL LANGUAGE INFERENCE QUESTION ANSWERING SENTENCE COMPLETION UNSUPERVISED MACHINE TRANSLATION WORD SENSE DISAMBIGUATION
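The few-shot setting described above amounts to showing the model a handful of solved examples in its context window and letting it infer the task, with no gradient updates. A minimal sketch of such prompt construction for sentence completion (the example pairs and prompt format here are hypothetical, not taken from the paper):

```python
# Minimal sketch of few-shot prompt construction for sentence completion.
# The example pairs and the "Sentence:/Completion:" format are illustrative.
def build_few_shot_prompt(examples, query):
    """Concatenate solved examples before the query so a language model
    can infer the task from context alone, with no weight updates."""
    blocks = [f"Sentence: {s}\nCompletion: {c}" for s, c in examples]
    blocks.append(f"Sentence: {query}\nCompletion:")
    return "\n\n".join(blocks)

examples = [
    ("The cat sat on the", "mat."),
    ("She opened the door and saw a", "garden."),
]
prompt = build_few_shot_prompt(examples, "He poured coffee into his")
print(prompt)
```

The model is then expected to continue the prompt after the final `Completion:` marker, treating the preceding pairs as in-context demonstrations.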

Top-down Tree Long Short-Term Memory Networks

NAACL 2016 · XingxingZhang/td-treelstm

Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have been successfully applied to a variety of sequence modeling tasks.

DEPENDENCY PARSING SENTENCE COMPLETION
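The "more complex computational unit" referred to above is the gated LSTM cell. A minimal NumPy sketch of one step of a standard LSTM cell (the plain sequential formulation, not the top-down tree variant this paper proposes):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. W: (4H, D), U: (4H, H), b: (4H,).
    Gates are stacked in the order [input, forget, output, candidate]."""
    z = W @ x + U @ h_prev + b
    H = h_prev.shape[0]
    i = sigmoid(z[0*H:1*H])   # input gate: how much new content to write
    f = sigmoid(z[1*H:2*H])   # forget gate: how much old cell state to keep
    o = sigmoid(z[2*H:3*H])   # output gate: how much cell state to expose
    g = np.tanh(z[3*H:4*H])   # candidate cell update
    c = f * c_prev + i * g    # new cell state
    h = o * np.tanh(c)        # new hidden state
    return h, c

rng = np.random.default_rng(0)
D, H = 5, 4
W = rng.normal(size=(4*H, D))
U = rng.normal(size=(4*H, H))
b = np.zeros(4*H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(3, D)):   # run over a 3-token sequence
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)
```

The additive `f * c_prev + i * g` update is what lets gradients flow over long sequences; the tree-structured variant in the paper generalizes this recurrence from a linear chain to dependency-tree paths.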

Learning Semantically and Additively Compositional Distributional Representations

ACL 2016 · tianran/vecdcs

This paper connects a vector-based composition model to a formal semantics, the Dependency-based Compositional Semantics (DCS).

RELATION CLASSIFICATION SENTENCE COMPLETION

Recurrent Memory Networks for Language Modeling

NAACL 2016 · simonjisu/NMT

In this paper, we propose the Recurrent Memory Network (RMN), a novel RNN architecture that not only amplifies the power of RNNs but also facilitates our understanding of their internal functioning and allows us to discover underlying patterns in data.

LANGUAGE MODELLING SENTENCE COMPLETION

A Deep Architecture for Semantic Matching with Multiple Positional Sentence Representations

26 Nov 2015 · jastfkjg/semantic-matching

Our model has several advantages: (1) by using a Bi-LSTM, rich context from the whole sentence is leveraged to capture contextualized local information in each positional sentence representation; (2) by matching against multiple positional sentence representations, the model can flexibly aggregate the important contextualized local information in a sentence to support the match; (3) experiments on tasks such as question answering and sentence completion demonstrate the superiority of our model.

INFORMATION RETRIEVAL QUESTION ANSWERING SENTENCE COMPLETION
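The core idea of matching with multiple positional representations is to score every position pair between the two sentences and then aggregate the strongest local interactions. A rough NumPy sketch of that idea using cosine similarity and top-k pooling (an illustration of the matching scheme only, not the paper's Bi-LSTM plus tensor-interaction model):

```python
import numpy as np

def cosine_matrix(A, B):
    """Position-pair similarities between two sentences, each given as
    a (length, dim) matrix of positional representation vectors."""
    A = A / np.linalg.norm(A, axis=1, keepdims=True)
    B = B / np.linalg.norm(B, axis=1, keepdims=True)
    return A @ B.T          # shape (len_a, len_b)

def match_score(A, B, k=3):
    """Aggregate the k strongest position-pair interactions into one score."""
    sims = cosine_matrix(A, B).ravel()
    topk = np.sort(sims)[-k:]
    return float(topk.mean())

rng = np.random.default_rng(1)
sent_a = rng.normal(size=(6, 8))     # 6 positional vectors, dim 8
sent_b = rng.normal(size=(5, 8))
score = match_score(sent_a, sent_b)
print(round(score, 3))
```

Keeping only the top-k interactions rather than averaging over all position pairs lets a few strongly aligned local regions dominate the match, which is the intuition behind aggregating "different important contextualized local information."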