Long Short-Term Memory-Networks for Machine Reading

In this paper we address the question of how to render sequence-level networks better at handling structured input. We propose a machine reading simulator which processes text incrementally from left to right and performs shallow reasoning with memory and attention. The reader extends the Long Short-Term Memory architecture with a memory network in place of a single memory cell. This enables adaptive memory usage during recurrence with neural attention, offering a way to weakly induce relations among tokens. The system is initially designed to process a single sequence, but we also demonstrate how to integrate it with an encoder-decoder architecture. Experiments on language modeling, sentiment analysis, and natural language inference show that our model matches or outperforms the state of the art.

Published at EMNLP 2016.
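The core mechanism the abstract describes, replacing the LSTM's single memory cell with attention over tapes of all previous hidden and cell states, can be summarised in a few lines. Below is a minimal NumPy sketch of one LSTMN step under that reading; the class name LSTMNCell, the shapes, and the initialisation are illustrative assumptions, not the authors' released code.

```python
# Minimal NumPy sketch of one LSTMN step: intra-attention over tapes of
# previous hidden/cell states replaces the single (h, c) of a vanilla LSTM.
# All parameter names and shapes here are illustrative assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class LSTMNCell:
    def __init__(self, input_size, hidden_size, rng=None):
        rng = rng or np.random.default_rng(0)
        d, k = hidden_size, input_size
        scale = 0.1
        # Attention parameters for scoring each previous hidden state.
        self.W_h = rng.normal(0, scale, (d, d))
        self.W_x = rng.normal(0, scale, (d, k))
        self.W_ht = rng.normal(0, scale, (d, d))
        self.v = rng.normal(0, scale, d)
        # Standard LSTM gate parameters, applied to [h_tilde; x_t].
        self.W_gates = rng.normal(0, scale, (4 * d, d + k))
        self.b_gates = np.zeros(4 * d)
        self.d = d

    def step(self, x_t, h_tape, c_tape, h_tilde_prev):
        """h_tape, c_tape: lists of all previous hidden/memory states."""
        d = self.d
        if h_tape:
            # Intra-attention: score every previous hidden state against x_t.
            scores = np.array([
                self.v @ np.tanh(self.W_h @ h_i + self.W_x @ x_t
                                 + self.W_ht @ h_tilde_prev)
                for h_i in h_tape])
            s = softmax(scores)
            # Adaptive summaries of the hidden and memory tapes.
            h_tilde = sum(w * h for w, h in zip(s, h_tape))
            c_tilde = sum(w * c for w, c in zip(s, c_tape))
        else:  # first token: nothing to attend to yet
            h_tilde = np.zeros(d)
            c_tilde = np.zeros(d)
        # Vanilla LSTM gating, conditioned on the attended summaries.
        z = self.W_gates @ np.concatenate([h_tilde, x_t]) + self.b_gates
        i, f, o = sigmoid(z[:d]), sigmoid(z[d:2*d]), sigmoid(z[2*d:3*d])
        c_hat = np.tanh(z[3*d:])
        c_t = f * c_tilde + i * c_hat
        h_t = o * np.tanh(c_t)
        return h_t, c_t, h_tilde

# Usage: read a toy sequence left to right, growing the tapes as we go.
cell = LSTMNCell(input_size=8, hidden_size=16)
xs = np.random.default_rng(1).normal(size=(5, 8))
h_tape, c_tape, h_tilde = [], [], np.zeros(16)
for x_t in xs:
    h_t, c_t, h_tilde = cell.step(x_t, h_tape, c_tape, h_tilde)
    h_tape.append(h_t)
    c_tape.append(c_t)
print(len(h_tape), h_tape[-1].shape)  # 5 (16,)
```

At each step the attention weights give a soft view of which earlier tokens the current token relates to, which is the weak relation induction the abstract refers to.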

Datasets

SNLI (used for the natural language inference results below)
Results from the Paper


TASK                         DATASET   MODEL                                    METRIC NAME        METRIC VALUE   GLOBAL RANK
Natural Language Inference   SNLI      450D LSTMN with deep attention fusion    % Test Accuracy    86.3           # 30
                                                                                % Train Accuracy   88.5           # 42
                                                                                Parameters         3.4m           # 3
Natural Language Inference   SNLI      300D LSTMN with deep attention fusion    % Test Accuracy    85.7           # 34
                                                                                % Train Accuracy   87.3           # 43
                                                                                Parameters         1.7m           # 3

Methods used in the Paper


METHOD           TYPE
Memory Network   Working Memory Models