We present a memory-augmented neural network for natural language understanding: Neural Semantic Encoders.
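The abstract only names the architecture, so as a rough illustration of what a memory-augmented encoder does, here is a minimal NumPy sketch of one read-compose-write update over a memory matrix. The attention scheme, the single-layer compose function, and all names (`nse_step`, `W_c`) are assumptions for illustration, not the paper's exact equations.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def nse_step(x_t, M, W_c):
    """One illustrative read-compose-write step over a memory matrix.

    x_t : (d,)    current input encoding
    M   : (k, d)  memory with k slots
    W_c : (2d, d) parameters of a toy compose function (assumed form)
    """
    # Read: attend over memory slots, using the current input as the key.
    z = softmax(M @ x_t)            # (k,) attention over slots
    m = z @ M                       # (d,) retrieved memory summary
    # Compose: combine input and retrieved content (here one linear map + tanh).
    c = np.tanh(np.concatenate([x_t, m]) @ W_c)   # (d,)
    # Write: blend the composed vector back into memory, weighted by attention.
    M_new = M * (1.0 - z)[:, None] + np.outer(z, c)
    return c, M_new

rng = np.random.default_rng(0)
d, k = 8, 5
M = rng.normal(size=(k, d))
x_t = rng.normal(size=d)
W_c = rng.normal(size=(2 * d, d)) * 0.1
c, M = nse_step(x_t, M, W_c)
print(c.shape, M.shape)   # (8,) (5, 8)
```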
The dominant approaches for many NLP tasks are recurrent neural networks, in particular LSTMs, and convolutional neural networks.
We study the topmost weight matrix of neural network language models.
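For context, the "topmost" weight matrix is the final projection from the hidden state to vocabulary logits, whose columns can be read as output word embeddings. The short sketch below shows this reading; the shapes and names (`W_out`, `hidden_size`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative only: in a typical neural LM, the topmost weight matrix maps
# the final hidden state to a logit per vocabulary word.
hidden_size, vocab_size = 16, 100
rng = np.random.default_rng(0)
W_out = rng.normal(size=(hidden_size, vocab_size))  # topmost weight matrix

h = rng.normal(size=hidden_size)                    # final hidden state
logits = h @ W_out                                  # (vocab_size,)
probs = np.exp(logits - logits.max())
probs /= probs.sum()                                # next-word distribution

# Each column of W_out behaves like an output embedding for one word.
output_embedding_word_7 = W_out[:, 7]
print(probs.shape, output_embedding_word_7.shape)   # (100,) (16,)
```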
We describe EmoBank, a corpus of 10k English sentences balancing multiple genres, which we annotated with dimensional emotion metadata in the Valence-Arousal-Dominance (VAD) representation format.
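To make the annotation scheme concrete, a VAD-annotated record might look like the sketch below. The field names and the 1-to-5 rating scale are assumptions for illustration, not EmoBank's actual release format.

```python
# Hypothetical layout for one VAD-annotated sentence.
example = {
    "sentence": "The team finally shipped the release.",
    "valence": 3.8,    # unpleasant (low) .. pleasant (high)
    "arousal": 3.1,    # calm (low) .. excited (high)
    "dominance": 3.3,  # being controlled (low) .. in control (high)
}
```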
In this work, we investigate several neural network architectures for fine-grained entity type classification.
We present a new parallel corpus, the JHU FLuency-Extended GUG corpus (JFLEG), for developing and evaluating grammatical error correction (GEC).
Our goal is to combine the rich multistep inference of symbolic logical reasoning with the generalization capabilities of neural networks.