
End-To-End Memory Network

Introduced by Sukhbaatar et al. in End-To-End Memory Networks

An End-to-End Memory Network is a neural network with a recurrent attention model over a possibly large external memory. The architecture is a form of Memory Network, but unlike the model in that work, it is trained end-to-end, and hence requires significantly less supervision during training. It can also be seen as an extension of RNNsearch to the case where multiple computational steps (hops) are performed per output symbol.

The model takes a discrete set of inputs $x_{1}, \dots, x_{n}$ that are to be stored in the memory, a query $q$, and outputs an answer $a$. Each of the $x_{i}$, $q$, and $a$ contains symbols coming from a dictionary with $V$ words. The model writes all $x$ to the memory up to a fixed buffer size, and then finds a continuous representation for the $x$ and $q$. The continuous representation is then processed via multiple hops to output $a$.
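The single-hop computation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes bag-of-words sentence representations and random, untrained weights; the matrix names A, B, C, and W follow the paper's notation for the input, query, and output embeddings and the final answer projection.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
V, d, n = 50, 8, 6  # vocabulary size, embedding dim, number of memories

# Embedding matrices (random here; learned jointly end-to-end in the paper)
A = rng.normal(size=(V, d))  # input memory embedding
B = rng.normal(size=(V, d))  # query embedding
C = rng.normal(size=(V, d))  # output memory embedding
W = rng.normal(size=(V, d))  # final projection to the answer vocabulary

# Toy inputs: each x_i and the query q is a bag of word indices
memories = [rng.integers(0, V, size=4) for _ in range(n)]
q = rng.integers(0, V, size=3)

m = np.stack([A[x].sum(axis=0) for x in memories])  # memory vectors m_i
c = np.stack([C[x].sum(axis=0) for x in memories])  # output vectors c_i
u = B[q].sum(axis=0)                                # internal state u

p = softmax(m @ u)              # attention over memories: p_i = softmax(u . m_i)
o = p @ c                       # response vector: o = sum_i p_i * c_i
a_hat = softmax(W @ (o + u))    # predicted distribution over the V answer words
# Further hops would repeat the read with an updated state: u = o + u
```

Each additional hop re-attends over the same memory with the updated state `u`, which is what lets the model chain multiple reasoning steps per output symbol.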

Source: End-To-End Memory Networks

Latest Papers

PAPER | AUTHORS | DATE
Automatic Stance Detection Using End-to-End Memory Networks | Mitra Mohtarami, Ramy Baly, James Glass, Preslav Nakov, Lluis Marquez, Alessandro Moschitti | 2018-04-20
End-To-End Memory Networks | Sainbayar Sukhbaatar, Arthur Szlam, Jason Weston, Rob Fergus | 2015-03-31

Tasks

TASK PAPERS SHARE
Stance Detection 1 33.33%
Language Modelling 1 33.33%
Question Answering 1 33.33%

Components

COMPONENT | TYPE
Softmax | Output Functions

Categories