A Taxonomy for Neural Memory Networks

1 May 2018 · Ying Ma, Jose Principe

In this paper, a taxonomy for memory networks is proposed based on their memory organization. The taxonomy includes all the popular memory networks: the vanilla recurrent neural network (RNN), long short-term memory (LSTM), the neural stack, the neural Turing machine, and their variants...
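A minimal sketch (an illustration added here, not taken from the paper) of the memory-organization contrast between the two simplest members of the taxonomy: a vanilla RNN cell, whose entire hidden state is overwritten at every step, and an LSTM cell, whose gated cell state is edited selectively. All function names, shapes, and the NumPy formulation are assumptions for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rnn_step(x, h, Wx, Wh, b):
    # Vanilla RNN: the whole memory vector h is rewritten at every step.
    return np.tanh(x @ Wx + h @ Wh + b)

def lstm_step(x, h, c, W, b):
    # LSTM: a separate cell state c is edited through input/forget/output gates,
    # so information can persist selectively across many steps.
    z = np.concatenate([x, h]) @ W + b      # fused projection for all four gates
    i, f, o, g = np.split(z, 4)
    i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
    c_new = f * c + i * g                   # gated update of the long-term memory
    h_new = o * np.tanh(c_new)              # short-term state exposed at this step
    return h_new, c_new

# Toy usage with random parameters (input size 3, hidden size 4).
rng = np.random.default_rng(0)
x, h, c = rng.standard_normal(3), np.zeros(4), np.zeros(4)
h_rnn = rnn_step(x, np.zeros(4), rng.standard_normal((3, 4)),
                 rng.standard_normal((4, 4)), np.zeros(4))
h, c = lstm_step(x, h, c, rng.standard_normal((7, 16)), np.zeros(16))
```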


Methods used in the Paper


METHOD                     TYPE
Softmax                    Output Functions
Sigmoid Activation         Activation Functions
Tanh Activation            Activation Functions
Neural Turing Machine      Working Memory Models
Memory Network             Working Memory Models
Location-based Attention   Attention Mechanisms
Content-based Attention    Attention Mechanisms
LSTM                       Recurrent Neural Networks
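For concreteness, a short sketch (added here, not taken from the paper) of two of the listed methods: the Softmax output function and a content-based attention read over an external memory, in the general style of the neural Turing machine and memory networks. The cosine-similarity scoring, the sharpening parameter beta, and all variable names are assumptions.

```python
import numpy as np

def softmax(z):
    # Softmax output function: exponentiate and normalize (shifted for stability).
    e = np.exp(z - np.max(z))
    return e / e.sum()

def content_based_read(memory, key, beta=1.0):
    # memory: (N, D) array of N slots; key: (D,) query emitted by a controller.
    # Content-based attention: softmax over sharpened cosine similarities,
    # followed by a weighted read of the memory slots.
    sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    weights = softmax(beta * sims)
    return weights @ memory

# Toy usage: read from a 5-slot memory with 8-dimensional slots.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 8))
print(content_based_read(M, M[2], beta=5.0))   # should resemble slot 2
```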