no code implementations • 1 Jan 2021 • Rodolfo Palma, Alvaro Soto, Luis Martí, Nayat Sanchez-Pi
We introduce two temporal attention modules that can be plugged into traditional memory-augmented recurrent neural networks to improve their performance on natural language processing tasks.
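The abstract does not specify the modules' internals, but the general idea of temporal attention over a recurrent memory can be sketched as follows: score each stored hidden state against the current state, normalize with a softmax, and read out a weighted sum. Everything here (function name, dot-product scoring, NumPy implementation) is an illustrative assumption, not the paper's actual method.

```python
import numpy as np

def temporal_attention(memory, query):
    """Hypothetical sketch of attention over past hidden states.

    memory: (T, d) array of stored recurrent hidden states.
    query:  (d,) current hidden state used to score the memory.
    Returns a (d,) context vector: a softmax-weighted sum of memory rows.
    """
    scores = memory @ query                # (T,) dot-product relevance scores
    scores -= scores.max()                 # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum()               # softmax over the time dimension
    return weights @ memory                # (d,) weighted sum of memory rows

# Example: a memory of 4 time steps with 3-dimensional hidden states
rng = np.random.default_rng(0)
memory = rng.standard_normal((4, 3))
query = rng.standard_normal(3)
context = temporal_attention(memory, query)
print(context.shape)  # (3,)
```

In a recurrent model, the resulting context vector would typically be concatenated with (or added to) the current hidden state before the next prediction, letting the network reach back to earlier time steps directly instead of relying only on the recurrent state.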