Hopfield Networks is All You Need

16 Jul 2020 · Hubert Ramsauer, Bernhard Schäfl, Johannes Lehner, Philipp Seidl, Michael Widrich, Lukas Gruber, Markus Holzleitner, Milena Pavlović, Geir Kjetil Sandve, Victor Greiff, David Kreil, Michael Kopp, Günter Klambauer, Johannes Brandstetter, Sepp Hochreiter

We show that the transformer attention mechanism is the update rule of a modern Hopfield network with continuous states. This new Hopfield network can store exponentially (with the dimension) many patterns, converges with one update, and has exponentially small retrieval errors...
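The update rule identified with attention can be sketched numerically: given a matrix of stored patterns X (one pattern per column) and a state (query) vector ξ, the new state is X · softmax(β Xᵀξ). This is a minimal NumPy illustration of that rule; the inverse temperature β and the specific pattern dimensions are illustrative choices, not values from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D array.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hopfield_update(X, xi, beta=8.0):
    """One update of a continuous modern Hopfield network.

    X    : (d, N) matrix whose columns are the N stored patterns.
    xi   : (d,) state (query) vector.
    beta : inverse temperature (illustrative value, not from the paper).
    Returns the new state X @ softmax(beta * X.T @ xi).
    """
    return X @ softmax(beta * (X.T @ xi))

# Store a few random patterns and retrieve one from a noisy cue;
# with well-separated patterns, a single update suffices.
rng = np.random.default_rng(0)
d, N = 64, 5
X = rng.standard_normal((d, N))
xi = X[:, 2] + 0.1 * rng.standard_normal(d)  # noisy version of pattern 2
retrieved = hopfield_update(X, xi)
```

With random Gaussian patterns the stored vectors are nearly orthogonal, so the softmax concentrates almost all weight on the closest pattern and `retrieved` matches `X[:, 2]` after one update, consistent with the one-step convergence claim above.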
