Benchmarks

No evaluation results yet. Help compare methods by submitting evaluation metrics.

Greatest papers with code

Hopfield Networks is All You Need

ICLR 2021 ml-jku/hopfield-layers

The new update rule is equivalent to the attention mechanism used in transformers (see the sketch below this entry).

IMMUNE REPERTOIRE CLASSIFICATION · MULTIPLE INSTANCE LEARNING
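
To make the stated equivalence concrete, here is a minimal PyTorch sketch of the continuous Hopfield update rule described in the paper. It is written for illustration only and is not code from ml-jku/hopfield-layers; the function name and the choice beta = 1/sqrt(d) (which matches the usual attention scaling) are assumptions for the example.

```python
# Minimal sketch: one retrieval step of a modern (continuous) Hopfield network,
# xi_new = X^T softmax(beta * X xi). With Q = query pattern and K = V = stored
# patterns, this is exactly transformer attention softmax(Q K^T / sqrt(d)) V.
import torch

def hopfield_update(stored, query, beta):
    """stored: (N, d) matrix whose rows are stored patterns.
    query:  (d,) current state / query pattern xi.
    beta:   inverse temperature; beta = 1/sqrt(d) recovers attention scaling."""
    scores = beta * stored @ query          # (N,) similarity of xi to each stored pattern
    weights = torch.softmax(scores, dim=0)  # attention weights over stored patterns
    return stored.T @ weights               # new state: convex combination of patterns

d, N = 64, 16
X = torch.randn(N, d)                 # stored patterns
xi = X[3] + 0.1 * torch.randn(d)      # noisy query near pattern 3
retrieved = hopfield_update(X, xi, beta=1.0 / d ** 0.5)
# For well-separated patterns, a single update concentrates the weights on row 3,
# i.e. the noisy query retrieves the stored pattern in one step.
```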

Modern Hopfield Networks and Attention for Immune Repertoire Classification

NeurIPS 2020 ml-jku/DeepRC

We show that the attention mechanism of transformer architectures is actually the update rule of modern Hopfield networks, which can store exponentially many patterns (see the pooling sketch below this entry).

IMMUNE REPERTOIRE CLASSIFICATION · INTERPRETABLE MACHINE LEARNING · MULTIPLE INSTANCE LEARNING
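
As an illustration of attention-based pooling for multiple instance learning in the spirit of this paper, the sketch below treats a repertoire as a bag of sequence embeddings and lets a learned query attend over the bag to produce one repertoire-level prediction. This is not the ml-jku/DeepRC implementation; the module name, layer choices, and dimensions are placeholders.

```python
# Illustrative sketch: attention pooling over a bag of instances (sequences),
# yielding one bag-level prediction plus interpretable per-instance weights.
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Parameter(torch.randn(dim))  # learned query pattern
        self.key = nn.Linear(dim, dim)               # projects instances to keys
        self.classifier = nn.Linear(dim, 1)          # bag-level logit (e.g. immune status)

    def forward(self, bag):
        # bag: (num_instances, dim) embeddings of the sequences in one repertoire
        keys = self.key(bag)                                # (N, dim)
        scores = keys @ self.query / bag.shape[-1] ** 0.5   # (N,) attention logits
        weights = torch.softmax(scores, dim=0)              # relevance of each sequence
        pooled = weights @ bag                              # (dim,) repertoire embedding
        return self.classifier(pooled), weights             # prediction + attention weights

model = AttentionPooling(dim=32)
repertoire = torch.randn(1000, 32)   # a bag of 1000 sequence embeddings
logit, attn = model(repertoire)      # attn indicates which sequences drove the prediction
```

The returned attention weights are what makes this style of pooling attractive for interpretability: only a few sequences in a large repertoire are expected to be relevant, and their weights can be inspected directly.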