Robert Farrell, Rajarshi Das, et al.
AAAI-SS 2010
Associative Memories (AMs) like Hopfield Networks are elegant models for describing fully recurrent neural networks that store and retrieve information. Recent theoretical advances have revealed deep connections between AMs and modern AI architectures such as Transformers and diffusion models, opening new possibilities for interpreting and designing neural networks. This tutorial provides an approachable introduction to AMs through three components: theoretical foundations from classical to Dense Associative Memories, hands-on implementations using a universal Lagrangian framework for designing AMs (including coding Transformers as AMs), and practical applications connecting AMs to kernel methods, density estimation, and deep clustering.
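To make the "store and retrieve" behavior of classical AMs concrete, here is a minimal sketch of a Hopfield network, assuming the standard Hebbian storage rule and synchronous sign updates (function names and sizes are our own, not the tutorial's code):

```python
import numpy as np

def store(patterns):
    """Hebbian weight matrix W = (1/N) * sum_mu x_mu x_mu^T, zero diagonal."""
    X = np.asarray(patterns, dtype=float)
    n = X.shape[1]
    W = X.T @ X / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def retrieve(W, probe, steps=20):
    """Iterate s <- sign(W s) until a fixed point (an attractor) is reached."""
    s = np.asarray(probe, dtype=float)
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0  # break ties deterministically
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Usage: store two random +/-1 patterns, corrupt one, recover it.
rng = np.random.default_rng(0)
p1 = rng.choice([-1.0, 1.0], size=64)
p2 = rng.choice([-1.0, 1.0], size=64)
W = store([p1, p2])
noisy = p1.copy()
flipped = rng.choice(64, size=6, replace=False)
noisy[flipped] *= -1  # flip 6 of 64 bits
recovered = retrieve(W, noisy)
```

With only two stored patterns in 64 neurons, the corrupted probe lies well inside the basin of attraction of `p1`, so the update converges back to the stored pattern.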
We provide a technical, hands-on tutorial for the general AAAI attendee, i.e., graduate students, industry researchers, faculty, and developers. We also expect the tutorial to appeal to physicists, neuroscientists, and other researchers interested in brain-inspired models of computation. From a student’s perspective, the ordered priorities of this tutorial are to: (1) build intuition, (2) understand the fundamental tooling needed to work with these systems, and (3) engage with simple proofs that highlight their elegant behavior.
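The connection between Dense Associative Memories and attention mentioned in the abstract can be sketched in a few lines: with a softmax separation function, one retrieval update has the same form as a single step of Transformer attention over the stored patterns. This is an illustrative sketch under our own naming (`beta`, `dense_retrieve`), not the tutorial's implementation:

```python
import numpy as np

def dense_retrieve(X, s, beta=1.0):
    """One Dense AM update: s <- X^T softmax(beta * X s).

    X has one stored pattern per row; this is attention with the stored
    patterns acting as both keys and values and the state s as the query.
    """
    scores = beta * (X @ s)
    scores -= scores.max()            # numerically stable softmax
    w = np.exp(scores)
    w /= w.sum()
    return X.T @ w

# Usage: retrieval of a corrupted pattern in one update.
rng = np.random.default_rng(1)
p1 = rng.choice([-1.0, 1.0], size=64)
p2 = rng.choice([-1.0, 1.0], size=64)
X = np.stack([p1, p2])
noisy = p1.copy()
noisy[rng.choice(64, size=6, replace=False)] *= -1
retrieved = dense_retrieve(X, noisy, beta=1.0)
```

Because the softmax sharply weights the stored pattern closest to the probe, a single update already lands essentially on `p1`; the exponential separation is what gives Dense AMs their large storage capacity relative to the classical model.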