Nonparametric Modern Hopfield Models

April 05, 2024 · Entered Twilight · 🏛 International Conference on Machine Learning

💀 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: .DS_Store, DATASET.md, LICENSE, README.md, bit_pattern.ipynb, bit_pattern.py, configs, data, datasets, deeprc, efficiency_main.py, env_setup.ssh, gitignore, layers.py, layers2.py, mnist_mil_main.py, models.py, plot.py, real_world_mil.py, real_world_mil_main.py, ref, requirements.txt, results, retrieval_main.py, scripts, time_series_main.py, trainers, uni_var.py, utils

Authors: Jerry Yao-Chieh Hu, Bo-Yu Chen, Dennis Wu, Feng Ruan, Han Liu
arXiv ID: 2404.03900
Category: stat.ML: Machine Learning (Stat)
Cross-listed: cs.AI, cs.LG, cs.NE
Citations: 22
Venue: International Conference on Machine Learning
Repository: https://github.com/MAGICS-LAB/NonparametricHopfield ⭐ 10
Last Checked: 1 month ago
Abstract
We present a nonparametric interpretation for deep learning compatible modern Hopfield models and utilize this new perspective to debut efficient variants. Our key contribution stems from interpreting the memory storage and retrieval processes in modern Hopfield models as a nonparametric regression problem subject to a set of query-memory pairs. Interestingly, our framework not only recovers the known results from the original dense modern Hopfield model but also fills the void in the literature regarding efficient modern Hopfield models, by introducing sparse-structured modern Hopfield models with sub-quadratic complexity. We establish that this sparse model inherits the appealing theoretical properties of its dense analogue -- connection with transformer attention, fixed point convergence and exponential memory capacity. Additionally, we showcase the versatility of our framework by constructing a family of modern Hopfield models as extensions, including linear, random masked, top-K and positive random feature modern Hopfield models. Empirically, we validate our framework in both synthetic and realistic settings for memory retrieval and learning tasks.
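The dense retrieval dynamics the abstract connects to transformer attention can be sketched in a few lines: each step maps the query to a softmax-weighted combination of the stored patterns. The sketch below is a minimal NumPy illustration, not the repository's code; `topk_hopfield_retrieve` is our hypothetical guess at what a top-K sparse-structured variant looks like (softmax restricted to the K largest similarity scores), with all function names invented here.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def dense_hopfield_retrieve(X, xi, beta=1.0, n_steps=3):
    """Dense modern Hopfield retrieval (illustrative sketch).

    X  : (d, M) matrix whose columns are the M stored patterns.
    xi : (d,) query, iterated as xi <- X @ softmax(beta * X.T @ xi),
         the update that coincides with single-query attention.
    """
    for _ in range(n_steps):
        xi = X @ softmax(beta * (X.T @ xi))
    return xi

def topk_hopfield_retrieve(X, xi, k=2, beta=1.0, n_steps=3):
    """Hypothetical top-K sparse variant: keep only the k largest
    similarity scores, zero out the rest, and renormalize, so each
    update mixes just k of the M memories."""
    for _ in range(n_steps):
        scores = beta * (X.T @ xi)
        idx = np.argpartition(scores, -k)[-k:]  # top-k memory indices
        p = np.zeros_like(scores)
        p[idx] = softmax(scores[idx])
        xi = X @ p
    return xi
```

With a large enough inverse temperature `beta`, a noisy query collapses onto the nearest stored pattern in one or two steps, which is the fixed-point retrieval behavior the abstract refers to.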
Community shame:
Not yet rated

📜 Similar Papers

In the same crypt β€” Machine Learning (Stat)

R.I.P. 👻 Ghosted

Graph Attention Networks

Petar Veličković, Guillem Cucurull, ... (+4 more)

stat.ML πŸ› ICLR πŸ“š 24.7K cites 8 years ago
R.I.P. 👻 Ghosted

Layer Normalization

Jimmy Lei Ba, Jamie Ryan Kiros, Geoffrey E. Hinton

stat.ML πŸ› arXiv πŸ“š 12.0K cites 9 years ago