Nonparametric Modern Hopfield Models
April 05, 2024 · Entered Twilight · International Conference on Machine Learning
Repo contents: .DS_Store, DATASET.md, LICENSE, README.md, bit_pattern.ipynb, bit_pattern.py, configs, data, datasets, deeprc, efficiency_main.py, env_setup.ssh, gitignore, layers.py, layers2.py, mnist_mil_main.py, models.py, plot.py, real_world_mil.py, real_world_mil_main.py, ref, requirements.txt, results, retrieval_main.py, scripts, time_series_main.py, trainers, uni_var.py, utils
Authors
Jerry Yao-Chieh Hu, Bo-Yu Chen, Dennis Wu, Feng Ruan, Han Liu
arXiv ID
2404.03900
Category
stat.ML: Machine Learning (Stat)
Cross-listed
cs.AI, cs.LG, cs.NE
Citations
22
Venue
International Conference on Machine Learning
Repository
https://github.com/MAGICS-LAB/NonparametricHopfield
★ 10
Last Checked
1 month ago
Abstract
We present a nonparametric interpretation for deep-learning-compatible modern Hopfield models and utilize this new perspective to debut efficient variants. Our key contribution stems from interpreting the memory storage and retrieval processes in modern Hopfield models as a nonparametric regression problem subject to a set of query-memory pairs. Interestingly, our framework not only recovers the known results from the original dense modern Hopfield model but also fills the void in the literature regarding efficient modern Hopfield models by introducing sparse-structured modern Hopfield models with sub-quadratic complexity. We establish that this sparse model inherits the appealing theoretical properties of its dense analogue: connection with transformer attention, fixed-point convergence, and exponential memory capacity. Additionally, we showcase the versatility of our framework by constructing a family of modern Hopfield models as extensions, including linear, random-masked, top-K, and positive random feature modern Hopfield models. Empirically, we validate our framework in both synthetic and realistic settings for memory retrieval and learning tasks.
Similar Papers
In the same crypt · Machine Learning (Stat)
Distilling the Knowledge in a Neural Network · R.I.P. 👻 Ghosted
Layer Normalization · R.I.P. 👻 Ghosted
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning · R.I.P. 👻 Ghosted
Domain-Adversarial Training of Neural Networks · R.I.P. 👻 Ghosted