A Mathematical Formalization of Hierarchical Temporal Memory's Spatial Pooler
January 22, 2016 · Entered Twilight · Frontiers in Robotics and AI
"Last commit was 8.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitattributes, .gitignore, LICENSE.txt, README.md, dev, docs, epydoc_config.txt, requirements.txt, setup.py, src
Authors
James Mnatzaganian, Ernest Fokoué, Dhireesha Kudithipudi
arXiv ID
1601.06116
Category
stat.ML: Machine Learning (Stat)
Cross-listed
cs.LG, q-bio.NC
Citations
38
Venue
Frontiers in Robotics and AI
Repository
https://github.com/tehtechguy/mHTM
★ 25
Last Checked
1 month ago
Abstract
Hierarchical temporal memory (HTM) is an emerging machine learning algorithm, with the potential to provide a means to perform predictions on spatiotemporal data. The algorithm, inspired by the neocortex, currently does not have a comprehensive mathematical framework. This work brings together all aspects of the spatial pooler (SP), a critical learning component in HTM, under a single unifying framework. The primary learning mechanism is explored, where a maximum likelihood estimator for determining the degree of permanence update is proposed. The boosting mechanisms are studied and found to be only relevant during the initial few iterations of the network. Observations are made relating HTM to well-known algorithms such as competitive learning and attribute bagging. Methods are provided for using the SP for classification as well as dimensionality reduction. Empirical evidence verifies that given the proper parameterizations, the SP may be used for feature learning.
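The abstract describes the SP's core loop: columns compute overlap with a binary input, a competitive inhibition step picks winners, and the winners' synaptic permanences receive a Hebbian-style update. As a rough illustration of that mechanism only, here is a minimal sketch; all names, sizes, and update constants are illustrative assumptions, not the paper's parameterization or its maximum-likelihood update rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and constants (assumptions, not the paper's values).
n_inputs, n_columns = 100, 50
n_active = 5            # winning columns kept after global inhibition
perm_threshold = 0.5    # permanence level at which a synapse is "connected"
perm_inc, perm_dec = 0.03, 0.01  # permanence increment / decrement

# Every column holds a potential synapse to every input, with a random permanence.
permanences = rng.uniform(0.0, 1.0, size=(n_columns, n_inputs))

def spatial_pooler_step(x, permanences):
    """One SP iteration: overlap, inhibition, Hebbian permanence update."""
    connected = (permanences >= perm_threshold).astype(int)
    overlaps = connected @ x                    # overlap score per column
    winners = np.argsort(overlaps)[-n_active:]  # global inhibition: top-k columns
    # Winning columns reinforce synapses on active inputs, weaken the rest.
    delta = np.where(x.astype(bool), perm_inc, -perm_dec)
    permanences[winners] = np.clip(permanences[winners] + delta, 0.0, 1.0)
    y = np.zeros(n_columns, dtype=int)
    y[winners] = 1
    return y

x = (rng.random(n_inputs) < 0.2).astype(int)    # sparse binary input vector
y = spatial_pooler_step(x, permanences)
```

The sparse binary output `y` is what the abstract refers to when it discusses using the SP as a feature learner: `y` can be fed to a downstream classifier or treated as a reduced-dimensionality encoding of `x`.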
Similar Papers
In the same crypt: Machine Learning (Stat)
Distilling the Knowledge in a Neural Network · 👻 Ghosted
Layer Normalization · 👻 Ghosted
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning · 👻 Ghosted
Domain-Adversarial Training of Neural Networks · 👻 Ghosted