Learning in Memristive Neural Network Architectures using Analog Backpropagation Circuits

August 31, 2018 · Declared Dead · 🏛 IEEE Transactions on Circuits and Systems I: Regular Papers

👻 CAUSE OF DEATH: Ghosted
No code link whatsoever

"No code URL or promise found in abstract"

Evidence collected by the PWNC Scanner

Authors: Olga Krestinskaya, Khaled Nabil Salama, Alex Pappachen James
arXiv ID: 1808.10631
Category: cs.ET (Emerging Technologies)
Cross-listed: cs.AI
Citations: 125
Venue: IEEE Transactions on Circuits and Systems I: Regular Papers
Last Checked: 1 month ago
Abstract
The on-chip implementation of learning algorithms would speed up the training of neural networks in crossbar arrays. The circuit-level design and implementation of the backpropagation algorithm using gradient descent for neural network architectures remains an open problem. In this paper, we propose analog backpropagation learning circuits for various memristive learning architectures, such as Deep Neural Networks (DNN), Binary Neural Networks (BNN), Multiple Neural Networks (MNN), Hierarchical Temporal Memory (HTM), and Long Short-Term Memory (LSTM). The circuit design and verification are done using TSMC 180 nm CMOS process models and TiO2-based memristor models. The application-level validation of the system is done using the XOR problem and the MNIST character and Yale face image databases.
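Since no code was ever released, here is a minimal software sketch of the gradient-descent backpropagation update that the paper maps onto analog circuits, applied to the XOR problem the abstract mentions. The network size (2-2-1), learning rate, and initialization are illustrative assumptions, not the authors' circuit parameters.

```python
# Sketch: backpropagation with gradient descent on XOR, the operation the
# paper realizes in analog hardware. All hyperparameters are assumptions.
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2-2-1 multilayer perceptron; the weights stand in for memristor conductances.
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden layer
b1 = [0.0, 0.0]
w2 = [random.uniform(-1, 1) for _ in range(2)]                      # output neuron
b2 = 0.0
lr = 0.5

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR truth table

def forward(x):
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(2)) + b2)
    return h, y

def train_step(x, t):
    global b2
    h, y = forward(x)
    # Output-layer error term: derivative of squared error through the sigmoid.
    dy = (y - t) * y * (1 - y)
    # Backpropagate the error to the hidden layer.
    dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(2)]
    # Gradient-descent updates -- in the paper these are analog weight changes.
    for j in range(2):
        w2[j] -= lr * dy * h[j]
        for i in range(2):
            w1[j][i] -= lr * dh[j] * x[i]
        b1[j] -= lr * dh[j]
    b2 -= lr * dy

def epoch_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

before = epoch_loss()
for _ in range(5000):
    for x, t in data:
        train_step(x, t)
after = epoch_loss()
print(f"squared error: {before:.3f} -> {after:.3f}")
```

The squared error over the four XOR patterns drops as training proceeds; the analog circuits in the paper implement the same delta computations and weight updates with op-amp and memristor building blocks rather than arithmetic.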
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Emerging Technologies

Died the same way – 👻 Ghosted