R.I.P.
👻
Ghosted
Efficient and Accurate Conversion of Spiking Neural Network with Burst Spikes
April 28, 2022 · Entered Twilight · International Joint Conference on Artificial Intelligence
Repo contents: CIFAR100_VGG16.py, Conversion_error.jpg, LICENSE, README.md, converted_CIFAR100_vgg.py, utils.py
Authors
Yang Li, Yi Zeng
arXiv ID
2204.13271
Category
cs.NE: Neural and Evolutionary Computing
Cross-listed
cs.AI
Citations
67
Venue
International Joint Conference on Artificial Intelligence
Repository
https://github.com/Brain-Inspired-Cognitive-Engine/Conversion_Burst
⭐ 20
Last Checked
1 month ago
Abstract
Spiking neural networks (SNNs), as brain-inspired, energy-efficient neural networks, have attracted the interest of researchers, but their training remains an open problem. One effective approach is to map the weights of a trained ANN onto an SNN to achieve high inference performance. However, the converted SNN often suffers from performance degradation and considerable time delay. To speed up inference and obtain higher accuracy, we theoretically analyze the errors in the conversion process from three perspectives: the difference between IF and ReLU activations, the time dimension, and the pooling operation. We propose a neuron model that releases burst spikes, a cheap but highly efficient mechanism for discharging residual information. In addition, Lateral Inhibition Pooling (LIPooling) is proposed to solve the inaccuracy caused by MaxPooling in the conversion process. Experimental results on CIFAR and ImageNet demonstrate that our algorithm is both efficient and accurate. For example, our method achieves nearly lossless conversion while using only about 1/10 of the simulation time (fewer than 100 steps) and 0.693$\times$ the energy consumption of the typical method. Our code is available at https://github.com/Brain-Inspired-Cognitive-Engine/Conversion_Burst.
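The core idea the abstract describes, an IF neuron that may emit a burst of several spikes in a single timestep to release residual membrane potential, can be sketched as follows. This is a minimal illustration of the burst-spike intuition only, not the authors' implementation; the function name, threshold, and burst limit are assumptions for the example.

```python
def if_neuron_burst(inputs, threshold=1.0, max_spikes_per_step=4):
    """Integrate-and-fire neuron that may emit a burst of spikes per timestep.

    Illustrative sketch: allowing up to `max_spikes_per_step` spikes per step
    lets the neuron discharge residual membrane potential much faster than a
    one-spike-per-step IF neuron, reducing the conversion error that residual
    information causes at short simulation times.
    """
    v = 0.0                  # membrane potential
    spike_counts = []        # spikes emitted at each timestep
    for x in inputs:
        v += x               # integrate weighted input for this step
        if v >= threshold:
            # Emit as many spikes as the potential covers, capped by the burst limit.
            n = min(int(v // threshold), max_spikes_per_step)
        else:
            n = 0
        v -= n * threshold   # "soft reset": subtract only what was emitted
        spike_counts.append(n)
    return spike_counts

# With an input of 2.5 in one step, a plain IF neuron fires once and keeps
# 1.5 as residual; a burst neuron fires twice, leaving only 0.5 behind.
```

The soft reset (subtracting `n * threshold` rather than zeroing `v`) is the standard choice in ANN-to-SNN conversion, since it preserves sub-threshold information across timesteps.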
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: Neural and Evolutionary Computing
R.I.P.
👻
Ghosted
Progressive Growing of GANs for Improved Quality, Stability, and Variation
R.I.P.
👻
Ghosted
Learning both Weights and Connections for Efficient Neural Networks
R.I.P.
👻
Ghosted
LSTM: A Search Space Odyssey
R.I.P.
👻
Ghosted
A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks