Logic Design of Neural Networks for High-Throughput and Low-Power Applications
September 19, 2023 · Declared Dead · Asia and South Pacific Design Automation Conference
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Kangwei Xu, Grace Li Zhang, Ulf Schlichtmann, Bing Li
arXiv ID
2309.10510
Category
eess.SY: Systems & Control (EE)
Cross-listed
cs.NE
Citations
10
Venue
Asia and South Pacific Design Automation Conference
Last Checked
2 months ago
Abstract
Neural networks (NNs) have been successfully deployed in various fields. In NNs, a large number of multiply-accumulate (MAC) operations need to be performed. Most existing digital hardware platforms rely on parallel MAC units to accelerate these MAC operations. However, under a given area constraint, the number of MAC units in such platforms is limited, so MAC units have to be reused to perform MAC operations in a neural network. Accordingly, the throughput in generating classification results is not high, which prevents the application of traditional hardware platforms in extreme-throughput scenarios. Besides, the power consumption of such platforms is also high, mainly due to data movement. To overcome this challenge, in this paper, we propose to flatten and implement all the operations at neurons, e.g., MAC and ReLU, in a neural network with their corresponding logic circuits. To improve the throughput and reduce the power consumption of such logic designs, the weight values are embedded into the MAC units to simplify the logic, which can reduce the delay of the MAC units and the power consumption incurred by weight movement. The retiming technique is further used to improve the throughput of the logic circuits for neural networks. In addition, we propose a hardware-aware training method to reduce the area of logic designs of neural networks. Experimental results demonstrate that the proposed logic designs can achieve high throughput and low power consumption for several high-throughput applications.
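The key simplification the abstract describes is that once a weight is a hard-wired constant, a general multiplier is no longer needed: the product w*x collapses into a fixed pattern of shifts and additions determined by the bits of w. The sketch below illustrates this idea in Python for integer (fixed-point) weights; the function names and the plain binary (non-CSD) decomposition are illustrative assumptions, not the paper's actual circuit construction.

```python
def shift_add_terms(weight: int) -> list[int]:
    """Return the bit positions set in |weight|.

    Each set bit at position s contributes a term (x << s) to
    weight * x, so a constant weight turns a general multiply
    into a fixed set of shifted additions.
    """
    terms = []
    w = abs(weight)
    shift = 0
    while w:
        if w & 1:
            terms.append(shift)
        w >>= 1
        shift += 1
    return terms


def const_mac(weight: int, x: int, acc: int = 0) -> int:
    """MAC with an embedded constant weight: acc + weight * x,
    computed only with shifts and adds (no multiplier)."""
    sign = -1 if weight < 0 else 1
    for s in shift_add_terms(weight):
        acc += sign * (x << s)
    return acc
```

For example, a weight of 5 (binary 101) needs only two adders (x + (x << 2)), regardless of the input x; in hardware this is why embedding weights shrinks both delay and area relative to a reusable MAC unit that must handle arbitrary weights.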
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Systems & Control (EE)
Incremental Gradient, Subgradient, and Proximal Methods for Convex Optimization: A Survey
Wireless Network Design for Control Systems: A Survey
Learning-based Model Predictive Control for Safe Exploration
Safety-Critical Model Predictive Control with Discrete-Time Control Barrier Function
Novel Multidimensional Models of Opinion Dynamics in Social Networks
Died the same way – Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System