daBNN: A Super Fast Inference Framework for Binary Neural Networks on ARM devices

August 16, 2019 · Entered Twilight · 🏛 ACM Multimedia

🌅 TWILIGHT: Old Age
Predates the code-sharing era, a pioneer of its time

"Last commit was 6.0 years ago (≥5-year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .clang-format, .daq_pm, .gitignore, .gitmodules, CMakeLists.txt, LICENSE, README.md, README_CN.md, benchmark, binaries, ci, cmake, common, dabnn, docs, images, tests, third_party, tools

Authors: Jianhao Zhang, Yingwei Pan, Ting Yao, He Zhao, Tao Mei
arXiv ID: 1908.05858
Category: cs.CV (Computer Vision)
Cross-listed: cs.MM, eess.IV
Citations: 69
Venue: ACM Multimedia
Repository: https://github.com/JDAI-CV/dabnn ⭐ 778
Last checked: 1 month ago
Abstract
It is widely believed that Binary Neural Networks (BNNs) can drastically accelerate inference by replacing the arithmetic operations of float-valued Deep Neural Networks (DNNs) with bit-wise operations. Nevertheless, there has been no open-source implementation supporting this idea on low-end ARM devices (e.g., mobile phones and embedded devices). In this work, we propose daBNN, a super fast inference framework that implements BNNs on ARM devices. Several speed-up and memory-refinement strategies for bit-packing, binarized convolution, and memory layout are uniquely devised to enhance inference efficiency. Compared to BMXNet, the recent open-source BNN inference framework, our daBNN is 7×~23× faster on a single binary convolution and about 6× faster on Bi-Real Net 18 (a BNN variant of ResNet-18). daBNN is a BSD-licensed inference framework; its source code, sample projects, and pre-trained models are available online: https://github.com/JDAI-CV/dabnn.
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Computer Vision