Training Competitive Binary Neural Networks from Scratch

December 05, 2018 · Entered Twilight · 🏛 arXiv.org

🌅 TWILIGHT: Old Age
Predates the code-sharing era — a pioneer of its time

"Last commit was 5.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .clang-tidy, .codecov.yml, .gitattributes, .github, .gitignore, .gitlab-ci.yml, .gitmodules, .mxnet_root, .travis.yml, 3rdparty, CHANGELOG.md, CMakeLists.txt, CODEOWNERS, CONTRIBUTORS.md, DISCLAIMER, KEYS, LICENSE, MKLDNN_README.md, Makefile, NEWS.md, NOTICE, R-package, README.md, amalgamation, appveyor.yml, benchmark, ci, cmake, contrib, cpp-package, dev_menu.py, docker, docs, example, include, julia, make, matlab, mkldnn.mk, perl-package, plugin, python, readthedocs.yml, scala-package, setup-utils, snap.python, snapcraft.yaml, src, tests, tools

Authors: Joseph Bethge, Marvin Bornstein, Adrian Loy, Haojin Yang, Christoph Meinel
arXiv ID: 1812.01965
Category: cs.LG: Machine Learning
Cross-listed: cs.CV, stat.ML
Citations: 33
Venue: arXiv.org
Repository: https://github.com/hpi-xnor/BMXNet-v2 ⭐ 232
Last Checked: 1 month ago
Abstract
Convolutional neural networks have achieved astonishing results in different application areas. Various methods that allow us to use these models on mobile and embedded devices have been proposed. Binary neural networks in particular are a promising approach for devices with low computational power. However, training accurate binary models from scratch remains a challenge. Previous work often uses prior knowledge from full-precision models and complex training strategies. In our work, we focus on increasing the performance of binary neural networks without such prior knowledge, using a much simpler training strategy. In our experiments we show that we are able to achieve state-of-the-art results on standard benchmark datasets. Further, to the best of our knowledge, we are the first to successfully adopt a network architecture with dense connections for binary networks, which lets us improve the state-of-the-art even further.
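The abstract's central idea is a network whose weights (and activations) are constrained to {-1, +1}. A common training scheme for such networks, not necessarily the authors' exact formulation, binarizes with a sign function on the forward pass and uses a straight-through estimator (STE) on the backward pass, passing gradients through where the full-precision weight lies in [-1, 1] and zeroing them elsewhere. A minimal NumPy sketch of that scheme:

```python
import numpy as np

def binarize(w):
    # Forward pass: map full-precision weights to {-1, +1}.
    # By convention here, sign(0) is taken as +1.
    return np.where(w >= 0.0, 1.0, -1.0)

def ste_grad(w, grad_out):
    # Backward pass: straight-through estimator.
    # The gradient flows unchanged where |w| <= 1 and is
    # clipped to zero elsewhere (sign() itself has zero
    # gradient almost everywhere, so we approximate it).
    return grad_out * (np.abs(w) <= 1.0)

w = np.array([-1.7, -0.3, 0.0, 0.8, 2.5])
print(binarize(w))                    # [-1. -1.  1.  1.  1.]
print(ste_grad(w, np.ones_like(w)))   # [0. 1. 1. 1. 0.]
```

During training, the full-precision weights are kept and updated with the STE gradients; only the binarized copies are used in the forward computation.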
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Machine Learning