Continuous Dropout

November 28, 2019 · Entered Twilight · 🏛 IEEE Transactions on Neural Networks and Learning Systems

🌅 TWILIGHT: Old Age
Predates the code-sharing era – a pioneer of its time

"Last commit was 9.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .Doxyfile, .gitignore, .travis.yml, CMakeLists.txt, CONTRIBUTING.md, CONTRIBUTORS.md, INSTALL.md, LICENSE, Makefile, Makefile.config.example, README.md, build-windows, caffe.cloc, cmake, data, docker, docs, examples, include, matlab, models, python, scripts, src, test, tools

Authors: Xu Shen, Xinmei Tian, Tongliang Liu, Fang Xu, Dacheng Tao
arXiv ID: 1911.12675
Category: cs.CV (Computer Vision)
Cross-listed: cs.LG, cs.NE, stat.ML
Citations: 71
Venue: IEEE Transactions on Neural Networks and Learning Systems
Repository: https://github.com/jasonustc/caffe-multigpu/tree/dropout ⭐ 13
Last Checked: 1 month ago
Abstract
Dropout has been proven to be an effective algorithm for training robust deep networks because of its ability to prevent overfitting by avoiding the co-adaptation of feature detectors. Current explanations of dropout include bagging, naive Bayes, regularization, and the role of sex in evolution. Observations of activation patterns in the human brain show that, when faced with different situations, neurons fire at rates that are random and continuous, not binary as in current dropout. Inspired by this phenomenon, we extend traditional binary dropout to continuous dropout. On the one hand, continuous dropout is considerably closer to the activation characteristics of neurons in the human brain than traditional binary dropout. On the other hand, we demonstrate that continuous dropout retains the property of avoiding the co-adaptation of feature detectors, which suggests that we can extract more independent feature detectors for model averaging in the test stage. We introduce the proposed continuous dropout to a feedforward neural network and comprehensively compare it with binary dropout, adaptive dropout, and DropConnect on MNIST, CIFAR-10, SVHN, NORB, and ILSVRC-12. Thorough experiments demonstrate that our method performs better in preventing the co-adaptation of feature detectors and improves test performance. The code is available at: https://github.com/jasonustc/caffe-multigpu/tree/dropout.
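The core idea in the abstract (replacing the binary Bernoulli dropout mask with a mask drawn from a continuous distribution, then using the mask's expectation for model averaging at test time) can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the linked Caffe repository: the function name, parameters, and the choice of uniform U(0, 1) and Gaussian mask distributions are our assumptions based on the abstract's description.

```python
import numpy as np

def continuous_dropout(x, rng=None, dist="uniform", mu=0.5, sigma=0.2, train=True):
    """Illustrative sketch of continuous dropout (names/parameters are assumptions).

    Instead of a binary 0/1 mask, each unit's activation is multiplied by a
    random rate drawn from a continuous distribution, loosely mimicking the
    continuous firing rates of biological neurons.
    """
    if not train:
        # Test-time model averaging: scale by the mask's expected value.
        mean = 0.5 if dist == "uniform" else mu
        return x * mean
    rng = rng or np.random.default_rng()
    if dist == "uniform":
        r = rng.uniform(0.0, 1.0, size=x.shape)   # mask ~ U(0, 1), mean 0.5
    else:
        r = rng.normal(mu, sigma, size=x.shape)   # mask ~ N(mu, sigma^2)
    return x * r
```

At test time the uniform variant simply halves the activations (its mask mean is 0.5), analogous to the rescaling step in standard binary dropout.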
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Computer Vision