Stochastic Weight Matrix-based Regularization Methods for Deep Neural Networks

September 26, 2019 · Entered Twilight · 🏛 International Conference on Machine Learning, Optimization, and Data Science

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 6.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, LICENSE, README.md, datasets, results, src

Authors: Patrik Reizinger, Bálint Gyires-Tóth
arXiv ID: 1909.11977
Category: cs.LG (Machine Learning)
Cross-listed: cs.NE
Citations: 2
Venue: International Conference on Machine Learning, Optimization, and Data Science
Repository: https://github.com/rpatrik96/lod-wmm-2019 (⭐ 3)
Last Checked: 1 month ago
Abstract
The aim of this paper is to introduce two widely applicable regularization methods based on the direct modification of weight matrices. The first method, Weight Reinitialization, uses a simplified Bayesian assumption and partially resets a sparse subset of the parameters. The second, Weight Shuffling, introduces an entropy- and weight-distribution-invariant non-white noise to the parameters; it can also be interpreted as an ensemble approach. The proposed methods are evaluated on benchmark datasets such as MNIST, CIFAR-10, and the JSB Chorales database, as well as on time series modeling tasks. We report gains in both the performance and the entropy of the analyzed networks. We also made our code available as a GitHub repository (https://github.com/rpatrik96/lod-wmm-2019).
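The abstract describes the two operations only at a high level, so below is a minimal sketch of what they could look like in PyTorch: reset a sparse subset of a weight matrix to freshly drawn values, and permute a subset of entries so that the value distribution (and hence its entropy) is preserved. Everything here is an assumption for illustration; the function names, the Kaiming re-initialization, and the reset_prob / shuffle_frac values are placeholders, not the exact procedure or hyperparameters from the paper or its repository.

```python
import torch

@torch.no_grad()
def weight_reinitialization(weight: torch.Tensor, reset_prob: float = 0.01) -> None:
    """Reset a sparse, randomly chosen subset of entries of a 2-D weight matrix
    by overwriting them with freshly initialized values (illustrative scheme)."""
    mask = torch.rand_like(weight) < reset_prob      # sparse subset to reset
    fresh = torch.empty_like(weight)
    torch.nn.init.kaiming_uniform_(fresh)            # placeholder re-init choice
    weight[mask] = fresh[mask]

@torch.no_grad()
def weight_shuffling(weight: torch.Tensor, shuffle_frac: float = 0.05) -> None:
    """Permute a small, randomly chosen subset of entries among themselves.
    The values are only rearranged, so the weight distribution stays unchanged;
    the perturbation acts as non-white noise on the matrix."""
    flat = weight.view(-1)
    n = max(2, int(shuffle_frac * flat.numel()))
    idx = torch.randperm(flat.numel())[:n]           # positions to shuffle
    flat[idx] = flat[idx[torch.randperm(n)]]         # permute the selected values

# Hypothetical usage: perturb each 2-D weight matrix every few training steps.
# for name, p in model.named_parameters():
#     if p.dim() == 2:
#         weight_reinitialization(p)   # or weight_shuffling(p)
```

Both functions modify the weights in place under torch.no_grad(), the same way an optimizer step would; how often they are applied, and to which layers, is a design choice not specified by the abstract.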