Investigating Generalization in Neural Networks under Optimally Evolved Training Perturbations

March 14, 2020 · Declared Dead · 🏛 IEEE International Conference on Acoustics, Speech, and Signal Processing

📜 CAUSE OF DEATH: Death by README
Repo has only a README

Repo contents: README.md

Authors: Subhajit Chaudhury, Toshihiko Yamasaki
arXiv ID: 2003.06646
Category: cs.LG: Machine Learning
Cross-listed: cs.CR, cs.CV, eess.IV, stat.ML
Citations: 3
Venue: IEEE International Conference on Acoustics, Speech, and Signal Processing
Repository: https://github.com/subhajitchaudhury/evo-shift (⭐ 2)
Last Checked: 1 month ago
Abstract
In this paper, we study the generalization properties of neural networks under input perturbations and show that minimal training-data corruption, through modification of only a few pixels, can cause drastic overfitting. We propose an evolutionary algorithm that searches for optimal pixel perturbations using a novel cost function, inspired by the domain adaptation literature, that explicitly maximizes the generalization gap and the domain divergence between clean and corrupted images. Our method outperforms previous pixel-based data distribution shift methods on state-of-the-art Convolutional Neural Network (CNN) architectures. Interestingly, we find that the choice of optimizer plays an important role in generalization robustness: empirically, SGD is resilient to such training-data corruption, unlike adaptive optimization techniques such as Adam. Our source code is available at https://github.com/subhajitchaudhury/evo-shift.
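
Since the repository ships only a README, here is a minimal, hypothetical sketch of what an evolutionary search over a handful of pixel offsets could look like. Every name in it (`evolve`, `mutate`, `apply_perturbation`, `fitness_fn`) is an assumption rather than the authors' code, and the placeholder fitness in the demo (mean brightness) merely stands in for the paper's actual objective, which combines the generalization gap with a domain-divergence term and would require retraining networks on the corrupted data.

```python
import numpy as np


def apply_perturbation(images, perturbation):
    """Add the evolved per-pixel offsets to every training image."""
    corrupted = images.copy()
    for y, x, ch, delta in perturbation:
        corrupted[:, y, x, ch] = np.clip(corrupted[:, y, x, ch] + delta, 0.0, 1.0)
    return corrupted


def mutate(perturbation, image_shape, sigma=0.1, rng=None):
    """Re-draw the location of one perturbed pixel and nudge its offset."""
    rng = rng if rng is not None else np.random.default_rng()
    h, w, c = image_shape
    child = [list(p) for p in perturbation]
    i = int(rng.integers(len(child)))
    new_delta = float(np.clip(child[i][3] + rng.normal(0.0, sigma), -1.0, 1.0))
    child[i] = [int(rng.integers(h)), int(rng.integers(w)),
                int(rng.integers(c)), new_delta]
    return child


def evolve(images, fitness_fn, n_pixels=5, generations=50, seed=0):
    """(1+1)-style evolutionary search that keeps the fittest perturbation."""
    rng = np.random.default_rng(seed)
    h, w, c = images.shape[1:]
    parent = [[int(rng.integers(h)), int(rng.integers(w)), int(rng.integers(c)),
               float(rng.uniform(-1.0, 1.0))] for _ in range(n_pixels)]
    best_fit = fitness_fn(apply_perturbation(images, parent))
    for _ in range(generations):
        child = mutate(parent, (h, w, c), rng=rng)
        fit = fitness_fn(apply_perturbation(images, child))
        if fit > best_fit:
            parent, best_fit = child, fit
    return parent, best_fit


if __name__ == "__main__":
    # Toy demo: random "images" and a placeholder fitness (mean brightness).
    # A real fitness would retrain a CNN on the corrupted set and measure
    # the train/test gap plus domain divergence, which is too costly to show here.
    imgs = np.random.rand(64, 32, 32, 3).astype(np.float32)
    pert, score = evolve(imgs, fitness_fn=lambda x: float(x.mean()), generations=20)
    print("best perturbation:", pert, "score:", score)
```

The (1+1) scheme is chosen only to keep the sketch short; a population-based strategy with crossover would fit the paper's description at least as well.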
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Machine Learning

Died the same way – 📜 Death by README