R.I.P.
👻
Ghosted
NeuroEvoBench: Benchmarking Evolutionary Optimizers for Deep Learning Applications
November 04, 2023 · Entered Twilight · Neural Information Processing Systems
Repo contents: .github, .gitignore, LICENSE, MANIFEST.in, Readme.md, docs, examples, neuroevobench, setup.py, tests
Authors
Robert Tjarko Lange, Yujin Tang, Yingtao Tian
arXiv ID
2311.02394
Category
cs.NE: Neural & Evolutionary
Cross-listed
cs.LG
Citations
4
Venue
Neural Information Processing Systems
Repository
https://github.com/neuroevobench/neuroevobench
⭐ 42
Last Checked
1 month ago
Abstract
Recently, the Deep Learning community has become interested in evolutionary optimization (EO) as a means to address hard optimization problems, e.g. meta-learning through long inner loop unrolls or optimizing non-differentiable operators. One core reason for this trend has been the recent innovation in hardware acceleration and compatible software - making distributed population evaluations much easier than before. Unlike for gradient descent-based methods though, there is a lack of hyperparameter understanding and best practices for EO - arguably due to severely less 'graduate student descent' and benchmarking being performed for EO methods. Additionally, classical benchmarks from the evolutionary community provide few practical insights for Deep Learning applications. This poses challenges for newcomers to hardware-accelerated EO and hinders significant adoption. Hence, we establish a new benchmark of EO methods (NeuroEvoBench) tailored toward Deep Learning applications and exhaustively evaluate traditional and meta-learned EO. We investigate core scientific questions including resource allocation, fitness shaping, normalization, regularization & scalability of EO. The benchmark is open-sourced at https://github.com/neuroevobench/neuroevobench under Apache-2.0 license.
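To make the abstract's setting concrete, here is a minimal sketch of the kind of evolutionary optimizer the benchmark evaluates: a simple Gaussian evolution strategy with rank-based fitness shaping (one of the design choices the paper investigates). This is an illustrative toy, not NeuroEvoBench's actual API; the function names (`rank_shape`, `simple_es`) and all hyperparameter values are assumptions for the example.

```python
import numpy as np

def rank_shape(fitness):
    # Rank-based fitness shaping: replace raw fitness with centered
    # ranks in [-0.5, 0.5] (best candidate -> 0.5, worst -> -0.5).
    ranks = np.argsort(np.argsort(-fitness))  # rank 0 = highest fitness
    n = len(fitness)
    return 0.5 - ranks / (n - 1)

def simple_es(objective, dim, popsize=32, sigma=0.1, lr=0.05,
              generations=200, seed=0):
    # Illustrative Gaussian ES; hyperparameters are arbitrary, not from the paper.
    rng = np.random.default_rng(seed)
    mean = rng.normal(size=dim)
    for _ in range(generations):
        # Sample a population of perturbations around the current mean.
        noise = rng.normal(size=(popsize, dim))
        candidates = mean + sigma * noise
        # In practice this evaluation loop is what gets hardware-accelerated
        # and distributed across devices.
        fitness = np.array([objective(c) for c in candidates])
        shaped = rank_shape(fitness)
        # Stochastic gradient estimate from shaped fitness (OpenAI-ES style).
        mean = mean + lr / (popsize * sigma) * noise.T @ shaped
    return mean

# Toy non-differentiable-friendly objective: maximize -||x||^2 (optimum at 0).
best = simple_es(lambda x: -np.sum(x**2), dim=5)
```

Because the update only consumes fitness values, the objective may be non-differentiable or involve long inner-loop unrolls, which is exactly the regime the abstract motivates.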
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Neural & Evolutionary
Progressive Growing of GANs for Improved Quality, Stability, and Variation
Learning both Weights and Connections for Efficient Neural Networks
LSTM: A Search Space Odyssey
A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks