$ε$-shotgun: $ε$-greedy Batch Bayesian Optimisation
February 05, 2020 · Entered Twilight · Annual Conference on Genetic and Evolutionary Computation
"Last commit was 5.0 years ago (โฅ5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitignore, LICENSE, README.md, batch_simulation_script.py, eshotgun, eshotgun_results_plots.ipynb, push8_best_solutions.npz, requirements.txt, results, training_data
Authors
George De Ath, Richard M. Everson, Jonathan E. Fieldsend, Alma A. M. Rahat
arXiv ID
2002.01873
Category
cs.LG: Machine Learning
Cross-listed
cs.NE,
stat.ML
Citations
16
Venue
Annual Conference on Genetic and Evolutionary Computation
Repository
https://github.com/georgedeath/eshotgun
⭐ 6
Last Checked
1 month ago
Abstract
Bayesian optimisation is a popular, surrogate model-based approach for optimising expensive black-box functions. Given a surrogate model, the next location to expensively evaluate is chosen via maximisation of a cheap-to-query acquisition function. We present an $ε$-greedy procedure for Bayesian optimisation in batch settings in which the black-box function can be evaluated multiple times in parallel. Our $ε$-shotgun algorithm leverages the model's prediction, uncertainty, and the approximated rate of change of the landscape to determine the spread of batch solutions to be distributed around a putative location. The initial target location is selected either in an exploitative fashion on the mean prediction, or -- with probability $ε$ -- from elsewhere in the design space. This results in locations that are more densely sampled in regions where the function is changing rapidly and in locations predicted to be good (i.e., close to predicted optima), with more scattered samples in regions where the function is flatter and/or of poorer quality. We empirically evaluate the $ε$-shotgun methods on a range of synthetic functions and two real-world problems, finding that they perform at least as well as state-of-the-art batch methods and in many cases exceed their performance.
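The $ε$-greedy selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a finite candidate set and uses a fixed Gaussian scatter as a stand-in for the paper's spread, which is actually derived from the surrogate's uncertainty and an estimated rate of change of the landscape. The function names and the `spread` parameter are hypothetical.

```python
import random

def choose_target(candidates, predicted_mean, epsilon=0.1, rng=random):
    """epsilon-greedy target selection: with probability epsilon pick a
    random candidate (explore); otherwise take the candidate with the
    lowest predicted mean (exploit, assuming minimisation)."""
    if rng.random() < epsilon:
        return rng.choice(candidates)
    return min(candidates, key=predicted_mean)

def build_batch(target, batch_size, spread, rng=random):
    """Place the target first, then scatter the remaining batch members
    around it with Gaussian perturbations of width `spread` (a stand-in
    for the paper's uncertainty/rate-of-change-based spread)."""
    batch = [target]
    for _ in range(batch_size - 1):
        batch.append(tuple(x + rng.gauss(0.0, spread) for x in target))
    return batch
```

With `epsilon=0` this reduces to pure exploitation of the surrogate mean; increasing `epsilon` trades more batches toward exploration, which is the single tunable knob the method exposes.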
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Machine Learning
XGBoost: A Scalable Tree Boosting System · R.I.P. 👻 Ghosted
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift · R.I.P. 👻 Ghosted
Semi-Supervised Classification with Graph Convolutional Networks · R.I.P. 👻 Ghosted
Proximal Policy Optimization Algorithms · R.I.P. 👻 Ghosted