R.I.P.
👻
Ghosted
Pretrained Optimization Model for Zero-Shot Black Box Optimization
May 06, 2024 · Entered Twilight · Neural Information Processing Systems
Repo contents: BBOB_pkg, GLHF_pkg, __pycache__, bbobOffsets_dim100.pkl, bbobOffsets_dim30.pkl, cecoffsets.pkl, ckpt, demo.sh, imgs, imports.py, main.py, readme.md, utils.py
Authors
Xiaobin Li, Kai Wu, Yujian Betterest Li, Xiaoyu Zhang, Handing Wang, Jing Liu
arXiv ID
2405.03728
Category
cs.NE: Neural & Evolutionary
Cross-listed
cs.AI
Citations
12
Venue
Neural Information Processing Systems
Repository
https://github.com/ninja-wm/POM/
โญ 4
Last Checked
1 month ago
Abstract
Zero-shot optimization means optimizing a target task that was not seen during training, with the goal of finding the optimal solution with no or only minimal adjustments to the optimizer. This capability is crucial for reliable and robust performance across applications. Current optimizers often struggle in the zero-shot setting and require intricate hyperparameter tuning to adapt to new tasks. To address this, we propose a Pretrained Optimization Model (POM) that leverages knowledge gained from optimizing diverse tasks, offering efficient solutions to zero-shot optimization through direct application or fine-tuning with few-shot samples. Evaluation on the BBOB benchmark and two robot control tasks demonstrates that POM outperforms state-of-the-art black-box optimization methods, especially on high-dimensional tasks. Fine-tuning POM with a small number of samples and a small budget yields significant performance improvements. Moreover, POM generalizes robustly across diverse task distributions, dimensions, population sizes, and optimization horizons. For the code, see https://github.com/ninja-wm/POM/.
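To make the black-box setting concrete: the optimizer may only query the objective through function evaluations, with no gradients or task-specific tuning. The sketch below illustrates that interface with a simple (1+1) evolution strategy as a stand-in solver on a sphere objective; it is not POM itself (the paper's pretrained model is not reproduced here), just the query-only protocol any such optimizer must respect.

```python
import random

def sphere(x):
    # A "black-box" objective: the optimizer sees only evaluations,
    # never the formula or its gradient.
    return sum(v * v for v in x)

def one_plus_one_es(f, dim, iters=200, sigma=0.3, seed=0):
    # Minimal (1+1) evolution strategy, used here only as a stand-in
    # black-box optimizer (NOT the paper's POM).
    rng = random.Random(seed)
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]  # random start
    fx = f(x)
    for _ in range(iters):
        # Propose a Gaussian perturbation of the current point.
        y = [v + rng.gauss(0.0, sigma) for v in x]
        fy = f(y)
        if fy < fx:  # greedy acceptance: keep the better point
            x, fx = y, fy
    return x, fx

if __name__ == "__main__":
    best_x, best_f = one_plus_one_es(sphere, dim=5, iters=300, seed=1)
    print(best_f)
```

A zero-shot optimizer like POM is evaluated in exactly this position: handed an unseen `f` and judged on solution quality within a fixed evaluation budget.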
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Neural & Evolutionary
Progressive Growing of GANs for Improved Quality, Stability, and Variation
Learning both Weights and Connections for Efficient Neural Networks
LSTM: A Search Space Odyssey
A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks