A Scalable and Cloud-Native Hyperparameter Tuning System

June 03, 2020 · Declared Dead · 🏛 arXiv.org

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Johnu George, Ce Gao, Richard Liu, Hou Gang Liu, Yuan Tang, Ramdoot Pydipaty, Amit Kumar Saha
arXiv ID: 2006.02085
Category: cs.DC (Distributed Computing)
Cross-listed: cs.LG
Citations: 13
Venue: arXiv.org
Repository: https://github.com/kubeflow/katib
Last Checked: 1 month ago
Abstract
In this paper, we introduce Katib: a scalable, cloud-native, and production-ready hyperparameter tuning system that is agnostic of the underlying machine learning framework. Though multiple hyperparameter tuning systems are available, this is the first one that caters to the needs of both users and administrators of the system. We present the motivation and design of the system and contrast it with existing hyperparameter tuning systems, especially in terms of multi-tenancy, scalability, fault-tolerance, and extensibility. It can be deployed on local machines, hosted as a service in on-premise data centers, or run in private/public clouds. We demonstrate the advantages of our system using experimental results as well as real-world production use cases. Katib has active contributors from multiple companies and is open-sourced at https://github.com/kubeflow/katib under the Apache 2.0 license.
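For readers unfamiliar with Katib's interface, the sketch below shows what the declarative tuning setup described in the abstract looks like: a Python dict that serializes to Katib's v1beta1 Experiment custom resource. Field names follow the public CRD schema; the metric name, parameter range, and trial counts are illustrative assumptions, not values from the paper.

# Minimal sketch of a Katib Experiment, expressed as the Python dict that
# serializes to the kubeflow.org/v1beta1 Experiment custom resource.
# The objective metric, trial counts, and parameter range below are
# illustrative assumptions, not values taken from the paper.
experiment = {
    "apiVersion": "kubeflow.org/v1beta1",
    "kind": "Experiment",
    "metadata": {"name": "random-search-demo", "namespace": "kubeflow"},
    "spec": {
        # Metric the trials report and the direction to optimize it in.
        "objective": {"type": "maximize", "objectiveMetricName": "accuracy"},
        # Which suggestion service proposes new hyperparameter sets.
        "algorithm": {"algorithmName": "random"},
        "maxTrialCount": 12,      # stop after 12 trials in total
        "parallelTrialCount": 3,  # run up to 3 trials concurrently
        # Search space: one continuous hyperparameter, the learning rate.
        "parameters": [
            {
                "name": "lr",
                "parameterType": "double",
                "feasibleSpace": {"min": "0.01", "max": "0.1"},
            }
        ],
        # A real Experiment also needs spec["trialTemplate"], which points at
        # the Kubernetes workload (Job, TFJob, PyTorchJob, ...) that runs one
        # training trial with the suggested parameter values injected.
    },
}

Applied to a cluster (for example via kubectl apply or the Kubernetes Python client), Katib's controller watches the resource, launches trials, and records the reported metrics; because each trial is just another Kubernetes workload, the system stays agnostic to the training framework.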
Community shame: Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Distributed Computing

Died the same way: 💀 404 Not Found