DLOPT: Deep Learning Optimization Library
July 10, 2018 · Entered Twilight · arXiv.org
"Last commit was 5.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitignore, CONTRIBUTING.md, LICENSE, README.md, data, dlopt, docs, etc, examples, publications, requirements.txt, setup.py
Authors
Andrés Camero, Jamal Toutouh, Enrique Alba
arXiv ID
1807.03523
Category
cs.LG: Machine Learning
Cross-listed
cs.NE, stat.ML
Citations
9
Venue
arXiv.org
Repository
https://github.com/acamero/dlopt
⭐ 12
Last Checked
1 month ago
Abstract
Deep learning hyper-parameter optimization is a difficult task. Finding an appropriate network configuration is key to success; however, most of the time this work is done only roughly. In this work we introduce a novel library to tackle this problem, the Deep Learning Optimization Library (DLOPT). We briefly describe its architecture and present a set of usage examples. It is an open-source project developed under the GNU GPL v3 license and is freely available at https://github.com/acamero/dlopt
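The problem the abstract describes, searching for an appropriate network configuration, can be sketched as a plain random search over a hypothetical hyper-parameter space. This is not DLOPT's API; the search space and the `validation_loss` stand-in (which would be a real training-and-validation run in practice) are illustrative assumptions only:

```python
import random

def validation_loss(config):
    # Stand-in objective so the sketch is runnable. In a real search this
    # would train a network with `config` and return its validation loss.
    # Here we pretend the ideal configuration is 2 layers of 64 units.
    return (config["layers"] - 2) ** 2 + (config["units"] - 64) ** 2 / 64.0

def random_search(n_trials, seed=0):
    """Sample configurations uniformly at random and keep the best one."""
    rng = random.Random(seed)
    best_config, best_loss = None, float("inf")
    for _ in range(n_trials):
        # Hypothetical search space: depth, width, and dropout rate.
        config = {
            "layers": rng.randint(1, 5),
            "units": rng.choice([16, 32, 64, 128]),
            "dropout": rng.uniform(0.0, 0.5),
        }
        loss = validation_loss(config)
        if loss < best_loss:
            best_config, best_loss = config, loss
    return best_config, best_loss

best, loss = random_search(100)
```

Random search is only a baseline; libraries in this space typically replace the sampling loop with a smarter strategy (e.g. evolutionary search) while keeping the same evaluate-and-compare structure.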
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt — Machine Learning
XGBoost: A Scalable Tree Boosting System · R.I.P. 👻 Ghosted
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift · R.I.P. 👻 Ghosted
Semi-Supervised Classification with Graph Convolutional Networks · R.I.P. 👻 Ghosted
Proximal Policy Optimization Algorithms · R.I.P. 👻 Ghosted