DLOPT: Deep Learning Optimization Library

July 10, 2018 · Entered Twilight · arXiv.org

🌅 TWILIGHT: Old Age
Predates the code-sharing era – a pioneer of its time

"Last commit was 5.0 years ago (≥5-year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, CONTRIBUTING.md, LICENSE, README.md, data, dlopt, docs, etc, examples, publications, requirements.txt, setup.py

Authors: Andrés Camero, Jamal Toutouh, Enrique Alba
arXiv ID: 1807.03523
Category: cs.LG (Machine Learning)
Cross-listed: cs.NE, stat.ML
Citations: 9
Venue: arXiv.org
Repository: https://github.com/acamero/dlopt ⭐ 12
Last Checked: 1 month ago
Abstract
Deep learning hyper-parameter optimization is a tough task. Finding an appropriate network configuration is key to success; however, this labor is most often done roughly. In this work we introduce a novel library to tackle this problem, the Deep Learning Optimization Library: DLOPT. We briefly describe its architecture and present a set of use examples. This is an open-source project developed under the GNU GPL v3 license, freely available at https://github.com/acamero/dlopt
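To make the problem concrete, here is a minimal random-search sketch of the kind of hyper-parameter exploration a library like DLOPT automates. This is an illustration only, not DLOPT's actual API: the search space, the `toy_validation_loss` stand-in, and all function names are hypothetical, and a real run would train a network at each trial instead of scoring a toy formula.

```python
import random

# Hypothetical search space over network configurations (NOT DLOPT's API).
SEARCH_SPACE = {
    "hidden_units": [16, 32, 64, 128],
    "num_layers": [1, 2, 3],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def sample_config(rng):
    """Draw one configuration uniformly at random from the search space."""
    return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

def toy_validation_loss(config):
    """Stand-in for training a network and measuring validation loss.

    A real optimizer would build and train the model described by
    `config`; here we just score distance from an arbitrary 'good' point.
    """
    return (abs(config["hidden_units"] - 64) / 64
            + abs(config["num_layers"] - 2)
            + abs(config["learning_rate"] - 1e-3) * 100)

def random_search(n_trials=50, seed=0):
    """Sample n_trials configurations and keep the lowest-loss one."""
    rng = random.Random(seed)
    best_config, best_loss = None, float("inf")
    for _ in range(n_trials):
        config = sample_config(rng)
        loss = toy_validation_loss(config)
        if loss < best_loss:
            best_config, best_loss = config, loss
    return best_config, best_loss

if __name__ == "__main__":
    config, loss = random_search()
    print("best config:", config, "loss:", loss)
```

Random search is only the simplest strategy; the paper's library targets this same configuration-selection problem with a more structured approach, for which the repository's `examples` directory is the authoritative reference.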

📜 Similar Papers

In the same crypt – Machine Learning