Exploring loss function topology with cyclical learning rates

February 14, 2017 · Entered Twilight · 🏛 arXiv.org

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 8.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: Instructions for adding CLR to Caffe.pdf, LICENSE, LRrange-solver.prototxt, README.md, architectures, clrsolver.prototxt, interpolation, solver.prototxt, train.sh

Authors: Leslie N. Smith, Nicholay Topin
arXiv ID: 1702.04283
Category: cs.LG (Machine Learning); cross-listed: cs.NE
Citations: 27
Venue: arXiv.org
Repository: https://github.com/lnsmith54/exploring-loss (⭐ 22)
Last checked: 2 months ago
Abstract
We present observations and discussion of previously unreported phenomena discovered while training residual networks. The goal of this work is to better understand the nature of neural networks through the examination of these new empirical results. These behaviors were identified through the application of Cyclical Learning Rates (CLR) and linear network interpolation. Among these behaviors are counterintuitive increases and decreases in training loss and instances of rapid training. For example, we demonstrate how CLR can produce greater testing accuracy than traditional training despite using large learning rates. Files to replicate these results are available at https://github.com/lnsmith54/exploring-loss
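For context on the two diagnostic tools the abstract names, here is a minimal sketch of the triangular cyclical learning rate policy from Smith's earlier CLR work, which this paper applies; the function name and parameters are illustrative, not taken from the repository's Caffe code:

```python
import numpy as np

def triangular_clr(iteration, stepsize, base_lr, max_lr):
    # Triangular CLR policy: the learning rate ramps linearly from
    # base_lr up to max_lr over `stepsize` iterations, then back down,
    # repeating with a period of 2 * stepsize.
    cycle = np.floor(1 + iteration / (2 * stepsize))
    x = np.abs(iteration / stepsize - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)

# Example: a schedule oscillating between 0.001 and 0.1 every 4000 iterations.
schedule = [triangular_clr(i, stepsize=2000, base_lr=0.001, max_lr=0.1)
            for i in range(10000)]
```

Linear network interpolation, the other technique mentioned, traces the loss along the straight line between the flattened weight vectors of two trained networks; `loss_fn` below is a hypothetical callback mapping a parameter vector to a loss, not an API from the repository:

```python
def interpolation_curve(theta_a, theta_b, loss_fn, num_points=25):
    # Sample the loss at evenly spaced points on the segment
    # theta(alpha) = (1 - alpha) * theta_a + alpha * theta_b.
    alphas = np.linspace(0.0, 1.0, num_points)
    losses = [loss_fn((1 - a) * theta_a + a * theta_b) for a in alphas]
    return alphas, losses
```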