Exploring loss function topology with cyclical learning rates
February 14, 2017 · Entered Twilight · arXiv.org
"Last commit was 8.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: Instructions for adding CLR to Caffe.pdf, LICENSE, LRrange-solver.prototxt, README.md, architectures, clrsolver.prototxt, interpolation, solver.prototxt, train.sh
Authors: Leslie N. Smith, Nicholay Topin
arXiv ID: 1702.04283
Category: cs.LG (Machine Learning)
Cross-listed: cs.NE
Citations: 27
Venue: arXiv.org
Repository: https://github.com/lnsmith54/exploring-loss (⭐ 22)
Last Checked: 2 months ago
Abstract
We present observations and discussion of previously unreported phenomena discovered while training residual networks. The goal of this work is to better understand the nature of neural networks through the examination of these new empirical results. These behaviors were identified through the application of Cyclical Learning Rates (CLR) and linear network interpolation. Among these behaviors are counterintuitive increases and decreases in training loss and instances of rapid training. For example, we demonstrate how CLR can produce greater testing accuracy than traditional training despite using large learning rates. Files to replicate these results are available at https://github.com/lnsmith54/exploring-loss
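The two tools named in the abstract are the triangular cyclical learning rate schedule (introduced in Smith's earlier CLR paper) and linear network interpolation between two sets of trained weights. Below is a minimal Python sketch of both, assuming a scalar iteration counter and flat parameter vectors; the defaults base_lr=0.001, max_lr=0.006, and step_size=2000 are placeholder values, not the settings from the repo's Caffe solver files.

```python
import math

def triangular_clr(iteration, base_lr=0.001, max_lr=0.006, step_size=2000):
    """Triangular cyclical learning rate policy.

    The learning rate ramps linearly from base_lr up to max_lr over
    step_size iterations, then back down over the next step_size
    iterations, and the cycle repeats.
    """
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)


def interpolate_params(theta_a, theta_b, alpha):
    """Linear network interpolation.

    Returns parameters at theta(alpha) = (1 - alpha) * theta_a + alpha * theta_b,
    where theta_a and theta_b are flat parameter vectors (e.g. NumPy arrays)
    taken from two snapshots of a network; evaluating the loss over a sweep
    of alpha values traces the loss along the line between the two solutions.
    """
    return (1.0 - alpha) * theta_a + alpha * theta_b
```

Sweeping alpha from 0 to 1 (and slightly beyond) and plotting the training loss at each interpolated point is the probe the paper uses to visualize the loss surface between solutions found with and without CLR.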
Similar Papers
In the same crypt: Machine Learning
XGBoost: A Scalable Tree Boosting System · R.I.P. 👻 Ghosted
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift · R.I.P. 👻 Ghosted
Semi-Supervised Classification with Graph Convolutional Networks · R.I.P. 👻 Ghosted
Proximal Policy Optimization Algorithms · R.I.P. 👻 Ghosted