Stochastic Conjugate Gradient Algorithm with Variance Reduction

October 27, 2017 · Entered Twilight · 🏛 IEEE Transactions on Neural Networks and Learning Systems

🌅 TWILIGHT: Old Age
Predates the code-sharing era: a pioneer of its time

"Last commit was 8.0 years ago (≥5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .idea, CMakeLists.txt, README.md, build, cgvr_exp.py, liblinear, main, main.cpp, model, output, run_utils.py, run_utils.pyc, split_dataset.py, svm, svm_exp.py, svm_results-lindual, svm_results-test, test, test.cpp, test_fig.py

Authors: Xiao-Bo Jin, Xu-Yao Zhang, Kaizhu Huang, Guang-Gang Geng
arXiv ID: 1710.09979
Category: cs.LG (Machine Learning)
Cross-listed: cs.CV, stat.ML
Citations: 62
Venue: IEEE Transactions on Neural Networks and Learning Systems
Repository: https://github.com/xbjin/cgvr ⭐ 10
Last Checked: 1 month ago
Abstract
Conjugate gradient (CG) methods are an important class of methods for solving linear equations and nonlinear optimization problems. In this paper, we propose a new stochastic CG algorithm with variance reduction and prove its linear convergence with the Fletcher-Reeves method for strongly convex and smooth functions. We experimentally demonstrate that the CG with variance reduction algorithm converges faster than its counterparts on four learning models, which may be convex, nonconvex, or nonsmooth. In addition, its area-under-the-curve (AUC) performance on six large-scale data sets is comparable to that of the LIBLINEAR solver for the L2-regularized L2-loss, but with a significant improvement in computational efficiency.
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt: Machine Learning