High-Dimensional Distributed Sparse Classification with Scalable Communication-Efficient Global Updates

July 08, 2024 · Declared Dead · Knowledge Discovery and Data Mining

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Fred Lu, Ryan R. Curtin, Edward Raff, Francis Ferraro, James Holt
arXiv ID: 2407.06346
Category: cs.LG (Machine Learning)
Cross-listed: cs.DC, stat.ML
Citations: 3
Venue: Knowledge Discovery and Data Mining
Repository: https://github.com/FutureComputing4AI/ProxCSL
Last checked: 1 month ago
Abstract
As the size of datasets used in statistical learning continues to grow, distributed training of models has attracted increasing attention. These methods partition the data and exploit parallelism to reduce memory and runtime, but suffer increasingly from communication costs as the data size or the number of iterations grows. Recent work on linear models has shown that a surrogate likelihood can be optimized locally to iteratively improve on an initial solution in a communication-efficient manner. However, existing versions of these methods experience multiple shortcomings as the data size becomes massive, including diverging updates and inefficient handling of sparsity. In this work we develop solutions to these problems which enable us to learn a communication-efficient distributed logistic regression model even beyond millions of features. In our experiments we demonstrate a large improvement in accuracy over distributed algorithms, with only a few distributed update steps needed and similar or faster runtimes. Our code is available at https://github.com/FutureComputing4AI/ProxCSL.
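To make the idea concrete, below is a minimal single-process sketch of one surrogate-likelihood update round of the kind the abstract describes, in the style of communication-efficient surrogate likelihood (CSL) methods for L1-penalized logistic regression. The data partitioning, the function names (`logistic_grad`, `csl_update`), and the plain proximal-gradient local solver are illustrative assumptions for this sketch, not the authors' actual ProxCSL implementation.

```python
import numpy as np

def logistic_grad(w, X, y):
    """Gradient of the mean logistic loss on one partition (labels y in {-1, +1})."""
    z = y * (X @ w)
    s = -y / (1.0 + np.exp(z))  # per-sample derivative of log(1 + exp(-z))
    return X.T @ s / X.shape[0]

def csl_update(w0, parts, lam, lr=0.1, iters=500):
    """One communication-efficient round: each worker sends a single gradient
    vector, then the first partition solves a shifted L1 problem locally."""
    X1, y1 = parts[0]
    # Communication step: one gradient message per worker, averaged globally.
    global_grad = np.mean([logistic_grad(w0, X, y) for X, y in parts], axis=0)
    shift = logistic_grad(w0, X1, y1) - global_grad
    # Local step: proximal gradient descent on the surrogate
    #   min_w  L_1(w) - <shift, w> + lam * ||w||_1,
    # whose smooth gradient matches the global loss gradient at w0.
    w = w0.copy()
    for _ in range(iters):
        g = logistic_grad(w, X1, y1) - shift
        w = np.sign(w - lr * g) * np.maximum(np.abs(w - lr * g) - lr * lam, 0.0)
    return w

if __name__ == "__main__":
    # Tiny synthetic demo: four partitions, 50 features.
    rng = np.random.default_rng(0)
    parts = [(rng.standard_normal((200, 50)), rng.choice([-1.0, 1.0], 200))
             for _ in range(4)]
    w = csl_update(np.zeros(50), parts, lam=0.01)
    print("nonzeros:", np.count_nonzero(w))
```

The communication cost per round is one gradient vector from each worker; everything else in `csl_update` runs locally on a single partition, which is what makes this family of methods communication-efficient.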
Community shame: Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Machine Learning

Died the same way – 💀 404 Not Found