Using ATLAS@Home to exploit extra CPU from busy grid sites
November 29, 2018 · Declared Dead · Computing and Software for Big Science
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Wenjing Wu, David Cameron, Qing Di
arXiv ID
1811.12578
Category
physics.comp-ph
Cross-listed
cs.DC,
hep-ex
Citations
3
Venue
Computing and Software for Big Science
Last Checked
1 month ago
Abstract
Grid computing typically provides most of the data processing resources for large High Energy Physics experiments. However, typical grid sites are not fully utilized by regular workloads. In order to increase the CPU utilization of these grid sites, the ATLAS@Home volunteer computing framework can be used as a backfilling mechanism. Results show that an extra 15% to 42% of CPU cycles can be exploited by backfilling grid sites running regular workloads, while the overall CPU utilization remains over 90%. Backfilling has no impact on the failure rate of the grid jobs, and its impact on the CPU efficiency of grid jobs varies from 1% to 11% depending on the configuration of the site. In addition, the throughput of backfill jobs, in terms of CPU time per simulated event, is the same as for resources dedicated to ATLAS@Home. This approach is sufficiently generic that it can easily be extended to other clusters.
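As a rough illustration of the utilization figures quoted above, the arithmetic can be sketched as follows. The function and the example numbers are hypothetical; only the 15%–42% backfill range and the >90% overall utilization target come from the abstract:

```python
# Hypothetical sketch of the backfilling arithmetic described in the abstract.
# grid_share: fraction of CPU cycles consumed by regular grid workloads.
# backfill_share: extra fraction recovered by ATLAS@Home backfill jobs
#                 (the paper reports a 15% to 42% range).

def overall_utilization(grid_share: float, backfill_share: float) -> float:
    """Total CPU utilization when backfill jobs soak up otherwise idle cycles."""
    return grid_share + backfill_share

# Example (invented numbers): grid jobs use 60% of cycles, backfill adds 35%:
print(f"{overall_utilization(0.60, 0.35):.0%}")  # prints "95%"
```

The point of the sketch is simply that backfill cycles add on top of the regular grid load, which is how a site running well below capacity can be pushed above 90% overall utilization.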
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – physics.comp-ph
Deep Potential Molecular Dynamics: a scalable model with the accuracy of quantum mechanics – 👻 Ghosted
Heterogeneous Parallelization and Acceleration of Molecular Dynamics Simulations in GROMACS – 👻 Ghosted
By-passing the Kohn-Sham equations with machine learning – 👻 Ghosted
Machine Learning of coarse-grained Molecular Dynamics Force Fields – 👻 Ghosted
Towards Physics-informed Deep Learning for Turbulent Flow Prediction – 👻 Ghosted
Died the same way – 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System