Repository for Reusing Artifacts of Artificial Neural Networks
March 30, 2020 · Entered Twilight · arXiv.org
"Last commit was 6.0 years ago (≥5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .README.md.html, .gitignore, .mvn, FFMPEG, LICENSE, README.md, assets, bin, build.cmd, buildAndRun.cmd, buildWar.bat, buildWarWithoutTests.bat, clearDatabase.sh, conf, mvnw, mvnw.cmd, pom.xml, run.bat, src, test.cmd
Authors
Javad Ghofrani, Ehsan Kozegar, Mohammad Divband Soorati, Arezoo Bozorgmehr, Hongfei Chen, Maximilian Naake
arXiv ID
2003.13619
Category
cs.LG: Machine Learning
Cross-listed
cs.SE
Citations
0
Venue
arXiv.org
Repository
https://github.com/ghofrani85/RAN2
⭐ 1
Last Checked
2 months ago
Abstract
Artificial Neural Networks (ANNs) have replaced conventional software systems in various domains such as machine translation, natural language processing, and image processing. So why do we need a repository for artificial neural networks? These systems are developed with labeled data, and there are strong dependencies between the data used for training and testing a network. Further challenges are data quality and reusability. Here we try to apply concepts from classic software engineering that are not limited to the model, whereas data and code have mostly been neglected in other projects. The first question that comes to mind might be: why not use GitHub, a well-known and widely used tool for reuse? The reason is that GitHub, although very good in its class, was not developed for machine learning applications and focuses more on software reuse. In addition, GitHub does not allow code to be executed directly on the platform, which would be very convenient for collaborative work on one project.
Similar Papers
In the same crypt → Machine Learning
XGBoost: A Scalable Tree Boosting System
👻 Ghosted
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
👻 Ghosted
Semi-Supervised Classification with Graph Convolutional Networks
👻 Ghosted
Proximal Policy Optimization Algorithms
👻 Ghosted