Sparsely constrained neural networks for model discovery of PDEs

November 09, 2020 · Declared Dead · 🏛 AAAI Spring Symposium: MLPS

💀 CAUSE OF DEATH: 404 Not Found
Code link is broken/dead
Authors: Gert-Jan Both, Gijs Vermarien, Remy Kusters
arXiv ID: 2011.04336
Category: cs.LG (Machine Learning)
Cross-listed: physics.comp-ph
Citations: 6
Venue: AAAI Spring Symposium: MLPS
Repository: https://github.com/PhIMaL/DeePyMoD
Last Checked: 1 month ago
Abstract
Sparse regression on a library of candidate features has developed as the prime method to discover the partial differential equation underlying a spatio-temporal data-set. These features consist of higher order derivatives, limiting model discovery to densely sampled data-sets with low noise. Neural network-based approaches circumvent this limit by constructing a surrogate model of the data, but have to date ignored advances in sparse regression algorithms. In this paper we present a modular framework that dynamically determines the sparsity pattern of a deep-learning based surrogate using any sparse regression technique. Using our new approach, we introduce a new constraint on the neural network and show how a different network architecture and sparsity estimator improve model discovery accuracy and convergence on several benchmark examples. Our framework is available at https://github.com/PhIMaL/DeePyMoD
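To make the pipeline in the abstract concrete, below is a minimal sketch of the library-plus-sparse-regression step on synthetic gridded data. It substitutes finite differences and scikit-learn's Lasso for the paper's neural-network surrogate and pluggable sparsity estimators, so it illustrates the general technique only; the variable names (`u`, `theta`, `xi`) and the Gaussian-pulse data are hypothetical placeholders, not DeePyMoD's API.

```python
# Illustrative sketch of sparse PDE discovery: regress the time derivative
# u_t onto a library of candidate spatial terms and keep only the sparse
# surviving coefficients. Finite differences stand in for the paper's
# automatic-differentiation surrogate; all names here are hypothetical.
import numpy as np
from sklearn.linear_model import Lasso

# Synthetic space-time grid and a stand-in for measured data u(x, t).
x = np.linspace(-8.0, 8.0, 256)
t = np.linspace(0.0, 10.0, 128)
dx, dt = x[1] - x[0], t[1] - t[0]
X, T = np.meshgrid(x, t, indexing="ij")
u = np.exp(-((X - 0.5 * T) ** 2))

# Target (time derivative) and library columns (spatial derivatives).
u_t = np.gradient(u, dt, axis=1)
u_x = np.gradient(u, dx, axis=0)
u_xx = np.gradient(u_x, dx, axis=0)

# Candidate library Theta = [1, u, u_x, u_xx, u*u_x], flattened over the grid.
names = ["1", "u", "u_x", "u_xx", "u*u_x"]
theta = np.stack([np.ones_like(u), u, u_x, u_xx, u * u_x], axis=-1)
theta = theta.reshape(-1, len(names))
target = u_t.reshape(-1)

# Sparse regression: most coefficients should vanish, leaving the PDE terms.
xi = Lasso(alpha=1e-3, fit_intercept=False).fit(theta, target).coef_
for name, coef in zip(names, xi):
    print(f"{name}: {coef:+.4f}")
```

In the paper's framework the derivatives come from differentiating the trained surrogate network rather than from finite differences, and the Lasso slot is modular: any sparsity estimator can supply the mask that constrains the network during training.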
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt · Machine Learning

Died the same way · 💀 404 Not Found