Abstract Neural Networks

September 11, 2020 · Entered Twilight · 🏛 Sensors Applications Symposium

🌅 TWILIGHT: Old Age
Predates the code-sharing era — a pioneer of its time

"Last commit was 5.0 years ago (โ‰ฅ5 year threshold)"

Evidence collected by the PWNC Scanner

Repo contents: .gitignore, BUILD, LICENSE, README.md, WORKSPACE, abstract.py, interval_domain.py, requirements.txt, test_abstract_intervals.py

Authors: Matthew Sotoudeh, Aditya V. Thakur
arXiv ID: 2009.05660
Category: cs.LG (Machine Learning)
Cross-listed: cs.PL, stat.ML
Citations: 20
Venue: Sensors Applications Symposium
Repository: https://github.com/95616ARG/abstract_neural_networks ⭐ 6
Last Checked: 1 month ago
Abstract
Deep Neural Networks (DNNs) are rapidly being applied to safety-critical domains such as drone and airplane control, motivating techniques for verifying the safety of their behavior. Unfortunately, DNN verification is NP-hard, with current algorithms slowing exponentially with the number of nodes in the DNN. This paper introduces the notion of Abstract Neural Networks (ANNs), which can be used to soundly overapproximate DNNs while using fewer nodes. An ANN is like a DNN except weight matrices are replaced by values in a given abstract domain. We present a framework parameterized by the abstract domain and activation functions used in the DNN that can be used to construct a corresponding ANN. We present necessary and sufficient conditions on the DNN activation functions for the constructed ANN to soundly over-approximate the given DNN. Prior work on DNN abstraction was restricted to the interval domain and ReLU activation function. Our framework can be instantiated with other abstract domains such as octagons and polyhedra, as well as other activation functions such as Leaky ReLU, Sigmoid, and Hyperbolic Tangent.
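To make the idea concrete, here is a minimal sketch (not the paper's implementation; all names are illustrative) of the interval-domain instantiation the abstract mentions: weight matrices become interval matrices [W_lo, W_hi], and an interval forward pass through a monotone activation such as ReLU soundly over-approximates every concrete DNN whose weights lie inside those intervals.

```python
# Hypothetical sketch of interval-weight over-approximation of one
# ReLU layer; this is NOT the authors' abstract.py, just the idea.
import numpy as np

def interval_matvec(W_lo, W_hi, x_lo, x_hi):
    """Sound bounds on W @ x for any W in [W_lo, W_hi], x in [x_lo, x_hi]."""
    # Each elementwise product w*x attains its extremes at endpoint
    # combinations; take min/max over all four, then sum along inputs.
    prods = np.stack([W_lo * x_lo, W_lo * x_hi, W_hi * x_lo, W_hi * x_hi])
    return prods.min(axis=0).sum(axis=1), prods.max(axis=0).sum(axis=1)

def interval_relu_layer(W_lo, W_hi, x_lo, x_hi):
    """ReLU is monotone, so applying it to both bounds preserves soundness."""
    y_lo, y_hi = interval_matvec(W_lo, W_hi, x_lo, x_hi)
    return np.maximum(y_lo, 0.0), np.maximum(y_hi, 0.0)

# Demo: any concrete weights/input inside the intervals are enclosed.
W_lo = np.array([[1.0, -2.0], [0.0, 0.5]])
W_hi = W_lo + 0.5
x_lo, x_hi = np.array([-1.0, 0.0]), np.array([1.0, 2.0])
lo, hi = interval_relu_layer(W_lo, W_hi, x_lo, x_hi)

W = (W_lo + W_hi) / 2          # one concrete network in the family
x = (x_lo + x_hi) / 2          # one concrete input in the box
y = np.maximum(W @ x, 0.0)     # its exact layer output
print(np.all(lo <= y) and np.all(y <= hi))  # True: output is enclosed
```

The monotonicity used in the last step is exactly the kind of condition the paper analyzes: for non-monotone activations, or richer domains like octagons and polyhedra, applying the activation to the bounds is no longer trivially sound, which is why the framework states necessary and sufficient conditions on the activation function.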
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Machine Learning