The Ethereal
DeepAbstract: Neural Network Abstraction for Accelerating Verification
June 24, 2020 · The Ethereal · Automated Technology for Verification and Analysis
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Pranav Ashok, Vahid Hashemi, Jan Křetínský, Stefanie Mohr
arXiv ID
2006.13735
Category
cs.LO: Logic in CS
Cross-listed
cs.AI,
cs.LG
Citations
51
Venue
Automated Technology for Verification and Analysis
Last Checked
1 month ago
Abstract
While abstraction is a classic tool of verification to scale it up, it is not used very often for verifying neural networks. However, it can help with the still open task of scaling existing algorithms to state-of-the-art network architectures. We introduce an abstraction framework applicable to fully-connected feed-forward neural networks based on clustering of neurons that behave similarly on some inputs. For the particular case of ReLU, we additionally provide error bounds incurred by the abstraction. We show how the abstraction reduces the size of the network, while preserving its accuracy, and how verification results on the abstract network can be transferred back to the original network.
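The core idea in the abstract — clustering neurons that produce similar activations on sample inputs and merging each cluster into a single abstract neuron — can be illustrated with a small sketch. This is not the paper's exact algorithm, only a toy numpy rendition under assumed choices (k-means over activation vectors; merged neurons keep the mean incoming weights and the summed outgoing weights):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fully-connected ReLU network: 8 inputs -> 16 hidden -> 4 outputs.
W1, b1 = rng.normal(size=(16, 8)), rng.normal(size=16)
W2, b2 = rng.normal(size=(4, 16)), rng.normal(size=4)

X = rng.normal(size=(100, 8))                  # sample inputs
H = np.maximum(W1 @ X.T + b1[:, None], 0.0)    # hidden activations, one row per neuron

def kmeans(points, k, iters=50):
    """Minimal k-means; each point is one neuron's activation vector."""
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

k = 6
labels = kmeans(H, k)

# Merge each cluster into one abstract neuron: average the incoming weights
# and bias, and sum the outgoing weights so downstream sums are preserved
# approximately when clustered neurons behave alike.
W1_abs = np.stack([W1[labels == j].mean(axis=0) for j in range(k)])
b1_abs = np.array([b1[labels == j].mean() for j in range(k)])
W2_abs = np.stack([W2[:, labels == j].sum(axis=1) for j in range(k)]).T

H_abs = np.maximum(W1_abs @ X.T + b1_abs[:, None], 0.0)
out_orig = W2 @ H + b2[:, None]        # original network output on X
out_abs = W2_abs @ H_abs + b2[:, None] # abstract network output on X
```

The abstract network has 6 hidden neurons instead of 16; for ReLU layers the paper additionally derives bounds on the output error such a merge can introduce, which this sketch does not compute.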
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Logic in CS
Safe Reinforcement Learning via Shielding
Formal Verification of Piece-Wise Linear Feed-Forward Neural Networks
Heterogeneous substitution systems revisited
Omega-Regular Objectives in Model-Free Reinforcement Learning