Abstract
Neural networks are a powerful class of non-linear functions. However, their black-box nature makes it difficult to explain their behaviour and certify their safety. Abstraction techniques address this challenge by transforming the neural network into a simpler, over-approximated function. Unfortunately, existing abstraction techniques are slack, which limits their applicability to small local regions of the input domain. In this paper, we propose Global Interval Neural Network Abstractions with Center-Exact Reconstruction (GINNACER). Our novel abstraction technique produces sound over-approximation bounds over the whole input domain while guaranteeing exact reconstructions for any given local input. Our experiments show that GINNACER is several orders of magnitude tighter than state-of-the-art global abstraction techniques, while being competitive with local ones.
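For readers unfamiliar with interval abstractions, the sketch below illustrates the background concept of a sound over-approximation: standard interval bound propagation (IBP) through an affine-plus-ReLU network, which returns a box guaranteed to contain every output of the network on a given input box. This is a minimal illustration of the general idea only, not GINNACER's construction; the function names and the toy two-layer network are assumptions made for the example.

```python
# Illustrative sketch (not GINNACER): sound interval over-approximation of a
# ReLU network via standard interval bound propagation. All names and the toy
# network below are assumptions for illustration.
import numpy as np

def affine_bounds(W, b, lo, hi):
    """Sound output interval of x -> W @ x + b for all x in [lo, hi]."""
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def relu_bounds(lo, hi):
    """ReLU is monotone, so the interval maps through exactly."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def ibp(layers, lo, hi):
    """Propagate an input box through (W, b) layers with ReLU in between."""
    for i, (W, b) in enumerate(layers):
        lo, hi = affine_bounds(W, b, lo, hi)
        if i < len(layers) - 1:  # no activation after the output layer
            lo, hi = relu_bounds(lo, hi)
    return lo, hi

# Toy 2-layer network: every concrete output is guaranteed to lie in [lo, hi].
rng = np.random.default_rng(0)
layers = [(rng.normal(size=(4, 3)), rng.normal(size=4)),
          (rng.normal(size=(2, 4)), rng.normal(size=2))]
lo, hi = ibp(layers, lo=-np.ones(3), hi=np.ones(3))
print(lo, hi)
```

Per the abstract, GINNACER's contribution is that its bounds remain sound over the whole input domain while additionally coinciding with the exact network output at a chosen local input; plain IBP as sketched above offers no such center-exactness guarantee.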
| Original language | English |
|---|---|
| Pages (from-to) | 344-357 |
| Number of pages | 14 |
| Journal | Neural Networks |
| Volume | 165 |
| Early online date | 7 Jun 2023 |
| DOIs | |
| Publication status | Published - 1 Aug 2023 |
Keywords
- Abstract Interpretation
- Global abstraction
- Neural networks
Projects
- EnnCore: End-to-End Conceptual Guarding of Neural Architectures
Cordeiro, L. (PI), Brown, G. (CoI), Freitas, A. (CoI), Luján, M. (CoI) & Mustafa, M. (CoI)
1/02/21 → 31/12/25
Project: Research