Selecting Stable Safe Configurations for Systems Modelled by Neural Networks with ReLU Activation

Franz Brauße, Zurab Khasidashvili, Konstantin Korovin

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review


Abstract

Combining machine learning with constraint solving and formal methods is an interesting new direction of research with a wide range of safety-critical applications. Our focus in this work is on analyzing Neural Networks with Rectified Linear Activation Function (NN-ReLU). Very recent research in this direction describes multiple approaches to satisfiability checking for constraints on NN-ReLU outputs. Here we extend this line of work in two orthogonal directions: we propose an algorithm for finding configurations of an NN-ReLU that are (1) safe and (2) stable. We assume that the inputs of the NN-ReLU are divided into existentially and universally quantified variables, where the former represent the parameters for configuring the NN-ReLU and the latter represent (possibly constrained) free inputs. We look for (1) values of the configuration parameters for which the NN-ReLU output satisfies a given constraint for all legal values of the input variables (the safety requirement), and (2) such that the entire family of configurations whose configuration variable values are close to a safe configuration is also safe (the stability requirement). To our knowledge, this is the first work to propose SMT-based algorithms for finding safe and stable configuration parameters for systems modelled by neural networks. We experimentally evaluate our algorithm on NN-ReLUs trained on a set of real-life datasets originating from an industrial CAD application at Intel.
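
The safety requirement described in the abstract is an exists-forall query: find values of the configuration parameters such that, for all legal values of the free inputs, the NN-ReLU output satisfies the given constraint; the stability requirement additionally asks that every configuration close to the chosen one is also safe. The following is a minimal sketch of such a query for a toy one-neuron NN-ReLU, written against the Z3 SMT solver's Python API. It is not the paper's algorithm, and the weights, bounds, and output threshold are invented purely for illustration.

from z3 import Real, If, And, Implies, ForAll, Solver, sat

def relu(t):
    # ReLU encoded as an if-then-else term
    return If(t > 0, t, 0)

p = Real('p')  # configuration parameter (existentially quantified)
x = Real('x')  # free input (universally quantified)

# Toy one-neuron network with invented weights: y = relu(2*p + 3*x - 1)
y = relu(2 * p + 3 * x - 1)

legal_input = And(x >= 0, x <= 1)   # assumed constraint on the free input
safe_output = y <= 5                # assumed safety constraint on the output

s = Solver()
s.add(p >= -1, p <= 1)  # assumed search box for the configuration parameter
# Safety: for every legal value of x, the output constraint must hold.
s.add(ForAll([x], Implies(legal_input, safe_output)))

if s.check() == sat:
    print('safe configuration: p =', s.model()[p])
else:
    print('no safe configuration in the search box')

Stability could be expressed in the same style by universally quantifying over a perturbed copy of p restricted to a small ball around the chosen value; the paper's SMT-based algorithm addresses this search, and its details differ from this naive direct encoding.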
Original language: English
Title of host publication: Formal Methods in Computer-Aided Design (FMCAD'20)
DOIs
Publication status: E-pub ahead of print - 22 Dec 2020

