Isolation-based Debugging for Neural Networks

Jialuo Chen, Jingyi Wang, Youcheng Sun, Peng Cheng, Jiming Chen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Neural networks (NNs) are known to have diverse defects such as adversarial examples, backdoors, and discrimination, raising great concerns about their reliability. While NN testing can effectively expose these problems, their root causes within the network require further examination. In this work, inspired by failure isolation in traditional software debugging, we propose a novel neuron-isolation-based framework for debugging neural networks, IDNN for short. Given a buggy NN that exhibits certain undesired behaviors (e.g., a backdoor), the goal of IDNN is to identify the minimal set of neurons most responsible for the behavior. Notably, such isolation is conducted with the objective that, by simply 'freezing' these neurons, the model's desired requirement can be satisfied, resulting in much more efficient model repair than the computationally expensive retraining or weight optimization used by existing methods. We conduct extensive experiments to evaluate IDNN on a diverse set of NN models across five datasets, solving three debugging tasks: backdoor, unfairness, and weak class. As a lightweight framework, IDNN outperforms state-of-the-art baselines by successfully identifying and isolating a very small set of responsible neurons, demonstrating superior generalization performance across all tasks. IDNN is released at [1].
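To make the 'freezing' idea concrete, the following is a minimal sketch (not the authors' released implementation; the toy network, the neuron indices, and all names here are hypothetical): given a small set of hidden neurons isolated as responsible for an undesired behavior, their activations are clamped to zero at inference time, repairing the model without retraining or weight optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer MLP; random weights stand in for a trained (buggy) model.
W1 = rng.normal(size=(4, 8))   # input dim 4 -> hidden dim 8
W2 = rng.normal(size=(8, 3))   # hidden dim 8 -> output dim 3

def forward(x, frozen=()):
    """Forward pass; hidden neurons listed in `frozen` are clamped to zero."""
    h = np.maximum(x @ W1, 0.0)      # ReLU hidden layer
    mask = np.ones(h.shape[-1])
    mask[list(frozen)] = 0.0         # 'freeze' the isolated neurons
    return (h * mask) @ W2

x = rng.normal(size=(1, 4))
frozen = {2, 5}                      # hypothetical isolated neuron ids
y_original = forward(x)              # behavior of the buggy model
y_repaired = forward(x, frozen=frozen)  # behavior after freezing
```

The repair cost here is a single masked forward pass per input, which is what makes isolation-based repair lightweight compared to retraining.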
Original language: English
Title of host publication: The ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA)
Publication status: Accepted/In press - 1 Mar 2024
