Coverage Guided Testing for Recurrent Neural Networks

Wei Huang, Youcheng Sun, Xingyu Zhao, James Sharp, Wenjie Ruan, Jie Meng, Xiaowei Huang

Research output: Contribution to journal › Article › peer-review


Abstract

Recurrent neural networks (RNNs) have been applied to a broad range of applications, including natural language processing, drug discovery, and video recognition. However, they are also known to be vulnerable to input perturbations. Taking a view from software defect detection, this paper develops a coverage-guided testing approach that systematically exercises the internal behaviour of RNNs, with a high likelihood of detecting defects. Technically, the long short-term memory network (LSTM), a major class of RNN, is studied in depth. A family of three test metrics is designed to quantify not only the values but also the temporal relations (both step-wise and bounded-length) exhibited when an LSTM processes inputs. A genetic algorithm is applied to efficiently generate test cases. Based on these, we develop a tool, testRNN, and extensively evaluate it on a set of LSTM benchmarks. Experiments confirm that testRNN has several advantages over the state-of-the-art tool DeepStellar and over attack-based defect detection methods, owing to its finer temporal semantics and its consideration of the naturalness of input perturbations. Furthermore, testRNN collects and presents meaningful information that helps users understand the testing results, which is an important step towards interpretable neural network testing.
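To make the approach concrete, the sketch below illustrates the general idea of coverage-guided testing for an LSTM. It is an assumption-laden demonstration, not the authors' testRNN implementation: it uses a hand-rolled numpy LSTM cell, a simplistic step-wise coverage target (cells whose state changes by more than a fixed, arbitrary threshold between consecutive time steps), and a random-mutation loop standing in for the paper's genetic algorithm. The actual metrics, thresholds, and search strategy of testRNN are defined in the paper.

```python
# Illustrative sketch only: all names, thresholds, and the mutation scheme
# here are assumptions for demonstration, not testRNN's actual design.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTM:
    """A single-layer LSTM cell with randomly initialised weights."""
    def __init__(self, n_in, n_hidden):
        self.n_hidden = n_hidden
        # One weight matrix per gate: input (i), forget (f), output (o), candidate (g).
        self.W = {g: rng.normal(0, 0.5, (n_hidden, n_in + n_hidden)) for g in "ifog"}
        self.b = {g: np.zeros(n_hidden) for g in "ifog"}

    def run(self, xs):
        """Process a sequence; return the cell-state trajectory, shape (T, n_hidden)."""
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        trajectory = []
        for x in xs:
            z = np.concatenate([x, h])
            i = sigmoid(self.W["i"] @ z + self.b["i"])
            f = sigmoid(self.W["f"] @ z + self.b["f"])
            o = sigmoid(self.W["o"] @ z + self.b["o"])
            g = np.tanh(self.W["g"] @ z + self.b["g"])
            c = f * c + i * g
            h = o * np.tanh(c)
            trajectory.append(c.copy())
        return np.array(trajectory)

def stepwise_targets(trajectory, threshold=0.3):
    """Step-wise coverage targets: (step, cell) pairs with a large state change.

    The 0.3 threshold is arbitrary; the paper parameterises its metrics.
    """
    deltas = np.abs(np.diff(trajectory, axis=0))      # shape (T-1, n_hidden)
    steps, cells = np.nonzero(deltas > threshold)
    return {(int(t), int(n)) for t, n in zip(steps, cells)}

def coverage_guided_search(model, seed, n_iters=200, sigma=0.05):
    """Mutate the seed input and keep mutants that cover new targets."""
    covered = stepwise_targets(model.run(seed))
    corpus = [seed]
    for _ in range(n_iters):
        parent = corpus[rng.integers(len(corpus))]
        mutant = parent + rng.normal(0, sigma, parent.shape)  # small perturbation
        new = stepwise_targets(model.run(mutant)) - covered
        if new:                     # mutant exercises unseen internal behaviour
            covered |= new
            corpus.append(mutant)
    return covered, corpus

model = TinyLSTM(n_in=4, n_hidden=8)
seed = rng.normal(0, 1, (10, 4))    # one input sequence of 10 time steps
covered, corpus = coverage_guided_search(model, seed)
print(f"covered {len(covered)} step-wise targets with {len(corpus)} test cases")
```

The design point this sketch shares with the paper is the feedback loop: a test case is kept only if it drives the LSTM's internal states into behaviour not yet observed, so the corpus grows towards inputs that exercise new temporal semantics rather than merely new outputs.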
Original language: English
Journal: IEEE Transactions on Reliability
Early online date: 10 Jun 2021
DOIs
Publication status: Published - 10 Jun 2021

Keywords

  • Coverage-guided testing
  • Logic gates
  • Measurement
  • Recurrent neural networks
  • Semantics
  • Software
  • Testing
  • Tools
  • coverage metrics
  • recurrent neural networks (RNNs)
  • test case generation

