Robustness to Noisy Synaptic Weights in Spiking Neural Networks

Chen Li, Runze Chen, Christoforos Moutafis, Steve Furber

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review



Spiking neural networks (SNNs) are promising neural network models for achieving power-efficient, event-based computing on neuromorphic hardware. SNNs inherently contain noise and are robust both to noisy inputs and to the noise introduced by the discrete 1-bit spike. In this paper, we find that SNNs are more robust to Gaussian noise in synaptic weights than artificial neural networks (ANNs) under some conditions. This finding deepens our understanding of the neural dynamics in SNNs and of the advantages of SNNs over ANNs. Our results imply the possibility of using high-performance cutting-edge materials with intrinsic noise as an information storage medium in SNNs.
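The noise model studied here can be illustrated with a short sketch. The snippet below is not the authors' code; it simply shows the common setup of perturbing a weight matrix with additive Gaussian noise and measuring how much a layer's output changes. The layer sizes, noise level `sigma`, and the use of a plain dense layer are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): add Gaussian
# noise to a weight matrix and measure the resulting perturbation of
# a dense layer's output. Sizes and sigma are arbitrary assumptions.
rng = np.random.default_rng(0)

weights = rng.normal(0.0, 1.0, size=(64, 32))  # hypothetical layer weights
x = rng.normal(0.0, 1.0, size=64)              # hypothetical input vector

sigma = 0.1  # standard deviation of the weight noise
noisy_weights = weights + rng.normal(0.0, sigma, size=weights.shape)

clean_out = x @ weights
noisy_out = x @ noisy_weights

# Relative output perturbation caused by the noisy weights
rel_error = np.linalg.norm(noisy_out - clean_out) / np.linalg.norm(clean_out)
print(f"relative output error: {rel_error:.4f}")
```

For an SNN, the corresponding experiment would inject such noise into the synaptic weights of spiking neurons and compare task accuracy against an ANN with the same noise level, which is the comparison the abstract describes.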
Original language: English
Title of host publication: 2020 International Joint Conference on Neural Networks (IJCNN)
Place of publication: Glasgow, United Kingdom
Number of pages: 8
ISBN (Electronic): 978-1-7281-6926-2
ISBN (Print): 978-1-7281-6927-9
Publication status: Published - 28 Sept 2020


Keywords

  • Spiking neural networks
  • Artificial neural networks
  • Noisy weights
  • Gaussian noise


