Biologically inspired neural computation

  • Adam Perrett

Student thesis: PhD

Abstract

Models of intelligence can take many forms, from concept-driven approaches such as formal mathematical reasoning to data-driven approaches such as machine learning. Current state-of-the-art approaches fall into the second category, requiring vast amounts of data to form statistical representations within the architecture of neural networks. This is in stark contrast to biological brains, whose neural networks can learn from limited examples and training time. Biology originally inspired the neural network, but there is still much more to be learned from nature about how to construct and train neural architectures. By exploring techniques employed by biology, it may be possible to overcome the challenges of modern machine learning algorithms. Biological brains can be trained on tasks sequentially without forgetting previously gathered information, and data integration is performed online and in real time, except for processing done during sleep. The brain also consumes only 12 W of energy, a far cry from the energy budget of CPU and GPU implementations of neural networks. The research described in this thesis first investigates the use of biologically inspired models of visual attention on the SpiNNaker neuromorphic hardware, creating an event-driven, low-latency model of visual saliency. Following this, biologically plausible training algorithms are examined, with the e-prop learning algorithm being instantiated on SpiNNaker to explore the challenges faced when learning using only locally available information. Finally, abstractions of dendritic nonlinearities are co-opted for use in tandem with neurogenesis to create a learning architecture that does not rely on gradient descent whilst retaining previously learned information. It is shown to reach levels of performance similar to a network trained using Adam optimisation, with fewer presentations of data samples, on a number of benchmark tasks.
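The e-prop algorithm mentioned above replaces backpropagation through time with weight updates built from locally maintained eligibility traces combined with a broadcast learning signal. The sketch below illustrates that general idea for a single layer of leaky integrate-and-fire neurons; it is a minimal illustration only, and all names and constants (alpha, v_th, eta, the random feedback matrix B) are assumptions for exposition, not a description of the thesis's SpiNNaker implementation.

```python
import numpy as np

# Illustrative e-prop-style update: every quantity needed for the weight change
# is available locally at the synapse, plus a per-neuron broadcast learning signal.
# Constants and shapes are assumptions for illustration only.

rng = np.random.default_rng(0)
n_in, n_rec, n_out, T = 20, 10, 2, 100
alpha, v_th, eta = 0.9, 1.0, 1e-2

W_in  = rng.normal(0, 0.1, (n_rec, n_in))    # plastic input weights
W_out = rng.normal(0, 0.1, (n_out, n_rec))   # readout weights (kept fixed here)
B     = rng.normal(0, 0.1, (n_rec, n_out))   # fixed random feedback (broadcast) weights

x = (rng.random((T, n_in)) < 0.1).astype(float)                  # input spike trains
y_star = np.sin(np.linspace(0, np.pi, T))[:, None] * np.ones((1, n_out))  # dummy target

v = np.zeros(n_rec)           # membrane potentials
x_bar = np.zeros(n_in)        # low-pass filtered presynaptic spikes
z_bar = np.zeros(n_rec)       # filtered postsynaptic spikes for the readout
dW = np.zeros_like(W_in)      # accumulated weight changes

for t in range(T):
    x_bar = alpha * x_bar + x[t]
    v = alpha * v + W_in @ x[t]
    z = (v >= v_th).astype(float)                # spikes
    v -= z * v_th                                # soft reset after a spike
    psi = 0.3 * np.maximum(0.0, 1.0 - np.abs((v - v_th) / v_th))  # surrogate derivative
    e = psi[:, None] * x_bar[None, :]            # eligibility traces (purely local)

    z_bar = alpha * z_bar + z
    y = W_out @ z_bar                            # readout prediction
    L = B @ (y - y_star[t])                      # broadcast learning signal per neuron
    dW += L[:, None] * e                         # accumulate local updates

W_in -= eta * dW                                 # apply update at the end of the trial
```

The key property being illustrated is locality: each synapse only needs its own eligibility trace and a scalar learning signal delivered to its postsynaptic neuron, which is what makes this family of rules attractive for distributed neuromorphic hardware such as SpiNNaker.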
Date of Award: 31 Dec 2022
Original language: English
Awarding Institution
  • The University of Manchester
Supervisors: Oliver Rhodes (Supervisor) & Steve Furber (Supervisor)

Keywords

  • iCub
  • SpiNNaker
  • neuromorphic
  • visual attention
  • regression
  • e-prop
  • neurogenesis
  • dendrites
  • classification
  • reinforcement learning
  • machine learning
  • spiking neural networks
  • biologically inspired
  • gradient descent
