Visual attention and object naming in humanoid robots using a bio-inspired spiking neural network

Daniel Hernández García, Samantha Adams, Alex Rast, Thomas Wennekers, Steve Furber, Angelo Cangelosi

Research output: Contribution to journal › Article › peer-review


Recent advances in behavioural and computational neuroscience, cognitive robotics, and in the hardware implementation of large-scale neural networks, provide the opportunity for an accelerated understanding of brain functions and for the design of interactive robotic systems based on brain-inspired control systems. This is especially the case in the domain of action and language learning, given the significant scientific and technological developments in this field. In this work we describe how a neuroanatomically grounded spiking neural network for visual attention has been extended with a word learning capability and integrated with the iCub humanoid robot to demonstrate attention-led object naming. Experiments were carried out with both a simulated and a real iCub robot platform with successful results. The iCub robot is capable of associating a label to an object with a ‘preferred’ orientation when visual and word stimuli are presented concurrently in the scene, as well as attending to said object, thus naming it. After learning is complete, the name of the object can be recalled successfully when only the visual input is present, even when the object has been moved from its original position or when other objects are present as distractors.
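The label-learning behaviour described above — binding a word to a visual stimulus when the two are presented concurrently, then recalling the word from the visual input alone — can be illustrated with a minimal, non-spiking Hebbian sketch. This is a conceptual toy only, not the paper's spiking network; the object names, population sizes, and learning rate below are all hypothetical.

```python
import numpy as np

N_VIS = 8    # hypothetical visual feature units (e.g. orientation channels)
N_WORD = 3   # hypothetical word units, one per label

# Three toy objects, each with a distinct 'preferred' orientation channel.
objects = {name: np.zeros(N_VIS) for name in ("ball", "cup", "box")}
for i, name in enumerate(objects):
    objects[name][i] = 1.0

labels = {"ball": 0, "cup": 1, "box": 2}

W = np.zeros((N_WORD, N_VIS))   # word <- visual association weights
eta = 0.5                       # arbitrary Hebbian learning rate

# Training: visual and word stimuli are presented concurrently; the Hebbian
# update strengthens weights between co-active units (w += eta * post * pre).
for name, vis in objects.items():
    word = np.zeros(N_WORD)
    word[labels[name]] = 1.0
    W += eta * np.outer(word, vis)

def recall(vis):
    """Return the index of the word unit most driven by the visual input."""
    return int(np.argmax(W @ vis))

# Recall: the visual input alone now retrieves the learned label, even when
# a distractor pattern is mixed into the scene.
noisy_cup = objects["cup"] + 0.3 * objects["box"]
assert recall(objects["ball"]) == labels["ball"]
assert recall(noisy_cup) == labels["cup"]
```

In the paper the association is learned in a neuroanatomically grounded spiking network on the iCub; the sketch only captures the high-level idea that concurrent presentation creates the binding and that recall survives distractors.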
Original language: English
Pages (from-to): 56-71
Journal: Robotics and Autonomous Systems
Early online date: 5 Mar 2018
Publication status: Published - 1 Jun 2018


  • Spiking neural networks


