Human Language as a Tool for Conceptual Development in Cognitive Robotics

  • Ioanna Giorgi

Student thesis: PhD


We are nearing a future in which robots will be an everyday constituent of our society. Humans will be symbiotically empowered by AI (Artificial Intelligence), provided that these smart artefacts are devised to mimic human mental and bodily attributes. Human-robot synergy can be nurtured by a solid foundation of mutual understanding and seamless communication. However, current efforts to devise apt cognitive robotic models focus on low-order cognitive phenomena alone (perception, manipulation, motor coordination, navigation). Drawing on a hypothesised developmental paradigm of human cognitive functions, these efforts have captured prominent aspects of embodied and situated cognition, which is rooted in motor behaviour and the environment. Yet they offer no clear account of how their blueprints can explain, or scale up to, high-level cognitive competence. They often overlook a major component that sits at the core of our (human) social interaction: language. Language may be the natural interface between humans and robots, not only as the intuitive way we communicate our thoughts to others but also for its overarching benefit of supporting cognition and intelligence.

This dissertation takes direct inspiration from theoretical psychology and revolutionary perspectives in cognitive robotics. It seeks to address the near absence of high-level cognitive modelling in current cognitive robotics, advocating for human language as a versatile cognitive tool that can support and enhance cognition. The thesis describes a series of cohesive research efforts, conducted in incremental stages, aimed at involving language in robot cognition. Stage 1 models language proficiency in a cognitively sound manner that is closer to how humans develop and/or elaborate language.
The significance of this modelling is that it can be used to study other human-like cognitive aspects, grounding them in brain-inspired principles rather than engineered solutions that target high accuracy on ad-hoc tasks. This stage models not one but multiple languages jointly, as multilingual competence is assumed to promote cognitive, neural and social benefits, and to have a far-reaching impact on cognitive control. Yet language cannot be fully understood unless viewed in relation to our perception, actions and interactions with the environment and the organisms in it. Thus, Stage 2 models the symbolic mapping between language, body and environment to study how words get their meaning, i.e., how they manifest in the real world as visual and somatosensory information. In the proposed modelling, the learning of words, actions or both progresses from concrete to abstract concepts, where abstractness is an attribute of language. With increasing abstractness, the number and types of actions constructed in response to language to manifest that abstractness increase along a continuum. The results of this novel learning scheme with an embodied humanoid robot suggested that language has an impact on action learning and adaptive behaviour. These assumptions are carried forward in Stage 3 to study the impact of language on further aspects of cognition, such as the ability to categorise, to abstract and to voluntarily control behaviour. Challenging experiments with a humanoid robot demonstrated that language could influence such phenomena. The robot could grasp concepts expressed in distinct languages (cross-lingual cognitive control), showing that labels become part of conceptual representations and can trigger those representations just as well as perception and action can. Language appeared to boost cognition by promoting higher-level abstract reasoning, which required properties that were inferred rather than directly observed.
Date of Award: 31 Dec 2022
Original language: English
Awarding Institution:
  • The University of Manchester
Supervisor: Angelo Cangelosi


  • high-level cognitive modelling
  • human language
  • developmental robotics
  • brain-inspired architecture
  • working memory
  • large-scale neural network
  • abstract concepts
  • cognitive robotics
  • language-to-action
