Ergonomics for Physical Human-Robot Collaboration

  • Lorenzo Rapetti

Student thesis: PhD

Abstract

Humans have an innate ability to collaborate. This translates into the completion of tasks that would otherwise be impossible, or the capability of performing complex actions efficiently. Robots nowadays are endowed with superior physical strength, although human reasoning and adaptability remain unmatched in many scenarios. The possibility of combining these capabilities has long been considered the pivotal point for an effective application of robots in real-world settings such as the manufacturing and healthcare industries. Recently, attention has turned to humanoid robots and their applications. Anthropomorphism allows robots to move in human-centred workplaces and improves social interaction with humans. Human-robot symbiosis, however, remains mostly unrealized, since robots' ability to collaborate is still limited. For a fruitful human-robot collaboration, it is a prerequisite to study the efficiency of humans during their interaction with the environment and to enhance their safety, which is referred to as ergonomics. This brings us to some fundamental questions in robotics: How can robots be made able to physically collaborate with a degree of intelligence? How can physical human-robot collaboration be made ergonomic? Although effective advancements have been made in this regard by the robotics community, interest in humanoid robotics encourages the search for new paradigms to answer them. The present thesis dives into these aspects and attempts to develop strategies for multi-agent physical collaboration following a holistic approach employing wearable sensing technologies, human modelling, and control theory. The first outcome is a framework for humanoid robots that enables robot-robot and human-robot physical collaboration. The proposed framework is inspired by human motor intelligence and tries to consolidate the existing collaboration schemes into a unified general framework promoting information sharing and partner modelling.
The validation is performed, using the iCub humanoid robot, in simplified scenarios with collaborative payload-lifting tasks. The second outcome is a framework for human perception that allows real-time motion tracking and force estimation through wearable sensors. The perception framework has been validated in different scenarios, including human analysis and motion retargeting, and it has been made compatible with both the Xsens and iFeel wearable sensor systems for motion and force tracking.
Date of Award: 1 Aug 2023
Original language: English
Awarding Institution
  • The University of Manchester
Supervisors: Angelo Cangelosi & Daniele Pucci

Keywords

  • ergonomics
  • human-robot collaboration
  • humanoid robotics
  • biomechanics
