Approaches for Intelligent Robot Grasping and Manipulation via Human Demonstration

  • Ainur Begalinova

Student thesis: PhD


During grasping and manipulation, robots must handle a large number of variables relating to object parameters and adapt to unforeseen external situations, all in unstructured environments. This frequently causes robots to fail to complete the manipulation task. A fundamental problem in robotics is the complexity of operating in high-dimensional, unstructured and dynamic state spaces, which produces uncertainties the robot must cope with. Operating under such complexity requires reliable perception, as well as real-time understanding and monitoring of system performance across a wide range of input signals, so that grip forces can be adapted to changes in contact that could compromise a stable grasp. A common approach to overcoming these limitations is to incorporate complex sensors that can reliably monitor such changes. However, simply adding these sensors to the robot manipulator's control loop introduces an equivalent complexity: dealing with the wide range of input signals they produce. This thesis addresses this problem by exploring a practical approach that uses affordable tactile sensors to detect slippage during manipulation and to estimate object parameters for stable grasping using machine learning techniques. As humans excel at dexterous grasping in everyday life, a natural way to introduce human skill into robotic grasping is 'learning by demonstration', which transfers the skill of dexterity itself rather than a set of grasps learned for an individual object. This approach eliminates the need to track a high-dimensional state space and helps robots learn the essential skills of stable grasping and manipulation. We show how spatio-temporal features of tactile sensing can be used to learn and enhance a robot's ability to estimate properties such as texture, stiffness and weight, and to detect the grasp type from a demonstration.
We achieved accuracies of 88.75% and 88.01% in texture classification using k-NN and LSTM methods respectively, used clustering methods to detect object stiffness groups, estimated weight with an average error of 14.73 g, and classified the grasp taxonomy of a human demonstrator with an accuracy of 96.48%. We also introduce a practical method for real-time slip detection in the precision grasping of unseen objects using low-cost, adaptable sensors, achieving 93.13% accuracy in offline mode, and 70.00% and 88.89% accuracy in detecting partial and full slippage respectively in online robotic grasping.
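The texture-classification result can be illustrated with a minimal sketch of k-NN over spatio-temporal tactile features. Everything below is a hypothetical toy example, not the thesis code: the sensor layout, the two texture classes, the synthetic vibration signal, and the feature choice (per-taxel mean, standard deviation, and mean absolute first difference) are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_sequence(texture, n_steps=50, n_taxels=4):
    """Synthetic tactile reading: assume each texture induces a
    characteristic vibration frequency across the taxels."""
    t = np.linspace(0.0, 1.0, n_steps)
    freq = {"smooth": 2.0, "rough": 12.0}[texture]  # assumed frequencies
    base = np.sin(2 * np.pi * freq * t)
    return base[:, None] + 0.1 * rng.standard_normal((n_steps, n_taxels))

def features(seq):
    """Simple spatio-temporal features: per-taxel mean, std,
    and mean absolute first difference (captures vibration energy)."""
    return np.concatenate([
        seq.mean(axis=0),
        seq.std(axis=0),
        np.abs(np.diff(seq, axis=0)).mean(axis=0),
    ])

def knn_predict(x, X, y, k=3):
    """Majority vote among the k nearest training samples (Euclidean)."""
    dists = np.linalg.norm(X - x, axis=1)
    nearest = y[np.argsort(dists)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

# Build a small labelled training set of tactile sequences.
labels = ["smooth", "rough"] * 20
X = np.array([features(make_sequence(lbl)) for lbl in labels])
y = np.array(labels)

query = features(make_sequence("rough"))
print(knn_predict(query, X, y))  # prints "rough"
```

The temporal-difference feature is what separates the classes here: a rougher surface produces faster fluctuations, so its mean absolute step-to-step change is larger even when the overall signal amplitude is similar. The thesis' LSTM variant would instead consume the raw sequence directly rather than hand-crafted summary features.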
Date of Award: 1 Aug 2020
Original language: English
Awarding Institution:
  • The University of Manchester
Supervisors: Barry Lennox, Riza Theresa Batista-Navarro & Ross King


  • robot grasping and manipulation
  • machine learning
  • learning by demonstration
