TY - GEN
T1 - Development of Human-Robot Communication Technologies for Future Interaction Experiments
AU - Almada Campos, Ana Christina
AU - Adorno, Bruno Vilhena
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/11/9
Y1 - 2020/11/9
N2 - This paper describes our development of human-robot communication technologies, which will be used in future experiments to investigate the effects of different types of communication in human-robot interactions. Using the human kinematic chain and facial points, obtained with methods from the literature, we propose geometric analyses for the recognition and interpretation of human pointing gestures and gaze direction. We developed a virtual agent with voice, facial expressions, and gaze direction to interact with humans. The pointing-gesture system presented a mean success rate of 77.8% (s.d. 16.4%) in experiments, and the implementation of the facial-point detection algorithm resulted in mean errors of 0.52 cm (s.d. 0.29 cm) in the horizontal distance between iris centers and 0.67 cm (s.d. 0.31 cm) in the vertical distance between the eyes and the mouth, both important attributes for estimating human gaze direction. These results are satisfactory for the intended application.
AB - This paper describes our development of human-robot communication technologies, which will be used in future experiments to investigate the effects of different types of communication in human-robot interactions. Using the human kinematic chain and facial points, obtained with methods from the literature, we propose geometric analyses for the recognition and interpretation of human pointing gestures and gaze direction. We developed a virtual agent with voice, facial expressions, and gaze direction to interact with humans. The pointing-gesture system presented a mean success rate of 77.8% (s.d. 16.4%) in experiments, and the implementation of the facial-point detection algorithm resulted in mean errors of 0.52 cm (s.d. 0.29 cm) in the horizontal distance between iris centers and 0.67 cm (s.d. 0.31 cm) in the vertical distance between the eyes and the mouth, both important attributes for estimating human gaze direction. These results are satisfactory for the intended application.
UR - http://www.scopus.com/inward/record.url?scp=85100294194&partnerID=8YFLogxK
U2 - 10.1109/LARS/SBR/WRE51543.2020.9306965
DO - 10.1109/LARS/SBR/WRE51543.2020.9306965
M3 - Conference contribution
AN - SCOPUS:85100294194
T3 - 2020 Latin American Robotics Symposium, 2020 Brazilian Symposium on Robotics and 2020 Workshop on Robotics in Education, LARS-SBR-WRE 2020
BT - 2020 Latin American Robotics Symposium, 2020 Brazilian Symposium on Robotics and 2020 Workshop on Robotics in Education, LARS-SBR-WRE 2020
A2 - Goncalves, Luiz Marcos Garcia
A2 - Drews Junior, Paulo Lilles Jorge
A2 - da Silva, Bruno Marques Ferreira
A2 - Fernandes Curvelo, Carla da Costa
A2 - Fabro, Joao Alberto
A2 - dos Santos, Davi Henrique
A2 - de Melo, Julio Cesar Paulino
PB - IEEE
T2 - 17th Latin American Robotics Symposium, 12th Brazilian Symposium on Robotics and 11th Workshop on Robotics in Education, LARS-SBR-WRE 2020
Y2 - 9 November 2020 through 12 November 2020
ER -