TY - GEN
T1 - Interactive teaching and experience extraction for learning about objects and robot activities
AU - Lim, G.H.
AU - Oliveira, M.
AU - Mokhtari, V.
AU - Hamidreza Kasaei, S.
AU - Chauhan, A.
AU - Seabra Lopes, L.
AU - Tomé, A.M.
PY - 2014
Y1 - 2014
AB - Intelligent service robots should be able to improve their knowledge from accumulated experiences through continuous interaction with the environment, and in particular with humans. A human user may guide the process of experience acquisition, teaching new concepts or correcting insufficient or erroneous concepts through interaction. This paper reports on work towards interactive learning of objects and robot activities in an incremental and open-ended way. In particular, it addresses human-robot interaction and experience gathering. The robot's ontology is extended with concepts for representing human-robot interactions as well as the experiences of the robot. The human-robot interaction ontology includes not only instructor teaching activities but also robot activities, so as to support appropriate feedback from the robot. Two simplified interfaces are implemented for the different types of instructions, including the teach instruction, which triggers the robot to extract experiences. These experiences, in both the robot activity domain and the perceptual domain, are extracted, stored in memory, and used as input for learning methods. The functionalities described above are fully integrated in a robot architecture and are demonstrated on a PR2 robot.
UR - http://www.scopus.com/inward/record.url?eid=2-s2.0-84933051386&partnerID=MN8TOARS
U2 - 10.1109/ROMAN.2014.6926246
DO - 10.1109/ROMAN.2014.6926246
M3 - Conference contribution
BT - IEEE RO-MAN 2014 - 23rd IEEE International Symposium on Robot and Human Interactive Communication: Human-Robot Co-Existence: Adaptive Interfaces and Systems for Daily Life, Therapy, Assistance and Socially Engaging Interactions
ER -