TY - JOUR
T1 - Unsupervised emotional state classification through physiological parameters for social robotics applications
AU - Fiorini, Laura
AU - Mancioppi, Gianmaria
AU - Semeraro, Francesco
AU - Fujita, Hamido
AU - Cavallo, Filippo
N1 - Funding Information:
The authors would like to thank the people involved in the experiment for their precious support. This work was funded by the “SocIal ROBOTics for active and healthy ageing” (SI-ROBOTICS) project, funded by the Italian “Ministero dell’Istruzione, dell’Università e della Ricerca” under the framework “PON - Ricerca e Innovazione 2014–2020”, Grant Agreement ARS01_01120.
Publisher Copyright:
© 2019 Elsevier B.V.
PY - 2020/2/29
Y1 - 2020/2/29
AB - Future social robots should have personalized behaviors based on the user’s emotional state, so as to fit better into ordinary users’ activities and to improve human–robot interaction. Several works in the literature use cameras to detect emotions. However, these approaches may not be effective in everyday life, due to camera obstructions and to different types of stimulation, which can also arise from interaction with other human beings. Therefore, this work investigates the electrocardiogram, the electrodermal activity, and the electrical brain activity as the main informative physiological channels. These signals were acquired through a wireless wearable sensor network. An experimental methodology was proposed to induce three different emotional states by means of social interaction. Two different combinations of sensors were analyzed using three time-window lengths (180 s, 150 s, and 120 s) and classified with three unsupervised machine learning approaches (K-means, K-medoids, and self-organizing maps). Finally, their classification performances were compared to those obtained by commonly used supervised techniques (i.e., Support Vector Machine, Decision Tree, and k-nearest neighbor) to discuss the optimal combination of sensors, time-window length, and unsupervised classifier. Fifteen healthy young participants were recruited for the study, and more than 100 instances were analyzed. The proposed approaches achieve an accuracy of 77% in the best unsupervised case and 85% in the best supervised one.
KW - Cluster analysis
KW - Emotional state detection
KW - Physiological signals
KW - Pleasure
KW - Social robotics
UR - http://dx.doi.org/10.1016/j.knosys.2019.105217
U2 - 10.1016/j.knosys.2019.105217
DO - 10.1016/j.knosys.2019.105217
M3 - Article
SN - 0950-7051
VL - 190
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 105217
ER -