TY - JOUR
T1 - Spatiotemporal Analysis by Deep Learning of Gait Signatures from Floor Sensors
AU - Alharthi, Abdullah
AU - Casson, Alex
AU - Ozanyan, Krikor
N1 - Funding Information:
Manuscript received March 25, 2021; accepted April 30, 2021. Date of publication May 7, 2021; date of current version July 30, 2021. The work of Abdullah S. Alharthi was supported by the Government of Saudi Arabia. The associate editor coordinating the review of this article and approving it for publication was Dr. Ying Zhang. (Corresponding author: Abdullah S. Alharthi.) The authors are with the Department of Electrical and Electronic Engineering, The University of Manchester, Manchester M13 9PL, U.K. (e-mail: [email protected]; alex.casson@manchester.ac.uk; [email protected]). Digital Object Identifier 10.1109/JSEN.2021.3078336
Publisher Copyright:
© 2001-2012 IEEE.
PY - 2021/8/1
Y1 - 2021/8/1
N2 - The recognition of gait pattern variation is of high importance to various industrial and commercial applications, including security, sport, virtual reality, gaming, robotics, medical rehabilitation, mental illness diagnosis, space exploration, and others. The purpose of this paper is to study the nature of gait variability in more detail, by identifying the gait intervals responsible for gait pattern variations within and between individuals under cognitively demanding tasks. This work uses deep learning methods for sensor fusion of 116 plastic optical fiber (POF) distributed sensors for gait recognition. The floor sensor system captures spatiotemporal samples of the varying ground reaction force (GRF) over up to 4 uninterrupted steps on a continuous 2 m × 1 m area. We demonstrate classification of gait signatures, achieving up to 100% F1-score with Convolutional Neural Networks (CNN), in the context of gait recognition of 21 subjects, with imposters and clients. Classifications under cognitive load, induced by 4 different dual tasks, manifested lower F1-scores. Layer-Wise Relevance Propagation (LRP) methods are employed to decompose a trained neural network prediction into relevant standard events in the gait cycle, by generating a “heat map” over the input used for classification. This gives valuable insight into which parts of the gait spatiotemporal signal have the strongest influence on the gait classification and, consequently, which gait events, such as heel strike or toe-off, are most affected by cognitive load.
AB - The recognition of gait pattern variation is of high importance to various industrial and commercial applications, including security, sport, virtual reality, gaming, robotics, medical rehabilitation, mental illness diagnosis, space exploration, and others. The purpose of this paper is to study the nature of gait variability in more detail, by identifying the gait intervals responsible for gait pattern variations within and between individuals under cognitively demanding tasks. This work uses deep learning methods for sensor fusion of 116 plastic optical fiber (POF) distributed sensors for gait recognition. The floor sensor system captures spatiotemporal samples of the varying ground reaction force (GRF) over up to 4 uninterrupted steps on a continuous 2 m × 1 m area. We demonstrate classification of gait signatures, achieving up to 100% F1-score with Convolutional Neural Networks (CNN), in the context of gait recognition of 21 subjects, with imposters and clients. Classifications under cognitive load, induced by 4 different dual tasks, manifested lower F1-scores. Layer-Wise Relevance Propagation (LRP) methods are employed to decompose a trained neural network prediction into relevant standard events in the gait cycle, by generating a “heat map” over the input used for classification. This gives valuable insight into which parts of the gait spatiotemporal signal have the strongest influence on the gait classification and, consequently, which gait events, such as heel strike or toe-off, are most affected by cognitive load.
KW - Deep Convolutional Neural Networks (CNN)
KW - Cognitive Load
KW - Ground Reaction Force (GRF)
KW - Sensor Fusion
KW - Interpretable Neural Networks
U2 - 10.1109/JSEN.2021.3078336
DO - 10.1109/JSEN.2021.3078336
M3 - Article
SN - 1530-437X
VL - 21
SP - 16904
EP - 16914
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
IS - 15
M1 - 9425580
ER -