Deep learning and sensor fusion methods for cognitive load gait difference in males and females

Abdullah S. Alharthi, Krikor B. Ozanyan

Research output: Contribution to conference › Paper › peer-review


Abstract

Human gait is an individual's manner of walking. It is influenced by weight, age, health condition, and interaction with the surrounding environment. In this work, we study gait changes under cognitive load in healthy males and females, using machine learning methods. A deep learning model with multi-processing pipelining and backpropagation is proposed for cognitive load gait analysis. The IMAGiMAT floor system, enabling sensor fusion from plastic optical fiber (POF) elements, is used to record raw gait data as spatiotemporal ground reaction force (GRF). A deep parallel Convolutional Neural Network (CNN) is engineered for POF sensor fusion and GRF gait classification. Layer-Wise Relevance Propagation (LRP) is applied to reveal which gait events are most relevant to the parallel CNN's predictions. The CNN differentiates between males and females with 95% weighted average precision and classifies cognitive load gait with 93% weighted average precision. These findings suggest a new hypothesis, while a larger dataset holds promise for human activity analysis.
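
The paper does not publish its architecture as code; the following is a minimal sketch of what a parallel per-sensor CNN with concatenation-based fusion could look like, assuming each POF element yields a 1D GRF time series. The class name, branch count, channel widths, kernel size, and sequence length are illustrative assumptions, not the authors' design.

```python
import torch
import torch.nn as nn

class ParallelSensorCNN(nn.Module):
    """One small 1D-conv branch per POF sensor; branch features are
    fused by concatenation before a linear classifier (sizes illustrative)."""
    def __init__(self, n_sensors=4, n_classes=2):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(1, 8, kernel_size=5, padding=2),  # temporal filters per sensor
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),                    # pool to one vector per sensor
            )
            for _ in range(n_sensors)
        ])
        self.classifier = nn.Linear(8 * n_sensors, n_classes)

    def forward(self, x):  # x: (batch, n_sensors, seq_len)
        feats = [branch(x[:, i:i + 1, :]).flatten(1)        # (batch, 8) per branch
                 for i, branch in enumerate(self.branches)]
        return self.classifier(torch.cat(feats, dim=1))     # fused class logits

model = ParallelSensorCNN()
logits = model(torch.randn(8, 4, 100))  # dummy batch: 8 walks, 4 sensors, 100 samples
```

Concatenation is only one fusion choice; summing or stacking branch features into a 2D map before further convolutions would be equally plausible readings of "sensor fusion" here.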
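Similarly, a minimal sketch of the LRP epsilon-rule for a single fully connected layer, the step that LRP repeats backwards through the network to assign relevance to input samples (and hence gait events). The function name, shapes, and eps value are assumptions for illustration, not the paper's implementation.

```python
import torch

def lrp_epsilon(a, w, b, relevance, eps=1e-6):
    """Redistribute output relevance to the inputs of one linear layer.
    a: (n_in,) input activations, w: (n_in, n_out) weights,
    b: (n_out,) biases, relevance: (n_out,) relevance at the layer output."""
    z = a.unsqueeze(1) * w            # contributions z_ij = a_i * w_ij
    denom = z.sum(dim=0) + b          # pre-activations z_j
    denom = denom + eps * torch.where(denom >= 0,
                                      torch.ones_like(denom),
                                      -torch.ones_like(denom))  # sign-matched stabilizer
    s = relevance / denom             # per-output scaling factors
    return (z * s).sum(dim=1)         # relevance attributed to each input
```

Applied layer by layer from the classifier back to the GRF input, the resulting per-sample relevance indicates which portions of the gait cycle drove a given prediction.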
Original language: English
Pages: 229-237
Number of pages: 9
Publication status: Published - 11 Nov 2019
Event: Intelligent Data Engineering and Automated Learning - Manchester, United Kingdom
Duration: 14 Nov 2019 - 16 Nov 2019

Conference

Conference: Intelligent Data Engineering and Automated Learning
Abbreviated title: IDEAL 2019
Country/Territory: United Kingdom
City: Manchester
Period: 14/11/19 - 16/11/19

Keywords

  • Convolutional Neural Networks (CNN)
  • Cognitive load gait
  • Ground Reaction Force (GRF)
  • Layer-Wise Relevance Propagation (LRP)
