3D free reaching movement prediction of upper-limb based on deep neural networks

Chao Wang, Manoj Sivan, Tianzhe Bao, Guqiang Li, Shengquan Xie

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

Quantitative assessment of motor disorders is one of the main challenges in the field of stroke rehabilitation. This paper proposes a simplified kinematic model of the human upper limb (UL) using seven main joints on both the dominant and non-dominant sides. With this model, a deep neural network (DNN) is used to predict the 3D free reaching movement of the UL of a healthy participant. The experimental results show that the predicted trajectories achieve high similarity to the trajectories of real movements, indicating the promising accuracy of the DNN in 3D UL movement estimation. With the capability of identifying specific reaching movements in real time, the trajectories predicted by this data-driven model can be used to inform rehabilitation assessment and training in future studies as a personalized therapy approach.
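The abstract does not specify the network architecture or input representation, so the following is a minimal illustrative sketch, assuming a recurrent network that maps a short window of past 3D positions of the seven modelled joints to the next frame's positions. The class name, window length, hidden size, and the LSTM choice are all assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch only: architecture, window length, and hidden size
# are assumptions; the paper does not publish its network details.
import torch
import torch.nn as nn

NUM_JOINTS = 7   # seven main UL joints, per the abstract's kinematic model
DIMS = 3         # x, y, z coordinates per joint
WINDOW = 10      # assumed length of the pose history fed to the network


class ReachPredictor(nn.Module):
    """Maps a window of past 7-joint poses to the next-frame pose."""

    def __init__(self, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(NUM_JOINTS * DIMS, hidden, batch_first=True)
        self.head = nn.Linear(hidden, NUM_JOINTS * DIMS)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, WINDOW, 21) flattened joint coordinates
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # (batch, 21): predicted next pose


# Toy usage with random tensors standing in for motion-capture data.
model = ReachPredictor()
history = torch.randn(4, WINDOW, NUM_JOINTS * DIMS)
next_pose = model(history).view(4, NUM_JOINTS, DIMS)
print(next_pose.shape)  # torch.Size([4, 7, 3])
```

Applied frame by frame, such a predictor could produce a full reaching trajectory by feeding each prediction back into the history window; the similarity to recorded movements could then be scored with a metric such as mean per-joint position error.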

Original language: English
Title of host publication: 2021 10th International IEEE/EMBS Conference on Neural Engineering, NER 2021
Publisher: IEEE Computer Society
Pages: 1005-1009
Number of pages: 5
ISBN (Electronic): 9781728143378
ISBN (Print): 9781728143378
DOIs
Publication status: Published - 4 May 2021
Event: 10th International IEEE/EMBS Conference on Neural Engineering, NER 2021 - Virtual, Online, Italy
Duration: 4 May 2021 – 6 May 2021

Publication series

Name: 2021 10th International IEEE/EMBS Conference on Neural Engineering (NER)
ISSN (Print): 1948-3546

Conference

Conference: 10th International IEEE/EMBS Conference on Neural Engineering, NER 2021
Country/Territory: Italy
City: Virtual, Online
Period: 4/05/21 – 6/05/21

Keywords

  • 3D
  • Data-driven
  • Movement prediction
  • Rehabilitation assessment
  • Upper limb

