Towards Developing a Virtual Guitar Instructor through Biometrics Informed Human-Computer Interaction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


In recent years, wearable sensor technologies have given us access to novel biometrics that connect musical gesture to computing systems. This allows us to study how we perform music and to understand the process at the data level. However, biometric information is complex and cannot be mapped directly to digital systems. In this work, we study how guitar performance techniques can be captured and analysed towards developing an AI that can provide real-time feedback to guitar students.
We do this by performing musical exercises on the guitar while acquiring and processing biometric (plus audiovisual) information during the performance. Our results show notable differences in the biometrics when a guitar scale is played in two different ways (legato and staccato), an outcome that motivates our intention to build an AI guitar tutor.
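The legato/staccato distinction described above could, in principle, show up in simple features of an EMG signal. The following is a minimal sketch of that idea, not the authors' pipeline: the signals are synthetic stand-ins, and the envelope window, sampling rate, and "burstiness" feature are all hypothetical choices for illustration.

```python
import numpy as np

def rms_envelope(emg, fs, win_ms=100):
    """Moving-window RMS envelope of an EMG signal."""
    win = max(1, int(fs * win_ms / 1000))
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(np.square(emg), kernel, mode="same"))

def burstiness(emg, fs):
    """Peak-to-mean envelope ratio (hypothetical feature): staccato
    playing tends to produce brief, high-amplitude muscle bursts,
    giving a higher ratio than sustained legato activation."""
    env = rms_envelope(emg, fs)
    return env.max() / env.mean()

# Synthetic stand-ins: continuous activation for legato,
# short periodic bursts for staccato.
fs = 1000  # Hz (assumed)
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
legato = 0.5 * rng.standard_normal(t.size)
staccato = np.where((t % 0.5) < 0.1, 1.0, 0.05) * rng.standard_normal(t.size)

print(burstiness(staccato, fs) > burstiness(legato, fs))
```

On these synthetic signals the staccato trace yields the higher burstiness, illustrating how even a single scalar feature can separate the two articulations; a real tutor would of course learn such distinctions from multimodal recordings rather than a hand-picked statistic.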

CCS Concepts: • Human-centered computing → Gestural input.

Additional Key Words and Phrases: Deep learning, Biometrics, Musical performance, Guitar, Multimodal data, Game engines, EMG, HCI
Original language: English
Title of host publication: CHI 2022: CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 29 April – 5 May 2022
Publication status: Accepted/In press - 1 Feb 2023


