New Interfaces for Classifying Performance Gestures in Music

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review


Abstract

Interactive machine learning (ML) allows a music performer to digitally represent musical actions (via gestural interfaces) and affect their musical output in real time. Processing musical actions (termed performance gestures) with ML is useful because it can predict and map often-complex biometric data. ML models can therefore be used to create novel interactions with musical systems, game engines and networked analogue devices. Wekinator is free, open-source ML software (built on the Waikato Environment for Knowledge Analysis – WEKA – framework) that has been widely used since 2009 to build supervised predictive models when developing real-time interactive systems. Its popularity stems from its accessible format (a graphical user interface – GUI) and its simplified approach to ML; significantly, it allows models to be trained by demonstration through gestural interfaces. However, Wekinator offers the user several model types with which to build predictive systems. This paper explores which of Wekinator's ML models are the most useful for predicting an output in the context of interactive music composition. We use two performance gestures for piano, with contrasting datasets, to train the available ML models, investigate compositional outcomes and frame the investigation. Our results show that ML model choice matters when mapping performance gestures, because mapping accuracy and behaviour differ markedly across Wekinator's ML models.
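The workflow described in the abstract (streaming gesture features into Wekinator and routing its predictions to a musical system) is typically wired up over OSC. The sketch below is illustrative only and is not taken from the chapter: it assumes Wekinator's default OSC configuration (inputs received at /wek/inputs on port 6448, outputs sent to /wek/outputs on port 12000) and the third-party python-osc package, with random placeholder values standing in for real Myo-style biometric features.

```python
# Illustrative sketch only (not from the paper): stream gesture features to
# Wekinator over OSC and listen for its predicted outputs.
# Assumes Wekinator's default addresses/ports (inputs: /wek/inputs on 6448,
# outputs: /wek/outputs on 12000) and the third-party python-osc package.
import random
import threading
import time

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

# Client that sends input feature vectors to a locally running Wekinator.
client = SimpleUDPClient("127.0.0.1", 6448)


def on_outputs(address, *values):
    # Wekinator's predictions arrive here; in practice they would be routed
    # on to a synthesis engine rather than printed.
    print(address, values)


# Server that receives Wekinator's output messages.
dispatcher = Dispatcher()
dispatcher.map("/wek/outputs", on_outputs)
server = BlockingOSCUDPServer(("127.0.0.1", 12000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Send a short burst of input frames; random values stand in for real
# biometric features (e.g. EMG/IMU readings from a Myo armband).
for _ in range(20):
    features = [random.random() for _ in range(8)]
    client.send_message("/wek/inputs", features)
    time.sleep(0.05)
```

In a real setup, the number of input features would match the inputs configured in the Wekinator project, and the output handler would drive a synthesis environment (e.g. Max/MSP or SuperCollider) in real time.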
Original language: English
Title of host publication: Intelligent Data Engineering and Automated Learning – IDEAL 2019 - 20th International Conference, Proceedings
Editors: Hujun Yin, Richard Allmendinger, David Camacho, Peter Tino, Antonio J. Tallón-Ballesteros, Ronaldo Menezes
Pages: 31-42
Number of pages: 12
DOIs
Publication status: Published - 2019
Event: Intelligent Data Engineering and Automated Learning - Manchester, United Kingdom
Duration: 14 Nov 2019 – 16 Nov 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11872 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Intelligent Data Engineering and Automated Learning
Abbreviated title: IDEAL 2019
Country/Territory: United Kingdom
City: Manchester
Period: 14/11/19 – 16/11/19

Keywords

  • Gestural interfaces
  • HCI
  • Interactive machine learning
  • Interactive music
  • Myo
  • Performance gestures
  • Wekinator
