Abstract
Semiconductor is a piece for virtual reality (VR), game engine (mixed reality, MR), live electronics and performer, split into two movements. The performer wears biometric sensors on their arms in order to interact directly with digital worlds in both VR and MR. These sensors capture data on how the performer's arms move and how the muscles within them behave; this biometric information is processed and made usable through current machine learning (ML) methods, enabling nuanced and sophisticated human-computer interaction (HCI).
The first movement of the piece (namely, Musica Tangibile) is performed exclusively within VR and allows the performer to explore how biometric information, pertaining to muscle amplitude of the arms (i.e., electromyographic data), can be used as a novel layer of information within VR for musical and creative outcomes. The performer does this by exciting a musical ‘atom’, which is an extension of their biometric presence within VR. The second movement (Semiconductor) is performed in front of a screen showing the game-engine elements of the work, which permits the performer to play an abstract digital violin through sophisticated HCI methods. As a result, the physical performer is able to manipulate digital violin pegs and conduct them to develop interesting sound worlds and compositional devices.
| Original language | English |
| --- | --- |
| Place of Publication | Online |
| Publisher | Royal Northern College of Music |
| Edition | Future Music 3 |
| Media of output | Online |
| Publication status | Published - 16 Jun 2021 |
| Event | Future Music 3 - Online, Manchester, United Kingdom. Duration: 16 Jun 2021 → 17 Jun 2021. https://www.rncm.ac.uk/research/research-centres-rncm/prism/prism-events/future-music-3-16-17-june-2021/ |
Keywords
- Interactive Music
- Music Composition
- Machine Learning
- Artificial Intelligence
- ML
- AI
- Biometrics
- EMG
- IMU
- Electromyographic