This research project uses electronic touch-screen instruments in a multi-channel performance space to explore a formal language for describing open-form musical composition. Our system encodes concepts of coordination, opposition, synchronisation, communication, bandwidth and delay, and represents formally specified compositions that can drive musical performances or data sonifications exploring these concepts.
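As an illustration of what a formally specified composition might look like, the sketch below encodes a fragment of an open-form score as a small data structure whose fields correspond to the concepts above. The field names and values (players, coordination, bandwidth_limit, delay_s and so on) are hypothetical and are not drawn from our actual specification language.

```python
# Hypothetical sketch of a formally specified open-form score fragment.
# Field names and values are illustrative assumptions, not the project's
# actual specification language.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ScoreSection:
    name: str
    players: List[str]                  # performers assigned to this section
    coordination: str = "synchronised"  # "synchronised", "independent" or "opposed"
    bandwidth_limit: int = 8            # maximum shared gesture messages per second
    delay_s: float = 0.0                # time shift applied to shared information
    duration_s: float = 30.0            # nominal section length in seconds

@dataclass
class OpenFormScore:
    title: str
    sections: List[ScoreSection] = field(default_factory=list)

score = OpenFormScore(
    title="Study in Opposition",
    sections=[
        ScoreSection("A", ["touch-1", "touch-2"], coordination="synchronised"),
        ScoreSection("B", ["touch-1", "touch-2"], coordination="opposed",
                     bandwidth_limit=2, delay_s=1.5),
    ],
)
```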
Open-form composition allows for a spectrum of engagement from fully specified to almost completely improvised performance. Our work is novel in capturing concepts of adversarial action within a shared musical score mediated by a computer system, and in exploring reduction and time-shifting of shared information to represent bandwidth and delay. Shared open-form compositions with computer displays have been explored previously (e.g., the work of Lindsay Vickery and Decibel more generally), but our work extends this concept by implementing a formal language for generating these scores and software for mediating communication during performance.
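To make the reduction and time-shifting of shared information concrete, the following minimal sketch shows a mediating channel that rate-limits (reduces) and delays performers' shared gesture messages before relaying them. The class, method and parameter names here are illustrative assumptions rather than our implementation.

```python
import time
from collections import deque

class MediatedChannel:
    """Illustrative sketch only: relays shared gesture messages between
    performers, reducing them to a maximum rate (bandwidth) and releasing
    them only after a fixed time shift (delay)."""

    def __init__(self, max_msgs_per_s: float, delay_s: float):
        self.min_interval = 1.0 / max_msgs_per_s  # reduction: rate limit
        self.delay_s = delay_s                    # time shift before release
        self.last_accepted = float("-inf")
        self.buffer = deque()                     # (release_time, message)

    def send(self, message, now=None):
        now = time.monotonic() if now is None else now
        # Reduction: drop messages that arrive faster than the bandwidth allows.
        if now - self.last_accepted < self.min_interval:
            return
        self.last_accepted = now
        self.buffer.append((now + self.delay_s, message))

    def receive(self, now=None):
        now = time.monotonic() if now is None else now
        # Delay: only release messages whose time shift has elapsed.
        released = []
        while self.buffer and self.buffer[0][0] <= now:
            released.append(self.buffer.popleft()[1])
        return released

# Example: at most 2 shared messages per second, delayed by 1.5 seconds.
channel = MediatedChannel(max_msgs_per_s=2, delay_s=1.5)
channel.send({"player": "touch-1", "gesture": "swipe", "x": 0.4}, now=0.0)
channel.send({"player": "touch-1", "gesture": "tap", "x": 0.5}, now=0.1)  # dropped
print(channel.receive(now=1.0))   # [] -- still within the delay window
print(channel.receive(now=2.0))   # the swipe gesture, released after 1.5 s
```

In this reading, lower message rates correspond to narrower bandwidth and longer time shifts to greater latency between performers' shared views of the score.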
We explore the properties of these generated scores through live test performances with human and artificial performers using computer-generated sounds. These performances allow the quality of our system to be measured by the breadth of synchronised and coordinated experiences that can be represented and explored through our compositional language. Our system could also support non-musicians exploring open-form music through game-like performances, learning about collaboration and synchronisation in real time.