SHOALZ is a duo delivering a 30-minute live Telematic Music Performance (TMP) between London (UK) and Perth (Australia). Two musicians, Matt Bray (Australia) and Bernardo Varela (UK), located continents apart, co-create a 'comprovised' Electro Dub piece that also generates an immersive, reactive visual feed viewable through a VR headset or an online stream. The ethos is to demonstrate the capacity to create immersive, paradigm-shifting artworks together despite disparate locations. The performers share performance data over the Internet to co-create a single piece of rhythmically syncopated music, complemented by a detailed visual feed available to remote viewers. The aesthetics are aqua, ether, unity and transcendence.

Telemidi is a targeted approach to MIDI network design with the explicit aim of minimising the obstruction of latency within live TMP events across a Wide Area Network (WAN, i.e. the Internet). Undertaking PhD research at WAAPA (ECU, Perth, Australia), Matt Bray has developed a live performance environment that also harnesses MIDI to generate high-resolution reactive visuals (designed by Bernardo Varela) that feed into VR headsets in real time. This research seeks to identify the processes of cross-cultural cooperation within the emergent TMP paradigm. Critically, sharing time-sensitive musical performance information over the Internet exposes data to latencies that disrupt the millisecond timing of human-to-human musical interplay, so attaining successful TMP environments has proven overwhelmingly elusive. Telemidi networks exchange only MIDI information, benefiting from the protocol's relatively miniature packet sizes and its multifaceted capacity.
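The bandwidth rationale for exchanging MIDI rather than audio can be sketched as follows. This is an illustrative comparison only, not part of the Telemidi implementation; the byte layouts and figures are standard facts of the MIDI protocol and CD-quality PCM audio.

```python
# A MIDI Note On message is just 3 bytes:
# status (0x90 | channel), note number, velocity.
note_on = bytes([0x90, 60, 100])  # channel 1, middle C, velocity 100
assert len(note_on) == 3

# One second of uncompressed 16-bit stereo audio at 44.1 kHz, by contrast:
audio_bytes_per_sec = 44_100 * 2 * 2  # samples/sec * channels * bytes/sample
print(len(note_on), audio_bytes_per_sec)  # 3 vs 176400
```

Even a dense stream of MIDI events is orders of magnitude smaller than the audio it triggers, which is why each end of the network renders audio locally from shared MIDI data.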
Sharing MIDI over the Internet via RTP software introduces latency between events and the resultant audio generation. The latency budget in live music performance sits at around 30 ms, yet WAN ping times between most capital cities exceed this several times over. As a focus of my PhD research, I have employed numerous strategies to minimise and dampen the obstruction of latency. Through numerous iterations of this process with my London collaborator Bernardo Varela, we have also introduced a complex reactive visual environment of over 15,000 particles that move and change in accordance with specific MIDI notes and actions. These visuals can be viewed in sync with the generated audio via online streaming and through a VR headset. Ultimately, Telemidi networking systems deliver immersive, collaborative and engaging live music performance environments regardless of the performers' locations on the planet.
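The latency problem above can be sketched numerically. This is a hypothetical illustration, not a Telemidi tool: the 30 ms budget is the figure cited above, the RTT values are illustrative rather than measured, and approximating one-way latency as half the round-trip time ignores path asymmetry.

```python
PERFORMANCE_BUDGET_MS = 30  # approximate live-performance threshold cited above

def one_way_latency_ms(rtt_ms: float) -> float:
    """Approximate one-way latency as half the round-trip (ping) time."""
    return rtt_ms / 2

def within_budget(rtt_ms: float, budget_ms: float = PERFORMANCE_BUDGET_MS) -> bool:
    """Does this link leave the perceived event timing inside the budget?"""
    return one_way_latency_ms(rtt_ms) <= budget_ms

print(within_budget(18))   # True  -- e.g. a same-city link
print(within_budget(280))  # False -- an intercontinental link breaches the budget
```

An intercontinental link such as London to Perth fails this naive check by multiple factors, which is why Telemidi relies on latency-dampening strategies rather than on raw network speed.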