Music Controller: Yisong Pu
Credit: Boxiong Zhao, Chu Zhang
On July 9, 2024, we had the honor of presenting our work CrossReality in the concert hall of Hanyang University as part of the International Computer Music Conference 2024. For this occasion, we reinterpreted the piece to highlight the dynamic interplay between live body movement and real-time electronic music. In this iteration, we explore the interconnectedness of the human body, artistic artifacts such as music, and the fluid, hybrid context they inhabit, capturing their resilience and inherent unpredictability.
SETUP
I used a cellphone mount to hold one phone as a fixed reference point, while I held a second phone as a variable input. The host computer calculated the distance and height difference between the two devices and generated real-time music from these measurements.
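The mapping from the two phones' positions to musical control values can be sketched as follows. This is a minimal illustration, not the actual performance patch: the position format, the distance-to-amplitude curve, and the height-to-pitch scaling are all assumptions chosen for clarity.

```python
import math

def spatial_relation(fixed_pos, held_pos):
    """Compute the two measurements the piece uses.

    fixed_pos, held_pos: (x, y, z) device positions in meters,
    with y as the vertical axis (a hypothetical convention).
    Returns (distance, height_difference).
    """
    distance = math.dist(fixed_pos, held_pos)
    height_diff = held_pos[1] - fixed_pos[1]
    return distance, height_diff

def to_music_params(distance, height_diff, max_dist=2.0):
    """Map measurements to control values (illustrative mapping only).

    Distance attenuates amplitude linearly up to max_dist;
    height difference shifts pitch, here one octave per meter.
    """
    amplitude = max(0.0, 1.0 - min(distance, max_dist) / max_dist)
    pitch_offset_semitones = height_diff * 12.0
    return amplitude, pitch_offset_semitones
```

In a live setting, values like these would typically stream continuously from the phones' sensors to the host and be smoothed before driving the synthesis engine.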
At the same time, one of my partners adjusted the sound effects in real time, responding to the changing dynamics on stage and further enriching the auditory experience.

— Visual simulation in Unity —