What new possibilities do technological advances offer for our listening experience? The research project MetaPhase focuses on the interaction between human and virtual musicians in a new world that encompasses both real and virtual space. The research has just received an honourable mention from the European Commission as part of the European STARTS 2023 Award, for which MetaPhase was selected from over 1,600 applications.
What is remarkable about this project is that the performers’ movements and gestures are captured with high accuracy by a motion tracking system and embodied on screen in the form of a digital avatar. This allows the audience to be immersed in every detail of the relationship between gesture and sound.
MetaPhase is the artistic result of a partnership between Giusy Caruso and the innovative Italian start-up LWT3. Caruso is a concert pianist and postdoctoral researcher dedicated to the study of human-machine interaction (HMI) for the creation of futuristic multimedia formats and the analysis of musical performances. Chairwoman of the CREATIE research group at the Royal Conservatoire Antwerp and affiliate researcher at IPEM, Ghent University, she is the official music advisor of LWT3 Society Milan. LWT3 focuses on data analysis, visualisation, IoT infrastructure development and human-machine interaction solutions.
Creative potential
This joint research set out to harness the creative potential of data processing, human-machine interaction and biotech applications in an XR (extended reality) performance. The aim was to enhance the expressiveness of the performers and the listening experience of the audience. At its core is a portable, user-friendly prototype system, developed by LWT3, capable of collecting biosignals.
The system maps gestures and provides quantitative data on the displacement, acceleration and speed of the movements. In this way, musicians obtain valuable information about their physical approach to playing an instrument and can better identify where there is room for improvement.
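To give an idea of what such gesture metrics involve, the sketch below derives displacement, speed and acceleration from a single marker trajectory sampled at a fixed rate. It is a minimal example, assuming positions are exported as a time series of 3-D coordinates; the function names, sampling rate and synthetic data are illustrative and are not taken from LWT3's actual software.

```python
import numpy as np

def movement_metrics(positions: np.ndarray, sample_rate_hz: float = 120.0):
    """Derive displacement, speed and acceleration from a marker trajectory.

    positions: array of shape (n_frames, 3) with x, y, z coordinates in metres.
    Returns per-frame displacement (m), speed (m/s) and acceleration (m/s^2).
    """
    dt = 1.0 / sample_rate_hz
    # Frame-to-frame displacement vectors and their magnitudes.
    deltas = np.diff(positions, axis=0)
    displacement = np.linalg.norm(deltas, axis=1)
    # Speed is displacement per unit time; acceleration is its time derivative.
    speed = displacement / dt
    acceleration = np.diff(speed) / dt
    return displacement, speed, acceleration

# Example: a synthetic wrist trajectory sampled at 120 Hz (illustrative data only).
t = np.linspace(0, 1, 120)
wrist = np.stack([0.1 * np.sin(2 * np.pi * t), np.zeros_like(t), 0.05 * t], axis=1)
disp, speed, acc = movement_metrics(wrist)
print(f"peak speed: {speed.max():.3f} m/s, peak |acc|: {np.abs(acc).max():.2f} m/s^2")
```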
Based on the musicians’ movements, the system also creates a virtual agent (usually an avatar) that moves on stage. The musicians are fitted with reflective markers, which infrared cameras track to capture their gestures in real time, and with a biosensor that measures the corresponding muscle effort. In summary, this form of co-creation of live music combines first experiments with OptiTrack motion tracking, LWT3’s biometric signalling devices, VR technology and a Yamaha Disklavier piano.
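To make the data flow concrete, here is a minimal sketch of such a capture loop: labelled marker positions and a normalised muscle-effort value are combined into timestamped frames that drive the avatar. The stream readers and the avatar update are stand-ins returning dummy data, since the article does not describe the actual interfaces of the tracking system or biosensor.

```python
import time
from dataclasses import dataclass

@dataclass
class CaptureFrame:
    timestamp: float                                # seconds since capture start
    markers: dict[str, tuple[float, float, float]]  # marker label -> x, y, z in metres
    muscle_effort: float                            # normalised effort value, 0.0-1.0

def read_marker_frame() -> dict[str, tuple[float, float, float]]:
    # Stand-in for the motion-tracking stream: a real system would push
    # labelled marker positions captured by the infrared cameras.
    return {"wrist_r": (0.42, 1.05, 0.30), "wrist_l": (0.18, 1.02, 0.31)}

def read_muscle_effort() -> float:
    # Stand-in for the biosensor stream (e.g. a smoothed, normalised signal).
    return 0.35

def update_avatar(frame: CaptureFrame) -> None:
    # Stand-in for the rendering side: retarget marker positions onto the
    # avatar's skeleton and let muscle effort modulate its expressiveness.
    print(f"t={frame.timestamp:5.2f}s  effort={frame.muscle_effort:.2f}  "
          f"wrist_r={frame.markers['wrist_r']}")

def capture_loop(duration_s: float = 1.0, rate_hz: float = 30.0) -> None:
    period = 1.0 / rate_hz
    start = time.monotonic()
    while (now := time.monotonic() - start) < duration_s:
        update_avatar(CaptureFrame(now, read_marker_frame(), read_muscle_effort()))
        time.sleep(period)  # crude pacing; a real pipeline would be event-driven

if __name__ == "__main__":
    capture_loop()
```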
From this perspective of visualising physical measurements, Giusy Caruso and LWT3 also developed the idea of creating digitised performances. For example, she plays Steve Reich’s Piano Phase for two pianos by interacting with an avatar pianist who plays the first part of the piece. This second virtual human is animated by the real pianist’s expressive movements, recorded in advance together with an audio track on a Disklavier piano.
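Conceptually, such a digitised duet amounts to replaying a pre-recorded stream of motion frames in sync with the recorded part played back on the Disklavier. The sketch below illustrates that idea with the mido MIDI library; the file name, port name and rendering hook are placeholders rather than details taken from the actual production.

```python
import threading
import time
import mido  # the Disklavier appears as a regular MIDI output port

def play_part_one(midi_path: str, port_name: str) -> None:
    """Stream the pre-recorded first part to the Disklavier."""
    with mido.open_output(port_name) as port:
        for msg in mido.MidiFile(midi_path).play():  # .play() sleeps between messages
            port.send(msg)

def render_avatar(frame: dict) -> None:
    # Placeholder for the VR/graphics engine that poses the avatar pianist.
    print(f"avatar pose at t={frame['time']:.2f}s")

def replay_motion(frames: list[dict], start: float) -> None:
    """Replay recorded motion-capture frames on the same clock as the playback."""
    for frame in frames:
        delay = frame["time"] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        render_avatar(frame)

if __name__ == "__main__":
    # Dummy 20 Hz motion recording; in practice this comes from the capture session.
    frames = [{"time": i * 0.05} for i in range(100)]
    start = time.monotonic()
    threading.Thread(target=play_part_one,
                     args=("piano_phase_part1.mid", "Disklavier"),  # placeholder names
                     daemon=True).start()
    replay_motion(frames, start)
```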

Dr. Giusy Caruso, concert pianist and researcher at the Royal Conservatoire Antwerp (credit: Giusy Caruso)
The project highlights that we are in an era with an increasing focus on hybrid approaches (physical + digital) in hybrid spaces (real + virtual). Whether they like it or not, people have become accustomed to living in virtual environments and communicating via smartphones or laptops. With the development of AR/VR projections, people can now also create avatar interactions in the ‘metaverse’ (a virtual world with avatars and tokens). This metaverse is only at the beginning of its development: it still holds many mysteries and uncertainties, but it is taking shape step by step.
Header photo: Wannes Cré