Recent developments in Augmented Reality (AR) and Virtual Reality (VR) are opening up unprecedented forms of representation and interaction.

Virtual bodies, whose behavior is arbitrary and need not obey physical laws, can be linked to arbitrary functions, ranging from real-time graphic representation to control interfaces and virtual instrument design. Furthermore, these aspects do not have to be assigned to different objects; they can reside in the same body. Notation, sound, gesture and their interaction can therefore be collapsed into a single entity, providing new forms of musical expression.

I am exploring these possibilities from three main perspectives:

LINEAR: an environment for real-time composition and performance based on the combination of AR and machine learning. The application allows a performer using an iPhone or HTC Vive to draw persistent gestures in the air. These gestures are linked to specific sounds and musical profiles (the results can range from simple melodies to advanced real-time sound synthesis). The AR drawings are mirrored to a projector and read as a form of graphic notation by the other performers.
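To illustrate one way such gesture-to-sound linking could work (a hypothetical sketch, not LINEAR's actual implementation): a stroke drawn in the air can be resampled to a fixed number of points, normalized for position and size, and matched by nearest neighbor against template gestures, each associated with a sound label.

```python
import math

def resample(points, n=16):
    """Resample a 3-D stroke to n points evenly spaced along its path."""
    dists = [0.0]
    for a, b in zip(points, points[1:]):
        dists.append(dists[-1] + math.dist(a, b))
    total = dists[-1] or 1.0
    out = []
    for i in range(n):
        target = total * i / (n - 1)
        j = 1
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        t = (target - dists[j - 1]) / ((dists[j] - dists[j - 1]) or 1.0)
        out.append(tuple(p + t * (q - p) for p, q in zip(points[j - 1], points[j])))
    return out

def normalize(points):
    """Translate the stroke's centroid to the origin and scale to unit size."""
    n = len(points)
    cx, cy, cz = (sum(p[i] for p in points) / n for i in range(3))
    centered = [(x - cx, y - cy, z - cz) for x, y, z in points]
    scale = max(math.dist((0, 0, 0), p) for p in centered) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in centered]

def classify(stroke, templates):
    """Return the sound label of the nearest template gesture."""
    q = normalize(resample(stroke))
    def dist_to(template):
        return sum(math.dist(a, b)
                   for a, b in zip(q, normalize(resample(template))))
    return min(templates, key=lambda label: dist_to(templates[label]))

# hypothetical templates: a horizontal stroke and a rising diagonal
templates = {
    "drone": [(0, 0, 0), (1, 0, 0)],
    "gliss": [(0, 0, 0), (1, 1, 0)],
}
# a nearly flat stroke matches the "drone" gesture
print(classify([(0, 0, 0), (0.5, 0.02, 0), (1.0, 0.01, 0)], templates))
```

A production system would use a richer classifier, but the pipeline (resample, normalize, match) is a common baseline for gesture recognition.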

SYNESTHESIZER: a timbre synthesizer based on color recognition and machine learning. It is inspired by the principle of synesthesia, in which perceptions through one sense involuntarily extend to other senses. More specifically, it addresses chromesthesia (also known as sound-to-color synesthesia), the involuntary experience of a color provoked by a heard sound. The main idea of the SYNESTHESIZER is to use the continuous spectrum of colors to explore a continuous spectrum of sounds. Its implementation in VR space will also allow a more engaging interaction, with the creation of virtual objects in real time. The tool is simple and intuitive, and can be used by untrained musicians.
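One way to realize such a continuous color-to-sound mapping, sketched under assumed parameter choices (the SYNESTHESIZER's actual mapping may differ): convert RGB to HSV and map hue, saturation and value continuously to pitch, harmonic richness and amplitude.

```python
import colorsys

def color_to_synth_params(r, g, b):
    """Map an RGB color (floats in 0-1) to continuous synthesis parameters.

    Assumed mapping (one of many possible):
      hue        -> pitch, swept continuously over one octave from 220 Hz
      saturation -> harmonic richness (0 = pure sine, 1 = bright)
      value      -> amplitude
    """
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    freq = 220.0 * 2 ** h   # continuous pitch, not quantized to a scale
    return {"freq": freq, "brightness": s, "amp": v}

# pure red has hue 0, so it maps to the base frequency of 220 Hz
print(color_to_synth_params(1.0, 0.0, 0.0))
```

Because hue is a continuous value, nearby colors produce nearby pitches, which is what lets the color continuum drive a sound continuum rather than a discrete palette of presets.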

ARScore: This study explores a new way to notate extended techniques on instruments such as the piano or percussion instruments like the gong, tam-tam and thundersheet. The application represents gestures, through holograms projected over the instrument's surface, that are mapped to specific sonic results. It offers an easier and far more precise way of representing gestures on instruments, avoiding the overcomplex notation on paper that certain extended techniques require.


PORTALE, for live electronics and Augmented Reality environment


In this multimedia composition, the small tam-tam is both a musical instrument and a portal. It is the bridge between the virtual and the real world: the door from which a creature emerges, an agile and impalpable being. The structure of the composition is articulated according to the different forms of relationship the creature establishes with the real performer: it can be a character (moving around and almost acting), a performer (playing music by interacting with virtual objects and interfaces), or a form of notation ("Follow me," it asks the performer, and then indicates what to do on the tam-tam); in some cases, it follows the gesture of the performer, like a "tail" drawn in the air. The real and virtual worlds are merged here in different ways, with different balances between the two opposites; we see both sides of the portal.

This piece is the result of research into new modes of music performance through Augmented Reality, where the combination of real-time 3D graphics and motion capture has the potential to redesign our perception of reality, our representation of it and our capability to interact with the environment. Although differentiated, all the sounds are obtained from the tam-tam (or, in a few cases, other metallophones) through multiple forms of sound processing that can distort a timbre until it is no longer recognizable. The sound itself therefore reflects a continuously crafted relationship between virtual and real. All the sounds and visuals (including camera switches) are generated live.


More information at:
