This is the second of two days of public talks and discussions on HCI and gesture-related research in computer music, jointly hosted by CNMAT and the Berkeley Institute of Design (BID).

***Note: This event takes place in Jacobs Hall, room 210***

Program:

Jérémie Garcia (Goldsmiths, University of London):
Interactive tools to support music composition

Music composition is a highly creative activity that usually combines writing on paper, instrumental playing (whether physical or digital), and interaction with computer-aided environments. These environments offer advanced computational possibilities but remain far more limited in terms of interaction. Designing computer-aided composition tools requires not only attention to technical aspects, such as speeding up the input of musical scores or improving sound synthesis algorithms, but also support for the creative aspects of music composition. We still need models and tools tailored to the inherent complexity of music composition that can help composers iteratively design and assess their musical ideas.
In this presentation, I will describe work conducted in close collaboration with composers to understand the composition process and to design new interactive tools that support their creative needs. The first part will focus on interactive paper, a technology that captures notes written with a pen on paper, as a way to extend existing computer-aided composition tools with additional space for the expression and exploration of musical ideas. The second part will focus on the production and manipulation of temporal control data for sound spatialisation, a technique that creates the illusion that sound sources come from various directions in space.
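The abstract does not specify a particular spatialisation algorithm, but the underlying idea of temporal control data can be illustrated with the simplest case: equal-power stereo panning, where a time-varying azimuth trajectory positions a mono source between two speakers. The function name, signal, and trajectory below are all hypothetical, chosen only to make the sketch self-contained:

```python
import numpy as np

def equal_power_pan(mono, azimuth):
    """Equal-power stereo panning: a minimal illustration of how a
    time-varying control parameter (azimuth) can place a mono source
    between two speakers. azimuth is in [-1, 1], where -1 is hard left."""
    # Map azimuth to an angle in [0, pi/2] and use cos/sin gains so
    # total power stays constant as the source moves.
    theta = (azimuth + 1.0) * np.pi / 4.0
    left = np.cos(theta) * mono
    right = np.sin(theta) * mono
    return np.stack([left, right], axis=-1)

# A one-second, 440 Hz tone whose position sweeps from left to right;
# the linspace array plays the role of the temporal control data.
sr = 44100
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
trajectory = np.linspace(-1.0, 1.0, sr)
stereo = equal_power_pan(tone, trajectory)
```

Richer spatialisation techniques (multichannel panning, ambisonics) generalize the same principle: the composer authors trajectories over time, and the rendering engine turns them into per-channel gains.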

Jules Françoise (Simon Fraser University):
Machine Learning for User-Centered Motion-Sound Interaction Design

Designing the relationship between motion and sound is essential to the creation of interactive systems for music, the performing arts, and somatic practices. This presentation will outline the Mapping-by-Demonstration approach: a conceptual, experiential, and computational framework that allows users to craft interactive technologies through embodied demonstrations of personal associations between movement and sound. We will present the concepts, models, and applications of this framework for interactive sonification, and discuss the potential and challenges of machine learning for user-centered design rather than problem solving.
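The published Mapping-by-Demonstration work relies on probabilistic models; the sketch below substitutes a deliberately simple k-nearest-neighbor regressor (a stand-in, not the framework's actual model) to show the workflow: record paired motion and sound frames during a demonstration, fit a model, then map new motion to sound parameters at performance time. The feature names and dimensions are hypothetical:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# "Demonstration": the user performs a gesture while a sound plays, and
# we record paired frames of motion features and synthesis parameters.
# Random data stands in for a real recording here.
rng = np.random.default_rng(0)
motion_frames = rng.random((500, 3))   # e.g. accelerometer x, y, z
sound_params = rng.random((500, 2))    # e.g. filter cutoff, gain

# "Training": fit a regression model on the demonstrated pairs.
model = KNeighborsRegressor(n_neighbors=5)
model.fit(motion_frames, sound_params)

# "Performance": incoming motion frames are mapped to sound parameters
# by the learned model rather than by a hand-designed mapping.
new_frame = rng.random((1, 3))
cutoff, gain = model.predict(new_frame)[0]
```

The point of the approach is that the mapping is authored by example rather than by explicit programming, so the model's role is to generalize from the user's own demonstrations.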

=========

Jérémie Garcia is a postdoctoral research and teaching fellow in the Department of Computing at Goldsmiths, University of London, UK. He is interested in studying and designing interactive systems that actively support creative users in their work. He has worked at Inria and Ircam on interactive paper, combining free expression on paper with computer-aided composition tools, and on interactive tools for controlling sound spatialisation.

Jules Françoise is a postdoctoral researcher at the School of Interactive Arts and Technology (SIAT) at Simon Fraser University (SFU). His research intersects machine learning and human-computer interaction for expressive movement analysis, recognition, synthesis, and mapping to other modalities in interactive systems. He holds a PhD in computer science from Université Pierre et Marie Curie, which he completed in the {Sound Music Movement} Interaction team at Ircam. He is a member of the steering committee of the International Workshop on Movement and Computing (MOCO).


Friday, May 13, 2016, 6:30pm to 8:00pm