Palle Dahlstedt and Per Anders Nilsson, electronics
<a href="http://onsight.id.gu.se/%7Epalle/pmwiki/pmwiki.php?n=Works.DuoPantoMorf">pantoMorf</a> play electronic free improv. That is, we perform improvised electronic music as musicians, NOT looking like we are checking our email on stage. Our main rule is: if we take our hands away, the instruments go quiet. We use no fancy sensors or esoteric gestural controllers, just very basic equipment that we know how to play well. But we develop new ways of playing it, and - most importantly - new ways of mapping it to sound, using carefully designed sound engines that allow fingertip control while retaining a vast sonic potential. Every sound relates to and comes directly from a physical gesture by the player, which makes a huge difference for the audience. There are no ongoing pre-programmed processes, and everything is free improvisation, mostly non-beat-based. If there is a beat, it is played by us. The main question is: How can we explore and control complex electronic sound spaces in improvisation, retaining the millisecond interaction that is taken for granted in acoustic improvisation but has somehow been lost in electronic music?