Multimodal Looper: A Live-Looping System for Gestural and Audio-visual Improvisation

Published in Proc. MOCO 2024, Utrecht, The Netherlands.

Recommended citation: Pelin Kiliboz and Cumhur Erkut. 2024. “Multimodal Looper: A Live-Looping System for Gestural and Audio-visual Improvisation.” In Proc. MOCO 2024, Utrecht, The Netherlands.


We present the Multimodal Looper, an embodied digital interface that connects body movements to audiovisual forms for musical improvisation. It extends the performative practice of the musical looper, which enables musicians to record and play back multiple layers of sound in real time. With the Multimodal Looper, we explore music cognition from an embodied perspective, considering cross-modal correspondences in order to achieve an intuitive method suitable for collaboration. Our goal is to create a live-looping system with multimodal objects that are gesturally activated and visually represented. Our first prototype focused on the essential modules of a live-looping system: for gesture recognition, we used a depth camera and a decision tree; for visuals that correspond to distinct categories of sounds, e.g., sustained, iterative, and impulsive, we employed procedural generation techniques such as animated noise, feedback loops, instancing, and various mathematical operations. The system’s effectiveness in establishing cross-modal correspondences for a multisensory experience was evaluated through user testing.
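To give a flavor of the gesture-recognition module, here is a minimal sketch of mapping a window of depth-camera motion data onto the three sound categories named above (sustained, iterative, impulsive) with a hand-written decision tree. The feature choices (`extract_features`) and thresholds are purely illustrative assumptions, not the trained model from the paper.

```python
def extract_features(speeds):
    """Mean and variance of hand speed over a window of depth-camera frames."""
    n = len(speeds)
    mean = sum(speeds) / n
    var = sum((s - mean) ** 2 for s in speeds) / n
    return mean, var


def classify_gesture(speeds):
    """Toy decision tree over motion features (illustrative thresholds):
    a short high-variance burst -> impulsive; fast but steady motion ->
    iterative; slow, steady motion -> sustained."""
    mean, var = extract_features(speeds)
    if var > 0.5:
        return "impulsive"
    if mean > 0.3:
        return "iterative"
    return "sustained"


# A slow, steady hand movement falls into the "sustained" category.
print(classify_gesture([0.10, 0.12, 0.11, 0.10]))  # → sustained
```

In the actual system, each classified gesture would then trigger or layer a loop and select the matching procedurally generated visual.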