Conference paper Open Access

live coding: sound – gesture – algorithm

Ian Jarvis

Text-based musical live coding (Collins et al., 2003) is approached from the notion of gesture as understood in embodied music cognition and sound-based composition, so as to propose a framework for sound, movement and algorithms from a combined embodied-epistemic position. Live coding is viewed as an extension of multi-scale, studio-based sound practices (Roads, 2015), for which human listening and machine listening (Collins, 2015; Van Nort, 2013) are the basis for intervention during the development process, yet here positioned within the temporal framework of a performance. The programming language is an interface (Blackwell & Aaron, 2015) to a digital instrumental system that is understood as an epistemic tool (Magnusson, 2009) and that presumes the potential of various forms of machine agency (Brown, 2016, 2016b; Bown, 2009) and software agents (Whalley, 2009). The necessary formalism(s) of this digital system set up the conditions under which human compositional and improvisational actions are complementary: whatever aspects of the code are not being improvised in the moment are composed/designed beforehand, whether by the performer-programmer(s) or by someone, or some software, else. The code that is executed is both descriptive and prescriptive as a score (Magnusson, 2011), while presenting itself for further updates. Bricolage programming describes the interactive process of writing and executing code, hearing the output, conceptualizing the next move, and so on, as outlined in the process of action and reaction (McLean & Wiggins, 2010). This understanding of live coding presents a distinct approach to the archetypal notion of sound-producing gesture as grounded in embodied music cognition research and developed in sound-based composition.

