Conference paper · Open Access

Tweaking Parameters, Charting Perceptual Spaces

Iván Paz; Sam Roig Torrubiano

Live coding builds and conducts sound through real-time intervention in parametric devices such as synthesizers. Coding a piece on the fly requires bridging the cognitive gap created by devices' large parameter spaces and the nonlinear sound variation built into them. One approach is to keep a set of preselected parameter combinations whose aural results are known as a starting point for the performance. However, collecting and memorizing many combinations is time consuming, and using only a few can result in monotony. It is therefore useful to develop models that reduce a device's operational complexity in order to ease particular musical tasks, such as the automatic creation of variations. Here, a rule-based approach is described that models the relationships between combinations of parameter values and the perceptual categories assigned to them. The extracted rules can be used on the fly either to reproduce the labeled parameter combinations (calling them by their class, as a set of presets) or to obtain new, unheard combinations that the model predicts to be consistent with a selected category. The rules are human-readable and describe how parameter combinations relate to perceptual categories. Concrete examples of using the system to select material and create the structure of two pieces are presented and discussed.
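To make the idea concrete, here is a minimal sketch of how such human-readable rules might look and be used. The parameter names (`cutoff`, `resonance`), category labels, and value ranges are all invented for illustration; the paper's actual rule-extraction method is not reproduced here.

```python
import random

# Hypothetical rules: each perceptual category maps synth parameters
# to (low, high) ranges. Names and thresholds are illustrative only.
RULES = {
    "bright": {"cutoff": (2000, 8000), "resonance": (0.1, 0.5)},
    "dark":   {"cutoff": (100, 800),   "resonance": (0.4, 0.9)},
}

def classify(combo):
    """Return the categories whose rules the parameter combination satisfies."""
    return [
        label
        for label, rule in RULES.items()
        if all(lo <= combo[p] <= hi for p, (lo, hi) in rule.items())
    ]

def variation(label, rng=random):
    """Sample a new, possibly unheard combination predicted to match `label`."""
    rule = RULES[label]
    return {p: rng.uniform(lo, hi) for p, (lo, hi) in rule.items()}

# A freshly sampled variation satisfies the rule it was drawn from.
combo = variation("bright", random.Random(0))
print(classify(combo))  # → ['bright']
```

In a performance, `variation("bright")` could be called on the fly to produce material consistent with a chosen category, while the `RULES` table itself stays readable enough to inspect and tweak live.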

Files: paper94.pdf (466.3 kB, md5:12f005bf2830ddeec299d3e2b32510b8)