Movement Through Musical Space as a Generative Model
Description
Generative music systems have long sought to create novel musical content, but they often struggle to capture the nuanced, embodied experience of musical creation. This paper explores a generative model that conceptualizes music as movement through an abstract space, drawing parallels between physical movement and musical gesture. By examining concepts from performance theory, time metaphors, and movement analysis, the research proposes a system that models musical variation as a form of spatial navigation. The approach leverages human cognitive understanding of movement, intention, and spatial relationships to create a generative system that can produce recognizable musical patterns with complex, real-time variations. The proposed model suggests that by understanding music as dynamic movement through temporal and abstract spaces, generative systems can more intuitively reflect human musical perception and performance. Preliminary tests indicate that this approach preserves underlying musical sequences while generating expressive variations, potentially offering a more embodied and responsive approach to computational music generation.
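To make the core idea concrete, the following is a minimal sketch, not the paper's actual system: it assumes a short motif can be represented as a trajectory through an abstract (pitch, dynamics) space, and generates variations by smoothly perturbing that trajectory so the underlying sequence stays recognizable. All names (`BASE_PITCHES`, `vary_trajectory`, the `intensity` parameter) are hypothetical illustrations of the "variation as spatial navigation" framing.

```python
# Conceptual sketch only: a melody as a trajectory through a (pitch, dynamics)
# space, with variations produced by a continuous drift along that trajectory.
import math
import random

# Hypothetical base sequence: MIDI pitches and velocities for a short motif.
BASE_PITCHES = [60, 62, 64, 65, 67, 65, 64, 62]      # C major fragment
BASE_VELOCITIES = [64, 70, 76, 80, 84, 78, 72, 66]

def vary_trajectory(pitches, velocities, intensity=0.3, seed=None):
    """Generate one expressive variation of the base trajectory.

    intensity in [0, 1] scales how far the variation may drift from the
    original path; the overall contour (direction of movement) is preserved.
    """
    rng = random.Random(seed)
    phase = rng.uniform(0, 2 * math.pi)
    varied = []
    for i, (pitch, vel) in enumerate(zip(pitches, velocities)):
        # A slow sinusoidal drift models a continuous "gesture" through the space.
        drift = math.sin(phase + 2 * math.pi * i / len(pitches))
        new_pitch = pitch + round(intensity * 2 * drift)               # small pitch inflection
        new_vel = max(1, min(127, round(vel + intensity * 30 * drift)))  # dynamic shaping
        varied.append((new_pitch, new_vel))
    return varied

if __name__ == "__main__":
    # Each seed yields a different but recognizably related traversal of the motif.
    for pitch, vel in vary_trajectory(BASE_PITCHES, BASE_VELOCITIES, intensity=0.5, seed=42):
        print(f"pitch={pitch:3d} velocity={vel:3d}")
```

Because the perturbation is a smooth function of position along the sequence rather than independent noise per note, the variation reads as a single gesture through the space, which is the intuition the abstract describes.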
Files
CMMR2025_P2_5.pdf (355.4 kB)
md5:fc6f32c314d8fda6d80dfd6dc6473e43