Published May 27, 2016 | Version v1 | Conference paper | Open
Processing of symbolic music notation via multimodal performance data: Brian Ferneyhough's Lemma-Icon-Epigram for solo piano, phase 1
Description
In the "Performance Notes" to his formidable solo piano work Lemma-Icon-Epigram, British composer Brian Ferneyhough proposes a top-down learning strategy: its first phase would consist of an "overview of gestural patterning" before delving into the notorious rhythmic intricacies of this most complex notation. In the current paper, we propose a methodology for inferring such patterning from multimodal performance data. In particular, we have a) conducted qualitative analysis of the correlations between the performance data (an audio recording, 12-axis acceleration and gyroscope signals captured by inertial sensors, Kinect video, and MIDI) and the implicit annotation of pitch during a 'sight-reading' performance; b) observed and documented the correspondence between patterns in the gestural signals and patterns in the score annotations; and c) produced joint tablature-like representations, which inscribe the gestural patterning back into the notation while reducing the original pitch material by 70-80%. In addition, we have incorporated this representation into videos and interactive multimodal tablatures using INScore. Our work draws on recent studies in the fields of gesture modeling and interaction. It extends the authors' previous work on an embodied model of navigation of complex notation and on an application for offline and real-time gestural control of complex notation named GesTCom. Future prospects include the probabilistic modeling of gesture-to-notation mappings, towards the design of interactive systems which learn along with the performer while cutting through textual complexity.
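The correlation step described in a) can be illustrated with a minimal sketch. This is not the authors' pipeline; it is a hypothetical example, using synthetic stand-in data, of how a 3-axis acceleration signal might be compared against MIDI note-onset density on a common time grid:

```python
import numpy as np

def onset_density(onset_times, t_grid, window=0.5):
    """Count MIDI note onsets within +/- window/2 s of each grid time."""
    onsets = np.asarray(onset_times)
    return np.array([np.sum(np.abs(onsets - t) <= window / 2) for t in t_grid])

def gesture_energy(accel, fs, t_grid, window=0.5):
    """Mean magnitude of the 3-axis acceleration signal around each grid time."""
    mag = np.linalg.norm(accel, axis=1)
    half = int(window * fs / 2)
    idx = (t_grid * fs).astype(int)
    return np.array([mag[max(i - half, 0):i + half].mean() for i in idx])

# Synthetic stand-in data: 10 s of 3-axis accelerometer samples at 100 Hz,
# with an "active gesture" burst and a cluster of note onsets between 4 and 6 s.
fs = 100
rng = np.random.default_rng(0)
accel = rng.normal(0, 0.1, size=(10 * fs, 3))
accel[4 * fs:6 * fs] += rng.normal(0, 1.0, size=(2 * fs, 3))
onsets = np.concatenate([rng.uniform(4, 6, 30), rng.uniform(0, 10, 5)])

t_grid = np.arange(0.5, 9.5, 0.25)
r = np.corrcoef(gesture_energy(accel, fs, t_grid),
                onset_density(onsets, t_grid))[0, 1]
print(f"Pearson r between gesture energy and onset density: {r:.2f}")
```

A qualitative analysis like the one in the paper would go well beyond such a single coefficient, but the sketch shows the basic alignment of gestural signal features with score-derived events on a shared timeline.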
Files

| Name | Size |
|---|---|
| 18_Antoniadis_tenor2016.pdf (md5:6ef8477e767f389f59602ae844920f90) | 2.5 MB |