Interactive music system based on pitch quantization and tonal navigation
Description
Recent developments in the field of human-computer interaction have led to new ways of making music with digital instruments. The sensors available in devices such as smartphones and touch tablets pose an interesting challenge for music technology: how can they be used in a meaningful and musical way? One-to-one control over parameters such as pitch, timbre, and amplitude is no longer required, making it possible to play music at a different level of abstraction. In "A Generative Theory of Tonal Music", Lerdahl and Jackendoff combine Heinrich Schenker's theory of music with Noam Chomsky's linguistics to explain how tonal music is organized and structured. This idea opens up the possibility of investigating ways to manipulate a "wider" aspect of music through an assisted interactive musical system that takes into account the "principles", or common knowledge, used in music composition and performance. Numerous publications in music theory address how the tonal aspect of music works: owing to psychoacoustics and cultural inheritance, there is a common base for how chords and scales are formed and used, and how they are organized in terms of hierarchies, movement tendencies, and tonal functions.
This project deals with the design of a graphical representation of the use of tonality in jazz and popular music, organizing chords and scales in a hierarchical, semantic, and practical way. This representation, which we call the "tonal map", is then used as the graphical interface of an interactive system for manipulating different data types in real time. The process consists of adjusting note values (pitches) using the set of rules that characterizes each sector of the graphical representation; the system outputs the "tonally quantized" version of the input data, with the rule set the user chooses through the graphical interface acting as the transformation parameter. The graphical model was constructed using different methods for learning modern harmony, also taking into account models of music perception and cognition. The interactive system was implemented as a Max for Live patch that works as a MIDI effect: input data is first transformed into MIDI note events, which are then modified according to the user's selection on the tonal map. The hardware interface uses a Wacom tablet with a printed version of the tonal map, so the interaction can be seen as a tonality selector for real-time transformations of MIDI data. The prototype can also be used with several inputs at the same time, allowing collaborative playing.
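To illustrate the kind of "tonal quantization" described above, the following is a minimal sketch, assuming a nearest-pitch-class policy and example sector definitions; the names (SECTORS, quantize_pitch) and the tie-breaking rule are illustrative assumptions, not the implementation used in the thesis.

```python
# Hypothetical sketch: incoming MIDI pitches are snapped to the nearest
# pitch allowed by the scale/chord rules of the tonal-map sector the
# user has selected. Sector names and contents below are examples only.

# Pitch classes (0 = C) permitted by two example tonal-map sectors.
SECTORS = {
    "C major / Ionian": {0, 2, 4, 5, 7, 9, 11},
    "C7 / Mixolydian":  {0, 2, 4, 5, 7, 9, 10},
}

def quantize_pitch(midi_note: int, allowed_pitch_classes: set) -> int:
    """Return the allowed MIDI note closest to midi_note (ties resolve downward)."""
    if midi_note % 12 in allowed_pitch_classes:
        return midi_note
    # Search outward in semitones, preferring the smaller correction.
    for offset in range(1, 7):
        if (midi_note - offset) % 12 in allowed_pitch_classes:
            return midi_note - offset
        if (midi_note + offset) % 12 in allowed_pitch_classes:
            return midi_note + offset
    return midi_note  # empty sector: pass the note through unchanged

# Example: an F# (MIDI 66) played while "C major / Ionian" is selected snaps to F (65).
if __name__ == "__main__":
    selected = SECTORS["C major / Ionian"]
    for note in (60, 61, 66, 70):
        print(note, "->", quantize_pitch(note, selected))
```

In the actual system this transformation is applied per incoming MIDI note event inside the Max for Live patch, with the active rule set determined by the sector of the tonal map the performer points to on the Wacom tablet.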
Files
2011-Aldrey-Leonardo-Master-thesis.pdf (9.7 MB)
md5:1ec84b0dbb6c169bc48e69ddcf116970