Listening to Your Brain: Implicit Interaction in Collaborative Music Performances
Description
The use of physiological signals in Human Computer Interaction (HCI) is becoming popular and widespread, mostly due to sensor miniaturization and advances in real-time processing. However, most studies of physiology-based interaction focus on single-user paradigms, and its use in collaborative scenarios is still in its infancy. In this paper we explore how interactive sonification of brain and heart signals, and their representation through physical objects (physiopucks) on a tabletop interface, may enhance motivational and control aspects of music collaboration. A multimodal system is presented, based on an electrophysiology sensor system and the Reactable, a musical tabletop interface. Performance and motivation variables were assessed in an experiment involving a test "Physio" group (N=22) and a control "Placebo" group (N=10). Pairs of participants used two methods for sound creation: implicit interaction through physiological signals, and explicit interaction by means of gestural manipulation. The results showed that pairs in the Physio group reported less difficulty, higher confidence and more symmetric control than the Placebo group, in which no real-time sonification was provided: participants were unknowingly driving the system with pre-recorded physiological signals. These results support the feasibility of introducing physiology-based interaction in multimodal interfaces for collaborative music generation.
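For readers unfamiliar with physiology-to-sound mapping, the sketch below illustrates one plausible way implicit control of this kind could be parameterized: heart inter-beat intervals drive a tempo offset and EEG alpha-band power drives a filter cutoff. The feature names, mapping ranges, and the function `sonification_parameters` are illustrative assumptions for this page, not the actual pipeline described in the paper.

```python
import numpy as np

def sonification_parameters(ibi_ms, alpha_power, base_tempo=90.0):
    """Map physiological features to sound-control parameters (illustrative only).

    ibi_ms      -- recent inter-beat intervals from the heart signal, in ms
    alpha_power -- relative EEG alpha-band power, expected in [0, 1]
    Returns a (tempo_bpm, cutoff_hz) pair for a hypothetical synth voice.
    """
    heart_rate = 60000.0 / float(np.mean(ibi_ms))     # beats per minute
    tempo_bpm = base_tempo + (heart_rate - 70.0)      # tempo follows heart rate
    cutoff_hz = 200.0 + 4000.0 * float(np.clip(alpha_power, 0.0, 1.0))
    return tempo_bpm, cutoff_hz

# Example with simulated data (no sensor hardware required)
ibi = np.random.normal(850, 30, size=8)               # roughly 70 bpm at rest
tempo, cutoff = sonification_parameters(ibi, alpha_power=0.4)
print(f"tempo {tempo:.1f} bpm, filter cutoff {cutoff:.0f} Hz")
```

In a real system these parameters would be streamed continuously to the synthesis engine (for example via OSC messages to the tabletop), whereas the gestural manipulation of the pucks would remain the explicit control channel.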
Files
nime2011_149.pdf (2.2 MB)