Conference paper Open Access

MINDMIX: Mapping of brain activity to congruent audio mixing features

Williams, Duncan A.H.

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:contributor>Michon, Romain</dc:contributor>
  <dc:contributor>Schroeder, Franziska</dc:contributor>
  <dc:creator>Williams, Duncan A.H.</dc:creator>
  <dc:description>Brain-computer interfacing (BCI) offers novel methods to facilitate participation in audio engineering, providing access for individuals who might otherwise be unable to take part (either due to lack of training, or physical disability). This paper describes the development of a BCI system for conscious, or 'active', control of parameters on an audio mixer by generation of synchronous MIDI Machine Control messages. The mapping between neurophysiological cues and audio parameters must be intuitive for a neophyte audience (i.e., one without prior training or the physical skills developed by professional audio engineers when working with tactile interfaces). The prototype is dubbed MINDMIX (a portmanteau of 'mind' and 'mixer'), combining discrete and many-to-many mappings of audio mixer parameters and BCI control signals measured via electroencephalograph (EEG). In future, specific evaluation of discrete mappings would be useful for iterative system design.</dc:description>
  <dc:title>MINDMIX: Mapping of brain activity to congruent audio mixing features</dc:title>
</oai_dc:dc>
All versions / This version
Views: 55 / 55
Downloads: 26 / 26
Data volume: 7.6 MB / 7.6 MB
Unique views: 43 / 43
Unique downloads: 22 / 22
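The abstract describes mapping EEG-derived control signals to synchronous MIDI messages that drive audio mixer parameters. The sketch below illustrates the general idea of one such discrete mapping; it is an assumption-laden simplification, not the paper's implementation. In particular, the paper uses MIDI Machine Control (SysEx-based) messages, whereas this stand-in emits a plain MIDI Control Change message, and the function name, normalised band-power input, and volume-controller choice are all hypothetical.

```python
def eeg_to_midi_cc(band_power: float, channel: int = 0, controller: int = 7) -> bytes:
    """Map a normalised EEG band power (0.0-1.0) to a 3-byte MIDI
    Control Change message (controller 7 = channel volume).

    Illustrative stand-in only: MINDMIX itself generates MIDI Machine
    Control messages; this sketch shows the shape of a discrete
    brain-signal-to-mixer-parameter mapping.
    """
    if not 0.0 <= band_power <= 1.0:
        raise ValueError("band_power must be normalised to [0, 1]")
    value = round(band_power * 127)       # scale to the 7-bit MIDI data range
    status = 0xB0 | (channel & 0x0F)      # Control Change status byte + channel
    return bytes([status, controller & 0x7F, value])

# Example: full relative band power moves the fader to maximum
msg = eeg_to_midi_cc(1.0)                 # b'\xb0\x07\x7f'
```

A discrete mapping like this keeps one neural feature tied to one mixer parameter, which matches the paper's concern that mappings remain intuitive for untrained users; the many-to-many case would combine several such features per parameter.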

