Conference paper Open Access

MINDMIX: Mapping of brain activity to congruent audio mixing features

Williams, Duncan A.H.


Citation Style Language JSON Export

{
  "publisher": "Zenodo", 
  "DOI": "10.5281/zenodo.4813408", 
  "container_title": "Proceedings of the International Conference on New Interfaces for Musical Expression", 
  "title": "MINDMIX: Mapping of brain activity to congruent audio mixing features", 
  "issued": {
    "date-parts": [
      [
        2020, 
        6, 
        1
      ]
    ]
  }, 
  "abstract": "Brain-computer interfacing (BCI) offers novel methods to facilitate participation in audio engineering, providing access for individuals who might otherwise be unable to take part (either due to lack of training, or physical disability). This paper describes the development of a BCI system for conscious, or 'active', control of parameters on an audio mixer by generation of synchronous MIDI Machine Control messages. The mapping between neurophysiological cues and audio parameter must be intuitive for a neophyte audience (i.e., one without prior training or the physical skills developed by professional audio engineers when working with tactile interfaces). The prototype is dubbed MINDMIX (a portmanteau of 'mind' and 'mixer'), combining discrete and many-to-many mappings of audio mixer parameters and BCI control signals measured via Electronecephalograph (EEG). In future, specific evaluation of discrete mappings would be useful for iterative system design.", 
  "author": [
    {
      "family": "Williams, Duncan A.H."
    }
  ], 
  "id": "4813408", 
  "event-place": "Birmingham, UK", 
  "type": "paper-conference", 
  "event": "International Conference on New Interfaces for Musical Expression", 
  "page": "349-352"
}
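
The abstract describes generating synchronous MIDI Machine Control (MMC) messages from EEG-derived control signals, combining discrete and many-to-many mappings onto mixer parameters. As an illustrative sketch only, and not the published MINDMIX implementation, the fragment below builds an MMC transport command from a discrete (thresholded) EEG cue and a channel-volume control change from a continuous band-power estimate; the thresholding, controller number, scaling, and use of the third-party mido library are assumptions made for illustration.

# Hedged sketch: discrete and continuous EEG-to-MIDI mappings of the kind the
# abstract describes. Thresholds, controller choice, and the `mido` library are
# illustrative assumptions, not the published MINDMIX mapping.
import mido

MMC_STOP, MMC_PLAY = 0x01, 0x02  # standard MMC command bytes

def discrete_transport(eeg_cue: bool) -> mido.Message:
    """Map a thresholded (binary) EEG cue to an MMC play/stop system-exclusive message."""
    command = MMC_PLAY if eeg_cue else MMC_STOP
    # MMC frame: F0 7F <device-id> 06 <command> F7; mido adds the F0/F7 bytes itself.
    return mido.Message('sysex', data=[0x7F, 0x7F, 0x06, command])

def continuous_fader(band_power: float, channel: int = 0) -> mido.Message:
    """Map a normalised (0-1) EEG band-power estimate to channel volume (CC#7)."""
    value = int(round(min(max(band_power, 0.0), 1.0) * 127))
    return mido.Message('control_change', channel=channel, control=7, value=value)

print(discrete_transport(True))   # sysex MMC play command to all devices
print(continuous_fader(0.6))      # control_change channel=0 control=7 value=76

In a mapping of this kind, the discrete path corresponds to one-to-one cue-to-command assignments, while the continuous path is the sort of signal that could participate in the many-to-many parameter mappings the abstract mentions.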
