Conference paper Open Access

Semi-Automated Mappings for Object-Manipulating Gestural Control of Electronic Music

de las Pozas, Virginia


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="DOI">10.5281/zenodo.4813232</identifier>
  <creators>
    <creator>
      <creatorName>de las Pozas, Virginia</creatorName>
      <givenName>Virginia</givenName>
      <familyName>de las Pozas</familyName>
    </creator>
  </creators>
  <titles>
    <title>Semi-Automated Mappings for Object-Manipulating Gestural Control of Electronic Music</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2020</publicationYear>
  <contributors>
    <contributor contributorType="Editor">
      <contributorName>Michon, Romain</contributorName>
      <givenName>Romain</givenName>
      <familyName>Michon</familyName>
    </contributor>
    <contributor contributorType="Editor">
      <contributorName>Schroeder, Franziska</contributorName>
      <givenName>Franziska</givenName>
      <familyName>Schroeder</familyName>
    </contributor>
  </contributors>
  <dates>
    <date dateType="Issued">2020-06-01</date>
  </dates>
  <resourceType resourceTypeGeneral="ConferencePaper"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/4813232</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="ISSN" relationType="IsPartOf">2220-4806</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.5281/zenodo.4813231</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf">https://zenodo.org/communities/nime_conference</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="https://creativecommons.org/licenses/by/4.0/legalcode">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">This paper describes a system for automating the generation of mapping schemes between human interaction with extramusical objects and electronic dance music. These mappings are determined through the comparison of sensor input to a synthesized matrix of sequenced audio. The goal of the system is to facilitate live performances that feature quotidian objects in place of traditional musical instruments. The practical and artistic applications of musical control with quotidian objects are discussed. The associated object-manipulating gesture vocabularies are mapped to musical output so that the objects themselves may be perceived as DMIs. This strategy is used in a performance to explore the liveness qualities of the system.</description>
  </descriptions>
</resource>
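
The export above follows the DataCite Metadata Schema (kernel-4). As a minimal sketch, assuming the XML has been saved locally under the hypothetical file name record.xml, the core citation fields can be pulled out with Python's standard library:

# Minimal sketch: extract basic citation fields from the DataCite 4.1 XML
# export above. The file name "record.xml" is an assumption; save the
# export under any name and point the parser at it.
import xml.etree.ElementTree as ET

# All elements live in the DataCite kernel-4 default namespace.
NS = {"dc": "http://datacite.org/schema/kernel-4"}

root = ET.parse("record.xml").getroot()

doi = root.findtext("dc:identifier", namespaces=NS)
title = root.findtext("dc:titles/dc:title", namespaces=NS)
year = root.findtext("dc:publicationYear", namespaces=NS)
creators = [
    c.findtext("dc:creatorName", namespaces=NS)
    for c in root.findall("dc:creators/dc:creator", namespaces=NS)
]

print(f"{'; '.join(creators)} ({year}). {title}. DOI: {doi}")

Run against the record above, this prints a one-line citation containing the author, publication year, title, and DOI.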
                   All versions   This version
Views                       114            114
Downloads                    75             75
Data volume             35.5 MB        35.5 MB
Unique views                 96             96
Unique downloads             68             68
