Published June 1, 2013
| Version v1
Conference paper
Open
Multi Sensor Tracking for Live Sound Transformation
Creators
Description
This paper demonstrates how to use multiple Kinect(TM) sensors to map a performer's motion to music. We merge skeleton data streams from multiple sensors to compensate for occlusions of the performer. The skeleton joint positions drive the performance via Open Sound Control data. We discuss how to register the different sensors to each other, how to smoothly merge the resulting data streams, and how to map position data in a general framework to the live electronics applied to a chamber music ensemble.
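The following is a minimal sketch, not the authors' implementation, of the pipeline the abstract describes: a second sensor's skeleton stream is registered to the first with a rigid (Kabsch) fit over corresponding joints, the two streams are blended per joint by tracking confidence to ride over occlusions, and the merged joint positions are sent as Open Sound Control messages (here via python-osc). The joint names, OSC address, port, and confidence weighting are illustrative assumptions.

```python
import numpy as np
from pythonosc.udp_client import SimpleUDPClient


def fit_rigid_transform(src, dst):
    """Least-squares rotation R and translation t such that R @ src + t ~ dst.

    src, dst: (N, 3) arrays of corresponding joint positions captured
    while the performer is visible to both sensors (calibration pose).
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def merge_joint(p_a, w_a, p_b, w_b):
    """Confidence-weighted blend of one joint as seen by two sensors."""
    if w_a + w_b == 0.0:              # joint occluded for both sensors
        return None
    return (w_a * p_a + w_b * p_b) / (w_a + w_b)


# --- example usage (hypothetical values) ----------------------------------
client = SimpleUDPClient("127.0.0.1", 9000)       # live-electronics host (assumed)

# Corresponding joints from a short calibration pose, mapping sensor B into
# sensor A's coordinate frame (random arrays stand in for captured data).
calib_a = np.random.rand(15, 3)
calib_b = np.random.rand(15, 3)
R, t = fit_rigid_transform(calib_b, calib_a)

# One frame: the head joint from both sensors with tracking confidences.
head_a, conf_a = np.array([0.1, 1.6, 2.0]), 1.0
head_b, conf_b = R @ np.array([0.4, 1.5, 1.2]) + t, 0.5

merged = merge_joint(head_a, conf_a, head_b, conf_b)
if merged is not None:
    client.send_message("/skeleton/head", merged.tolist())  # x, y, z in metres
```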
Files

Name | Size | md5
---|---|---
nime2013_44.pdf | 2.4 MB | 6d9e5cee8f928582ad1f0dbbc161f343