Published May 31, 2023
| Version v1
Conference paper
Open
MuGeVI: A Multi-Functional Gesture-Controlled Virtual Instrument
Description
Most digital musical instruments still depend on dedicated hardware devices, which limits their accessibility and resource efficiency. In this paper, we propose MuGeVI, a new computer-vision-based, multi-functional interactive musical instrument that requires no additional hardware circuits or sensors and lets users create or play music through different hand gestures and positions. It first applies deep neural network models for hand keypoint detection to obtain gesture information, then maps that information to pitch, chords, or other parameters according to the current mode, passes the result to Max/MSP via the OSC protocol, and finally generates and processes MIDI or audio. MuGeVI currently offers four modes: performance mode, accompaniment mode, control mode, and audio effects mode, and it can be used with nothing more than a personal computer with a camera. Designed to be human-centric, MuGeVI is feature-rich, simple to use, affordable, scalable, and programmable, making it a genuinely frugal musical innovation. All material on this work can be found at https://yewlife.github.io/MuGeVI/.
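The pipeline described above ends with gesture data being sent to Max/MSP over OSC. As a rough illustration of that final leg, the sketch below encodes a minimal OSC message with only the standard library and maps a normalized fingertip height to a MIDI pitch. The address `/mugevi/note`, the port, and the pitch-mapping range are all hypothetical; the paper's actual mappings, and the deep-neural-network keypoint detection that produces the input, are not shown here.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *ints: int) -> bytes:
    """Encode an OSC message carrying int32 arguments (e.g. a MIDI note)."""
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "i" * len(ints)).encode())  # type-tag string
    for v in ints:
        msg += struct.pack(">i", v)  # int32, big-endian per the OSC spec
    return msg

def y_to_pitch(y: float, low: int = 48, high: int = 72) -> int:
    """Map a normalized fingertip height y in [0, 1] (0 = top of frame)
    to a MIDI pitch between `low` and `high` (hypothetical mapping)."""
    y = min(max(y, 0.0), 1.0)
    return low + round((1.0 - y) * (high - low))

def send_note(y: float, host: str = "127.0.0.1", port: int = 7400) -> None:
    """Send the pitch to Max/MSP listening on a UDP port (port is an assumption)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/mugevi/note", y_to_pitch(y)), (host, port))
    sock.close()
```

On the Max/MSP side, a `[udpreceive]` object paired with route/unpack objects would recover the address and integer argument; the exact patch used by MuGeVI is described in the paper, not here.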
Files
nime2023_75.pdf (1.3 MB)
md5:910162d9d9c0939f67273baa95003b97