
Published June 1, 2020 | Version v1
Conference paper | Open Access

AI-terity: Non-Rigid Musical Instrument with Artificial Intelligence Applied to Real-Time Audio Synthesis

Description

A deformable musical instrument can take numerous distinct shapes through its non-rigid features. Building an audio synthesis module for such interface behaviour can be challenging. In this paper, we present AI-terity, a non-rigid musical instrument that comprises a deep learning model with a generative adversarial network architecture and uses it to generate audio samples for real-time audio synthesis. The particular deep learning model used in this instrument was trained on an existing data set, as input for further experimentation. The main benefits of the model are its ability to reproduce the realistic range of timbres of the training data set and its ability to generate new audio samples in real time, in the moment of playing, with the characteristics of sounds the performer has never heard before. We argue that these intelligent features at the audio synthesis level could allow us to explore performing music with particular response features that define the instrument's digital idiomaticity, and allow us to reinvent the instrument in the act of music performance.
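For illustration only, the sketch below shows one way a pre-trained GAN generator could be sampled to produce new audio clips on demand, in the spirit of the real-time synthesis described above. The Generator class, its layer sizes, the latent dimension, the sample rate, and the checkpoint path are assumptions for the sketch; they are not details taken from the paper or its model.

```python
# Hypothetical sketch: drawing latent vectors and decoding them with a
# pre-trained GAN generator to obtain new audio samples at play time.
# The Generator architecture, LATENT_DIM, SAMPLE_RATE and the checkpoint
# path are illustrative assumptions, not the paper's actual model.
import numpy as np
import torch

LATENT_DIM = 128      # assumed size of the generator's latent input
SAMPLE_RATE = 16000   # assumed audio sample rate (one-second clips)


class Generator(torch.nn.Module):
    """Stand-in for a trained GAN generator mapping latent vectors to audio."""

    def __init__(self, latent_dim: int, n_samples: int = SAMPLE_RATE):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(latent_dim, 1024),
            torch.nn.ReLU(),
            torch.nn.Linear(1024, n_samples),
            torch.nn.Tanh(),  # keep audio in [-1, 1]
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)


def generate_audio(generator: torch.nn.Module, n_clips: int = 1) -> np.ndarray:
    """Sample random latent points and decode each into an audio clip."""
    with torch.no_grad():
        z = torch.randn(n_clips, LATENT_DIM)  # a new latent point per clip
        audio = generator(z)                  # shape: (n_clips, n_samples)
    return audio.cpu().numpy()


if __name__ == "__main__":
    gen = Generator(LATENT_DIM)
    # In practice, weights from a trained checkpoint would be loaded here:
    # gen.load_state_dict(torch.load("generator.pt"))
    clips = generate_audio(gen, n_clips=4)
    print(clips.shape)  # (4, 16000): four one-second clips at the assumed rate
```

Each call draws fresh latent vectors, so every generated clip is new while staying within the timbral range the generator learned from its training data; in a performance setting, this sampling step would run continuously or be triggered by the performer's gestures.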

Files (31.2 MB)

nime2020_paper65.mp4: 30.3 MB, md5:36583639ff4d50b15dc70edc1902a093
(unnamed file): 878.8 kB, md5:06b10e5c3eb9e54384e1cca8fc08eeac
(unnamed file): 15.2 kB, md5:edff6c3fb7440c223d15c1ebce099e1a

Additional details

Related works

Is part of: 2220-4806 (ISSN)