Published October 29, 2019 | Version v1
Video/Audio | Open Access

Overview Video of EMPI: The Embodied Musical Predictive Interface

  • Australian National University

Description

This video gives an overview of EMPI, the Embodied Musical Predictive Interface.

This interface allows constrained call-and-response interaction with a musical machine-learning model. The performer controls a simple synthesised sound with a single input lever; the model responds with the same sound through a second lever driven by a servo.
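
For illustration, here is a minimal sketch of such a call-and-response loop in Python. All names are hypothetical stand-ins (this is not EMPI's actual software), and a random walk substitutes for the real predictive model.

```python
# Hypothetical sketch of a call-and-response loop like the one described
# above; the real EMPI model and servo control are not part of this record.
import random
import time

def predict_next(history):
    """Stand-in for the ML model: returns a lever position in [0, 1].
    A real model would condition on the performer's recent gestures."""
    return min(1.0, max(0.0, history[-1] + random.uniform(-0.1, 0.1)))

def main():
    history = [0.5]                      # performer's lever positions so far
    for _ in range(10):                  # short demonstration run
        # "Call": pretend the performer nudges the input lever.
        pos = min(1.0, max(0.0, history[-1] + random.uniform(-0.2, 0.2)))
        history.append(pos)
        print(f"performer lever -> {pos:.2f}  (synth sounds)")

        # "Response": the model predicts a position; in hardware, a servo
        # would move the output lever there and the same synth would sound.
        model_pos = predict_next(history)
        print(f"model lever     -> {model_pos:.2f}  (servo moves, synth sounds)")
        time.sleep(0.1)

if __name__ == "__main__":
    main()
```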

We tested EMPI with three ML models: one trained on human-sourced data, one on a synthetic dataset, and one on a noise dataset as a control. We also tested EMPI with the motorized lever enabled and disabled.

This video shows example interactions in each of the six resulting conditions (three models × two lever settings).
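
The condition grid can be made concrete with a short enumeration; the labels below are illustrative, not the study's terminology.

```python
# Enumerate the six conditions: training-data source x motorized-lever state.
from itertools import product

models = ["human", "synthetic", "noise"]
lever_states = ["enabled", "disabled"]

conditions = list(product(models, lever_states))
for i, (model, lever) in enumerate(conditions, start=1):
    print(f"Condition {i}: {model} model, motorized lever {lever}")

assert len(conditions) == 6
```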

Files

One file, 77.1 MB (md5:a7c1283b76d2c22065d1b96ff4577afe)