Published February 26, 2026 | Version Perceptual_Learning
Software | Open Access

hesamazj/MusicianHand: Musician Hand's Repository

Authors/Creators

  • University of Southern California

Description

Project Overview

The "Musician Hand" project represents an end-to-end learning system designed to imbue a robot with a musical skill. Instead of explicitly programming the robot's movements for each note, we leverage principles of self-supervised learning:

1. Exploratory Babbling: The robot performs random, diverse actions (limb activations) for a short period (2 minutes).
2. Perception-Action Data Collection: During babbling, the system simultaneously records the robot's commanded actions and its sensory output (e.g., the sound produced by its movements).
3. Inverse Model Learning: A neural network is trained to learn the inverse dynamics, mapping desired musical outputs (e.g., specific note characteristics from a melody's spectrogram) back to the robot limb activations that produced those sounds during babbling.
4. Melody Playback (Deployment): Once the inverse model is trained, a target 4-note melody is fed into the model, which predicts the robot activations needed to reproduce the melody. These activations are sent to the robot in real time.

This approach simulates how biological systems learn motor skills through exploration and sensory feedback, enabling the robot to "understand" how its actions translate into musical outcomes.
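The babbling-to-playback pipeline above can be sketched in a few lines. The sketch below is a minimal, hypothetical simulation: `robot_sound` stands in for the real robot plus microphone, and a least-squares linear map stands in for the project's neural-network inverse model; all names, dimensions, and the forward model are illustrative assumptions, not the repository's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward model: 3 limb activations -> 8 sound features.
# The real system records actual audio; this is a stand-in for illustration.
W_true = rng.normal(size=(8, 3))

def robot_sound(activations):
    """Stand-in for the robot + microphone: activations -> sound features."""
    return W_true @ activations

# 1-2. Exploratory babbling with perception-action data collection:
# random commanded actions paired with the sounds they produce.
actions = rng.uniform(-1.0, 1.0, size=(200, 3))
sounds = np.array([robot_sound(a) for a in actions])

# 3. Inverse model learning: fit a map from sound features back to
# activations (least squares here, a neural network in the project).
M, *_ = np.linalg.lstsq(sounds, actions, rcond=None)

# 4. Melody playback: desired sound features for a 4-note melody
# are mapped to the activations predicted to reproduce them.
melody = np.array([robot_sound(rng.uniform(-1, 1, 3)) for _ in range(4)])
predicted_activations = melody @ M

# Replaying the predicted activations should reproduce the melody.
replayed = np.array([robot_sound(a) for a in predicted_activations])
print(np.allclose(replayed, melody, atol=1e-6))
```

Because the toy forward model is linear and the babbling actions span the activation space, the linear inverse recovers the melody exactly; the real system relies on a neural network to handle the nonlinear acoustics of the physical hand.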

Files (33.0 MB)

hesamazj/MusicianHand-Perceptual_Learning.zip (33.0 MB)
md5:a89372181b0f61dac230054a3a873b9c
