Published October 17, 2025 | Version v1
Dataset | Open

MUSMET - First version of multimodal dataset

Description

This dataset contains the first version of the multimodal data acquired for the MUSMET project, funded under the Horizon Europe Framework Programme (HORIZON) under grant agreement No 101184379. For further information about the project, visit https://musmet.eu/

Data were acquired from five groups of four musicians (20 musicians in total: 10 male and 10 female); each group played the same four instruments: piano, electric bass, electric guitar, and drums. Each group performed a total of eight pieces of music, each associated with one of four emotions (angry, sad, happy, and relaxed), so that every emotion appeared twice, in an order randomized across the groups.

The data within each .xdf file are stored according to the conventions of the Lab Streaming Layer (LSL), an open-source framework for collecting time-series data streams with synchronized timestamps. The synchronized streams comprise audio data and EEG signals; the .xdf file is thus the repository for all synchronized, multimodal data generated by the Recording Suite. For each recording, two streams are captured for each of the four musicians, resulting in eight streams collected into the .xdf file.
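For orientation, below is a minimal sketch of how such a recording might be inspected in Python, assuming the open-source pyxdf reader and a hypothetical file name (the actual file names inside the archive may differ):

    # Minimal sketch: enumerate the synchronized streams in one recording.
    # Assumes the pyxdf package; "group01_piece01.xdf" is a hypothetical name.
    import pyxdf

    # load_xdf returns a list of stream dicts plus the file header.
    streams, header = pyxdf.load_xdf("group01_piece01.xdf")

    for stream in streams:
        info = stream["info"]
        name = info["name"][0]                   # stream name as set by the Recording Suite
        stype = info["type"][0]                  # LSL stream type, e.g. "Audio" or "EEG"
        srate = float(info["nominal_srate"][0])  # nominal sampling rate in Hz
        n_samples, n_channels = stream["time_series"].shape
        print(f"{name} ({stype}): {n_channels} ch @ {srate:g} Hz, {n_samples} samples")

        # time_stamps lie on a shared LSL clock, which is what keeps the
        # eight streams in the file synchronized with one another.
        timestamps = stream["time_stamps"]
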

Files (22.9 GB)

MUSMET - First version of multimodal dataset.zip (22.9 GB)
md5:509b188403e8d9c3d52a547fa9c2cac2

Additional details

Funding

European Commission
MUSMET - Musical Metaverse made in Europe: an innovation lab for musicians and audiences of the future (grant agreement No 101184379)