Published July 29, 2024 | Version v1
Dataset · Open Access

EMOPIA+

  • University of California, San Diego
  • National Taiwan University

Description

EMOPIA+ is a dataset published with the paper "Emotion-driven Piano Music Generation via Two-stage Disentanglement and Functional Representation". It extends EMOPIA, a multi-modal (audio and MIDI) database focusing on perceived emotion in pop piano music. To support two-stage generation, we extracted lead sheets (melody + chord progressions) from the original MIDI files and manually corrected the key signatures of 367 clips. The dataset comprises 1,071 processed MIDI files along with their event-based representations: functional representation and REMI.

For more details about the extraction algorithms and the event-generation process, see the EMO-Disentanger repository.
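
The processed clips are ordinary MIDI files, so any general-purpose MIDI parser can inspect them. The sketch below uses miditoolkit, which is our choice rather than a stated dependency of the dataset; the midis/*.mid glob pattern is an assumption about the extracted archive layout.

    import glob

    import miditoolkit

    # Assumes EMOPIA+.zip has been extracted into the working directory;
    # the exact clip file names inside midis/ are not assumed.
    clip_path = sorted(glob.glob("midis/*.mid"))[0]
    midi = miditoolkit.midi.parser.MidiFile(clip_path)

    # Each processed clip carries four named tracks (see File Description).
    print(clip_path)
    for track in midi.instruments:
        print(f"{track.name}: {len(track.notes)} notes")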

File Description

  • midis/: Processed MIDI clips, each containing four tracks: Melody, Texture, Bass, and Chord. The first three tracks can be merged to reconstruct the original MIDI files (see the sketch after this list).
  • adjust_keyname.json: Manually adjusted key signatures.
  • functional/: Functional representations of the MIDI files.
    • performance/: Events for performance, including Melody, Texture and Bass tracks.
    • lead_sheet/: Events for lead sheet, including Melody and Chord tracks.
    • lead_sheet_to_performance/: Events for both lead sheet and performance, facilitating conditional music generation. 
  • REMI/: REMI representations of the MIDI files, with the same directory structure as functional/.
  • split/: train/val/test splits based on the instructions in EMOPIA.
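
To make the track-merging note in the midis/ entry concrete, here is a hedged sketch that drops the Chord annotation track from each clip and writes the remaining Melody, Texture, and Bass tracks back out as a single performance file. The exact track names and the .mid extension are assumptions based on the description above.

    import glob
    import os

    import miditoolkit

    # Tracks that make up the original performance; Chord is a
    # lead-sheet annotation layer rather than performed material.
    PERFORMANCE_TRACKS = {"Melody", "Texture", "Bass"}

    os.makedirs("reconstructed", exist_ok=True)
    for clip_path in sorted(glob.glob("midis/*.mid")):
        midi = miditoolkit.midi.parser.MidiFile(clip_path)
        midi.instruments = [t for t in midi.instruments
                            if t.name in PERFORMANCE_TRACKS]
        midi.dump(os.path.join("reconstructed", os.path.basename(clip_path)))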

Citation

@inproceedings{EMOPIA,
  author = {Hung, Hsiao-Tzu and Ching, Joann and Doh, Seungheon and Kim, Nabin and Nam, Juhan and Yang, Yi-Hsuan},
  title = {{EMOPIA}: A Multi-Modal Pop Piano Dataset For Emotion Recognition and Emotion-based Music Generation},
  booktitle = {Proceedings of the International Society for Music Information Retrieval Conference, {ISMIR}},
  year = {2021}
}

@inproceedings{emodisentanger2024,
  author = {Huang, Jingyue and Chen, Ke and Yang, Yi-Hsuan},
  title = {Emotion-driven Piano Music Generation via Two-stage Disentanglement and Functional Representation},
  booktitle = {Proceedings of the International Society for Music Information Retrieval Conference, {ISMIR}},
  year = {2024}
}

Files

EMOPIA+.zip (14.8 MB)
md5:821f6bb615053ac2e7171922198b58e2

Additional details

Related works

Is derived from
Dataset: 10.5281/zenodo.5090630 (DOI)