Dataset Open Access
Martínez Ramírez, Marco A.; Benetos, Emmanouil; Reiss, Joshua D.
Accompanying audio samples for the paper:
Martínez Ramírez M. A., Benetos, E. and Reiss J. D., “Modeling plate and spring reverberation using a DSP-informed deep neural network” in the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Barcelona, Spain, May 2020.
Dry and wet bass and guitar recordings.
Dry bass and guitar notes are taken from the IDMT-SMT-Audio-Effects dataset. Author: Michael Stein (Fraunhofer IDMT). https://www.idmt.fraunhofer.de/en/business_units/m2d/smt/audio_effects.html
Plate Reverb - Bass - recordings are taken from the IDMT-SMT-Audio-Effects dataset. The plate settings are as follows:
Spring Reverb - Bass and Guitar - recorded with the Accutronics 4EB2C1B spring reverb tank: 'Dry Mix - 0%', 'Wet Mix - 100%'.
The plate reverb samples were rendered with a VST audio plug-in, while the spring reverb samples were recorded from an analog reverb tank built around two springs placed in parallel.
The recordings are downsampled to 16 kHz. Since the plate reverb samples have a fade-out applied over their last 0.5 seconds, the same fade-out is applied to the spring reverb samples so that both sets match.
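The preprocessing described above (downsampling to 16 kHz, then a 0.5-second fade-out at the end) can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline: the source sample rate (assumed 44.1 kHz here) and the fade-out curve (assumed linear) are not specified in the description.

```python
import numpy as np
from scipy.signal import resample_poly

SR_IN = 44100    # assumed source sample rate (not stated in the dataset description)
SR_OUT = 16000   # target rate stated in the description
FADE_SECONDS = 0.5

def preprocess(x, sr_in=SR_IN, sr_out=SR_OUT, fade=FADE_SECONDS):
    """Downsample to sr_out and apply a fade-out over the last `fade` seconds."""
    # Polyphase resampling: 44.1 kHz -> 16 kHz
    y = resample_poly(x, sr_out, sr_in)
    # Linear fade-out ramp over the final 0.5 s (actual curve shape is an assumption)
    n_fade = int(fade * sr_out)
    y[-n_fade:] *= np.linspace(1.0, 0.0, n_fade)
    return y

# Toy example: a 2-second, 440 Hz sine at the assumed source rate
t = np.arange(2 * SR_IN) / SR_IN
wet = preprocess(np.sin(2 * np.pi * 440.0 * t))
```

After processing, `wet` is 2 seconds long at 16 kHz (32000 samples) and its final sample is silent due to the fade-out.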