

Description:
ReverbFX is a dataset of 1,846 room impulse responses (RIRs) designed to advance research in singing voice dereverberation, with a focus on the artificial reverberation used in music production. Unlike conventional RIR datasets recorded in real acoustic spaces or simulated via physical models, ReverbFX is derived exclusively from a diverse collection of professional reverb audio effect plugins: Protoverb, SkyNet, TAL-Reverb-4, and Valhalla Supermassive. Each RIR was generated by processing a Dirac impulse through one of these plugins using factory and custom presets, with randomized parameters for further diversity. The resulting RIRs cover a wide range of decay times (RT60 spanning 0.31 to 52.08 seconds) and reverberation characteristics, capturing the creative breadth of artificial reverbs encountered in contemporary music production. A stringent validation protocol was applied to ensure data quality, checking numerical integrity, DC offset, signal energy, plausible RT60 range, envelope decay, and early energy concentration.

Key Features:
- 1,846 plugin-derived RIRs
- Four widely used reverb plugins (see paper for details)
- Wide range of reverberation times and characteristics
- Fully open metadata and links to the corresponding plugin settings
- Uniform audio specifications across all files:
  - Duration: 10 seconds
  - Channels: stereo
  - Sample rate: 48 kHz
  - Bit depth: 32-bit float
- Designed for benchmarking and developing dereverberation and audio enhancement models for creative scenarios

Data Structure:
The dataset is organized as follows:

ReverbFX
├── LICENCE.txt
├── meta
│   ├── test.csv
│   ├── train.csv
│   └── valid.csv
├── test
│   ├── Protoverb_100_03.wav
│   ├── Protoverb_105_00.wav
│   ...
│   └── Valhalla_WideVocalSwell_10.wav
├── train
│   ├── Protoverb_100_01.wav
│   ├── Protoverb_100_02.wav
│   ...
│   └── Valhalla_WideVocalSwell_14.wav
└── valid
    ├── Protoverb_103_01.wav
    ├── Protoverb_106_04.wav
    ...
    └── Valhalla_WideVocalSwell_04.wav

- LICENCE.txt: license information
- meta/: metadata and split definitions (CSV files for train/valid/test)
- train/, valid/, test/: folders containing the RIRs as WAV files, named by plugin and preset
- All WAV files: 10 seconds, stereo, 48 kHz sample rate, 32-bit float

Use Cases:
- Training and evaluation of machine learning models for dereverberation
- Benchmarking source separation and audio enhancement technologies in artificial reverb scenarios
- Research in music information retrieval and creative audio processing

License:
Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0). Non-commercial research and educational use only. For details, see LICENCE.txt.

Attribution:
Please cite the original paper in any derivative works or publications:
Julius Richter, Till Svajda, Timo Gerkmann, "ReverbFX: A Dataset of Room Impulse Responses Derived from Reverb Effect Plugins for Singing Voice Dereverberation," ITG Conference on Speech Communication, Berlin, Germany, Sept. 2025.
Please also acknowledge the original plugin developers.

Contact:
For questions or collaboration inquiries, contact {julius.richter,timo.gerkmann}@uni-hamburg.de.

Link:
Project website: https://sp-uhh.github.io/reverbfx
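
The validation protocol above checks, among other things, that each RIR's RT60 falls in a plausible range. The exact method used by the authors is not restated here; a common estimator for this kind of check (Schroeder backward integration with a line fit on the -5 dB to -25 dB decay, extrapolated to -60 dB) can be sketched as:

```python
import numpy as np

def estimate_rt60(rir, sr, db_lo=-5.0, db_hi=-25.0):
    """Estimate RT60 via Schroeder backward integration (T20 line fit)."""
    energy = np.asarray(rir, dtype=np.float64) ** 2
    edc = np.cumsum(energy[::-1])[::-1]        # energy decay curve
    edc_db = 10.0 * np.log10(edc / edc[0])     # normalized, in dB
    # Fit a line to the -5 dB .. -25 dB portion of the decay
    idx = np.where((edc_db <= db_lo) & (edc_db >= db_hi))[0]
    slope, _ = np.polyfit(idx / sr, edc_db[idx], 1)
    return -60.0 / slope                       # time for a 60 dB drop

# Synthetic exponentially decaying RIR with a known RT60 of 1.0 s
sr = 48000
t = np.arange(2 * sr) / sr
rir = np.exp(-6.9078 * t)                      # 60 dB energy drop per second
rt60 = estimate_rt60(rir, sr)                  # ≈ 1.0 s for this synthetic RIR
```

For a stereo RIR from the dataset, the estimate would typically be computed per channel (or on the channel mean) after loading the WAV file.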
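
For the dereverberation use case, a reverberant/dry training pair is obtained by convolving a dry vocal with an RIR. A minimal numpy sketch of that step (the soundfile call in the comment is illustrative only, not part of the dataset):

```python
import numpy as np

def apply_rir(dry, rir):
    """FFT-convolve a mono dry signal with one RIR channel, peak-normalized."""
    n = len(dry) + len(rir) - 1
    nfft = 1 << (n - 1).bit_length()           # next power of two >= n
    wet = np.fft.irfft(np.fft.rfft(dry, nfft) * np.fft.rfft(rir, nfft), nfft)[:n]
    peak = np.max(np.abs(wet))
    return wet / peak if peak > 0 else wet

# In practice the stereo RIRs would be read with e.g. soundfile:
#   rir, sr = soundfile.read("train/Protoverb_100_01.wav")  # (480000, 2), 48 kHz
# and each of the two channels convolved with the dry vocal separately.
```

Since the RIRs are 10 s long at 48 kHz, FFT-based convolution is strongly preferable to direct time-domain convolution here.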
reverberation, room impulse response, audio effect plugins, dereverberation, reverb
