Published July 2, 2020 | Version v1
Dataset | Open Access

Spherical Headgear HRIR Compilation of the Neumann KU100 and the HEAD acoustics HMS II.3

  • 1. TH Köln, Institute of Communications Engineering, Cologne, Germany
  • 2. TH Köln, Institute of Communications Engineering, Cologne, Germany; TU Berlin, Audio Communication Group, Berlin, Germany.
  • 3. TH Köln, Institute of Media and Imaging Technology, Cologne, Germany


[1] C. Pörschmann, J. M. Arend, and R. Gillioz, “How wearing headgear affects measured head-related transfer functions,” in Proceedings of the EAA Spatial Audio Signal Processing Symposium, 2019, pp. 49–54.

Files are also available in SOFA file format at:


The spatial representation of sound sources is an essential element of virtual acoustic environments (VAEs). When determining the direction of sound incidence, the human auditory system evaluates monaural and binaural cues, which are caused by the shape of the pinna and the head. While spectral information is the most important cue for the elevation of a sound source, differences between the signals reaching the left and the right ear are used for lateral localization. These binaural differences manifest in interaural time differences (ITDs) and interaural level differences (ILDs). In many headphone-based VAEs, head-related transfer functions (HRTFs) are used to describe the sound incidence from a source to the left and right ear, thus integrating both monaural and binaural cues. Specific aspects, such as the individual shape of the head and the outer ears (e.g. Bomhardt, 2017), of the torso (Brinkmann et al., 2015), and probably even of headgear (Wersenyi, 2005; Wersenyi, 2017), influence the HRTFs and thus probably localization and other perceptual attributes as well. Generally speaking, spatial cues are modified by headgear, for example a baseball cap, a bicycle helmet, or a head-mounted display, which is nowadays often used in VR applications. In many real-life situations, however, good localization performance is important when wearing such items, e.g. to detect approaching vehicles when cycling. Furthermore, when performing psychoacoustic experiments in mixed-reality applications using head-mounted displays, the influence of the head-mounted display on the HRTFs must be considered. Effects of an HTC Vive head-mounted display on localization performance have already been shown by Ahrens et al. (2018). To analyze the influence of headgear for varying directions of incidence, measurements of HRTFs on a dense spherical sampling grid are required.
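To illustrate how the two binaural cues mentioned above can be estimated from a pair of head-related impulse responses (HRIRs), here is a minimal sketch in Python. It uses interaural cross-correlation for the ITD and a broadband RMS level ratio for the ILD; the sampling rate and the synthetic HRIRs are assumptions chosen for demonstration, not values taken from this dataset.

```python
import numpy as np

FS = 48000  # sampling rate in Hz (assumed for this sketch)

def itd_ild(hrir_left, hrir_right, fs=FS):
    """Estimate the ITD (lag of the interaural cross-correlation maximum)
    and the broadband ILD (RMS level ratio in dB) from one HRIR pair."""
    # ITD: lag at which the right-ear signal best matches the left-ear signal
    corr = np.correlate(hrir_right, hrir_left, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(hrir_left) - 1)
    itd = lag / fs  # seconds; positive = sound arrives at the left ear first
    # ILD: level difference between the ears in dB (positive = left ear louder)
    ild = 20 * np.log10(np.sqrt(np.mean(hrir_left ** 2)) /
                        np.sqrt(np.mean(hrir_right ** 2)))
    return itd, ild

# Synthetic example: right ear delayed by 24 samples (0.5 ms) and 6 dB quieter
pulse = np.zeros(256)
pulse[10] = 1.0
left = pulse
right = np.roll(pulse, 24) * 10 ** (-6 / 20)
itd, ild = itd_ild(left, right)  # itd = 0.0005 s, ild = 6.0 dB
```

Real HRIRs are band-limited and reverberation-free but not ideal impulses, so in practice the cross-correlation is often restricted to low-pass-filtered signals below about 1.5 kHz, where the ITD cue dominates.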
However, HRTF measurements of a dummy head with various kinds of headgear are still rare; to our knowledge, only one dataset, measured for an HTC Vive on a sparse grid with 64 positions, is freely accessible (Ahrens, 2018). This work presents high-density HRTF measurements of a Neumann KU100 and a HEAD acoustics HMS II.3 dummy head, each equipped with either a bicycle helmet, a baseball cap, an Oculus Rift head-mounted display, or a pair of extra-aural AKG K1000 headphones. For the measurements, we used the VariSphear measurement system (Bernschütz, 2010), which allows precise positioning of the dummy head at the spatial sampling positions. The HRTF sets were captured on a full spherical Lebedev grid with 2702 points. In our study, we analyze the measured datasets in terms of their spectra, their binaural cues, and their localization performance as predicted by localization models, and compare the results to reference measurements of the dummy heads without headgear. The results show that the differences to the reference vary significantly depending on the type of headgear. Regarding the ITDs and ILDs, the analysis reveals the strongest influence for the AKG K1000. While for the Oculus Rift head-mounted display the ITDs and ILDs are mainly affected for frontal directions, only a very weak influence of the bicycle helmet and the baseball cap on ITDs and ILDs was observed. For the spectral differences to the reference, the results show the largest deviations for the AKG K1000 and the smallest for the Oculus Rift and the baseball cap. Furthermore, we analyzed for which incidence directions the spectrum is influenced most by the headgear. For the Oculus Rift and the baseball cap, the strongest deviations were found for contralateral sound incidence. For the bicycle helmet, the most affected directions are also contralateral, but shifted upwards in elevation.
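A per-frequency spectral difference of the kind analyzed above can be sketched as follows: the magnitude spectra of a reference HRIR (no headgear) and a headgear HRIR are compared in dB. The function and the synthetic signals are illustrative assumptions, not the analysis code used for this dataset.

```python
import numpy as np

def spectral_difference_db(hrir_ref, hrir_headgear, n_fft=512):
    """Per-bin magnitude difference in dB between an HRIR measured with
    headgear and a reference HRIR measured without it."""
    H_ref = np.fft.rfft(hrir_ref, n_fft)
    H_hg = np.fft.rfft(hrir_headgear, n_fft)
    eps = 1e-12  # guard against log of zero
    return 20 * np.log10((np.abs(H_hg) + eps) / (np.abs(H_ref) + eps))

# Synthetic example: the "headgear" response is the reference attenuated by 3 dB,
# so the difference is about -3 dB in every frequency bin
ref = np.random.default_rng(0).standard_normal(256)
hg = ref * 10 ** (-3 / 20)
diff = spectral_difference_db(ref, hg)
```

For a full dataset, such a difference would be evaluated per sampling direction and ear, which is how direction-dependent deviations (e.g. the contralateral effects described above) can be located.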
Finally, the AKG K1000 headphones generally have the strongest influence on the measured HRTFs, which becomes maximal for sound incidence from behind. The results of this study are relevant for applications in which headgear is worn and localization or other aspects of spatial hearing are considered. This could be the case, for example, in mixed-reality applications where natural sound sources are presented while the listener wears a head-mounted display, or when investigating localization performance in situations where headgear is used, e.g. in sports activities. Beyond that, an important aim of this study is to provide a freely available database of HRTF sets that is well suited for auralization purposes and allows further investigation of the influence of headgear on auditory perception. The HRTF sets are publicly available in the SOFA format under a Creative Commons CC BY-SA 4.0 license.


Christoph Pörschmann
TH Köln - University of Applied Sciences
Institute of Communications Engineering
Department of Acoustics and Audio Signal Processing
Betzdorfer Str. 2, D-50679 Cologne, Germany







Files (243.3 MB)


Additional details

Related works

Is supplement to
Conference paper: 10.25836/sasp.2019.27 (DOI)

