Subjective Test Dataset and Meta-data-based Models for 360° Streaming Video Quality
- 1. Technische Universität Ilmenau
- 2. Huawei Technologies Co. Ltd.
Description
In recent years, the number of 360° videos available for streaming has rapidly increased, leading to the
need for 360° streaming video quality assessment. In this paper, we report and publish the results of three subjective 360° video
quality tests, with conditions chosen to reflect real-world bitrates and resolutions including 4K, 6K and 8K, resulting in 64 stimuli
each for the first two tests and 63 for the third. As playout device, we used the HTC Vive for the first test and the HTC Vive Pro
for the remaining two. Video-quality ratings were collected using the 5-point Absolute Category Rating scale. The 360°
dataset provided with the paper contains links to the source videos used, the raw subjective scores, video-related meta-data,
head-rotation data and Simulator Sickness Questionnaire results per stimulus and per subject, to enable reproducibility of the
reported results. Moreover, we use our dataset to compare the performance of state-of-the-art full-reference quality metrics such
as VMAF, PSNR, SSIM, ADM2, WS-PSNR and WS-SSIM. Of all the metrics, VMAF showed the highest correlation
with the subjective scores. Further, we evaluated a center-cropped version of VMAF ("VMAF-cc"), which was found to perform
similarly to the full VMAF. In addition to the dataset and the objective metric evaluation, we propose two new video-quality
prediction models: a bitstream meta-data-based model and a hybrid no-reference model that uses the bitrate, resolution and pixel
information of the video as input. The new lightweight models provide performance similar to that of the full-reference models while
enabling fast computation.
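The correlation between objective metric scores and subjective scores mentioned above is typically quantified with the Pearson linear correlation coefficient (PLCC) and the Spearman rank-order correlation coefficient (SROCC). As a minimal sketch of how such an evaluation can be run against this dataset, the snippet below computes both coefficients; the per-stimulus values shown are hypothetical placeholders, not taken from the dataset.

```python
from math import sqrt

def pearson(x, y):
    """Pearson linear correlation coefficient (PLCC)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

def spearman(x, y):
    """Spearman rank-order correlation (SROCC), assuming no tied values."""
    rank = lambda v: [sorted(v).index(e) for e in v]
    return pearson(rank(x), rank(y))

# Hypothetical per-stimulus metric scores and mean opinion scores (MOS),
# for illustration only -- not values from the published dataset.
vmaf = [35.2, 48.7, 61.3, 72.9, 84.1, 91.5]
mos = [1.8, 2.4, 3.1, 3.7, 4.2, 4.6]

print(f"PLCC = {pearson(vmaf, mos):.3f}")
print(f"SROCC = {spearman(vmaf, mos):.3f}")
```

In practice, the PLCC is usually computed after fitting a monotonic (e.g. logistic) mapping from metric scores to MOS; the raw linear correlation above is the simplest variant.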
Files
360_streaming_video_quality_dataset.zip (534.3 MB)
md5:5de1046aabf589f0ca3a18a9974d7f62