TU Delft i_QoE
Description
Recently, a lot of effort has been devoted to estimating users’ Quality of Experience (QoE) in order to optimize video delivery. So far, subjective (and, as a consequence, objective) QoE assessment has mostly been associated with the perceptual quality of the video (i.e., asking users to self-report their perceptual satisfaction with a set of multimedia contents), neglecting other QoE aspects, such as enjoyment or endurability. In addition, it is common practice in QoE research to target the average of the user opinion scores gathered from subjective tests (also known as the Mean Opinion Score, MOS). This approach does not take into account the dependency of QoE on individual characteristics, such as gender or personality, which are also known to influence users’ QoE. As a result, there is a lack of publicly available QoE datasets (1) targeting aspects of QoE other than perceptual quality and (2) providing individual subjective QoE ratings together with the corresponding individual user characteristics (e.g., gender, personality). We present here a dataset that addresses the above issues.

The i_QoE dataset can be used for analyzing individual Quality of Experience in its many aspects, beyond perceptual quality alone. It is based on an empirical study; for a full description of the experiment and its setup, please refer to the paper referenced below. Six videos covering three genres (comedy, sports, education) were each encoded at two bitrate levels (600 kbps and 2000 kbps) to enforce different perceived quality levels. Sixty participants were involved in this study. They were split into two disjoint groups: the 30 participants of one group watched all videos by themselves, whereas the 30 participants of the other group watched all videos with two friends (i.e., in groups of three; interaction among them was allowed).

Before starting the experiment, participants were asked to fill in questionnaires covering personal information, such as their a priori interest in the video genres they were about to see, their immersive tendency, nationality, gender, and personality. After filling in the personal data questionnaire, participants watched one of the two versions (600 or 2000 kbps) of each of the six videos. Bitrate levels as well as video order were counterbalanced across participants, to guarantee that each participant experienced a similar range of perceived quality and to avoid fatigue and learning effects. For each video, participants scored their QoE in terms of enjoyment, endurability, satisfaction, involvement, and perceived visual quality via a QoE questionnaire. All aspects except perceived quality were quantified by means of four questions each, answered on a 7-point Likert scale. Perceived video quality was instead measured through two questions (one on the annoyance of artifacts and one on the overall video quality), rated on a 5-point scale according to the ITU-R BT.500 (2002) specification.
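To illustrate the difference between the usual MOS-based analysis and the individual-level analysis this dataset enables, the sketch below averages the four Likert items of one aspect (enjoyment) into a per-participant score, then computes the MOS per video and bitrate while also keeping the individual scores. This is a minimal sketch only: the column names, file layout, and sample values are assumptions for illustration and do not reflect the actual structure of the i_QoE archive.

```python
import pandas as pd

# Hypothetical long-format ratings: one row per (participant, video),
# with the four enjoyment items rated on a 7-point Likert scale.
# In practice these rows would be read from the dataset files.
ratings = pd.DataFrame({
    "participant":  [1, 2, 3, 4, 5, 6],
    "video":        ["comedy_1"] * 6,
    "bitrate_kbps": [600, 600, 600, 2000, 2000, 2000],
    "enjoyment_q1": [4, 3, 5, 6, 5, 7],
    "enjoyment_q2": [4, 2, 5, 5, 6, 6],
    "enjoyment_q3": [3, 3, 4, 6, 5, 7],
    "enjoyment_q4": [5, 3, 5, 6, 6, 6],
})

# Each aspect (except perceived quality) is the mean of its four items.
item_cols = ["enjoyment_q1", "enjoyment_q2", "enjoyment_q3", "enjoyment_q4"]
ratings["enjoyment"] = ratings[item_cols].mean(axis=1)

# MOS collapses individual opinions into one average per condition ...
mos = ratings.groupby(["video", "bitrate_kbps"])["enjoyment"].mean()

# ... whereas individual scores can be kept and related to user
# characteristics such as gender or personality.
individual = ratings.set_index(["participant", "video", "bitrate_kbps"])["enjoyment"]

print(mos)
print(individual)
```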
Access
The database can be downloaded at the link below. The files are password protected; to obtain the password, please contact Yi Zhu (Y.Zhu-1@tudelft.nl). Link: http://ii.tudelft.nl/iqlab/databases/i_QoE.zip
References and Citation
If you use this dataset, we kindly request that you cite the following paper [YHR15] in any published work.
References
YHR15: Zhu, Yi, Ingrid Heynderickx, and Judith A. Redi. "Understanding the role of social context and user factors in video Quality of Experience." Computers in Human Behavior 49 (2015): 412-426.