GSET Somi: A Game-Specific Eye Tracking Dataset for Somi

Author: University of Ottawa

Partner: No

Contact: In case of any problems or questions, please send an email to Shervin Shirmohammadi (shervin@ieee.org).


Total: 135

SRC: 135

Ratings: 84

Resolution: HD

Method: Custom

Description

An eye tracking dataset of computer game players who played the side-scrolling cloud game Somi. The game was streamed as video from the cloud to the player. This dataset can be used for designing and testing game-specific visual attention models. The source code of the game is also available to facilitate further modifications and adjustments. To collect this data, male and female participants were asked to play the game in front of a remote eye-tracking device. For each player, we recorded gaze points, video frames of the gameplay, and mouse and keyboard commands. For each video frame, a list of its game objects with their locations and sizes was also recorded. This data, synchronized with the eye-tracking data, allows one to calculate the amount of attention that each object or group of objects draws from each player. As a benchmark, we also show that various attention patterns can be identified among players.
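As a minimal sketch of how the synchronized data can be used, the snippet below counts how many gaze samples fall inside each object's bounding box. The field layout (frame id, gaze x/y, per-frame object lists with position and size) is an illustrative assumption, not the dataset's actual file schema.

```python
# Hypothetical sketch: per-object attention from synchronized gaze samples
# and per-frame object annotations. Record layouts here are assumptions.

def attention_per_object(gaze_samples, frame_objects):
    """Count gaze samples landing inside each object's bounding box.

    gaze_samples: list of (frame_id, x, y) gaze points.
    frame_objects: dict mapping frame_id -> list of
        (object_name, left, top, width, height) annotations.
    Returns: dict mapping object_name -> number of gaze hits.
    """
    counts = {}
    for frame_id, x, y in gaze_samples:
        for name, left, top, w, h in frame_objects.get(frame_id, []):
            if left <= x < left + w and top <= y < top + h:
                counts[name] = counts.get(name, 0) + 1
    return counts

# Toy example with two frames and two objects.
gaze = [(0, 100, 120), (0, 400, 300), (1, 105, 118)]
objects = {
    0: [("player", 90, 100, 50, 50), ("enemy", 380, 280, 60, 60)],
    1: [("player", 95, 100, 50, 50)],
}
print(attention_per_object(gaze, objects))  # {'player': 2, 'enemy': 1}
```

Normalizing these counts by the total number of gaze samples per player gives a per-object attention ratio that can be compared across players or object groups.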

Access

Please note that the complete dataset is about 880 GB, although you can download individual sessions (about 5 GB each) separately. If you choose to download the dataset, you can do so from here: http://www.discover.uottawa.ca/images/files/external/GSET/Somi/

License

The dataset is free to use as long as: (1) it is used for non-commercial purposes, AND (2) in any resulting publication, reference is given to the dataset by citing the following paper [ATM16].

References and Citation

If you use the GSET Somi dataset in your research, we kindly ask you to reference the following paper [ATM16].

References

  • ATM16: Ahmadi, H., Tootaghaj, S.Z., Mowlaei, S., Hashemi, M.R., Shirmohammadi, S., "GSET Somi: A Game-Specific Eye Tracking Dataset for Somi", Proceedings of the 7th International Conference on Multimedia Systems (MMSys 2016).