Saccade Transition Datasets (COGAIN2026)
Authors/Creators
Description
About this repository
This data repository contains multiple datasets derived from an eye-tracking study with 121 participants. The datasets contain differently encoded saccade transitions and were used in the publication "Predicting User Perception based on Stimuli-Independent Saccade Transitions", published at the ACM Symposium on Eye Tracking Research & Applications (ETRA2026). In the study, participants interacted with six different websites. Each participant was first assigned a specific task and given up to three minutes to locate the required information. After completing the task, participants could freely browse the website for the remaining time. The goal of the study was to analyze user interaction behavior and to investigate how saccade transition patterns can be used to predict users' perceived usability and user experience (UX) ratings independently of the presented stimulus. Further methodological details and the experimental design can be found in the associated publication (see section Conference below).
Abstract
Scanpaths and eye movements provide insight into how users perceive and interact with digital products. However, most studies assess user states using stimulus-dependent metrics, like fixations or areas of interest (AOIs). This user study examines whether stimulus-independent saccadic transitions — gaze movements not tied to predefined stimulus elements — carry predictive information about user states and experiences. To do so, eye-tracking data and perceived usability and UX ratings were obtained from 121 participants interacting with six websites. Saccadic transitions were extracted from the scanpaths and analyzed using machine learning models to identify transition patterns predictive of the user ratings. Results show that models predict perceived usability and UX most accurately when saccade transitions are grouped into the eight inter-cardinal directions and further differentiated by median saccade length. This demonstrates that even brief, often-overlooked gaze shifts within the stimulus can provide valuable insight into how users perceive websites.
Data Explanation
This repository includes two main types of files:
- encoded saccade transition datasets used for machine learning, and
- a PDF file illustrating the encoding schemes (shown only up to 16 directional groups for readability).
Datasets
Dataset files follow the naming convention ETRA2026_NGRAM_{sections}_{threshold}_Linear, where
- {sections} is the number of directional divisions used in the encoding scheme and
- {threshold} is the percentile threshold applied to differentiate between short and long saccades.
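Assuming both placeholders are numeric (the concrete values in the example below are illustrative, not taken from the repository's file list), the two fields can be recovered from a file name with a small helper like this:

```python
import re

def parse_dataset_name(filename):
    """Parse {sections} and {threshold} from a file name following the
    ETRA2026_NGRAM_{sections}_{threshold}_Linear convention.
    Hypothetical helper, not part of the repository itself."""
    match = re.match(r"ETRA2026_NGRAM_(\d+)_(\d+)_Linear", filename)
    if match is None:
        raise ValueError(f"unexpected file name: {filename}")
    sections, threshold = map(int, match.groups())
    return sections, threshold

# Example: 8 directional sections, 50th-percentile length threshold
print(parse_dataset_name("ETRA2026_NGRAM_8_50_Linear"))  # (8, 50)
```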
Encoding Scheme
All four encoding schemes are based on the approach proposed by Bulling et al. (2010), which is described in detail in the associated publication (see section Conference below). Following this encoding scheme, saccades are first categorized into a predefined number of directional groups. Within each group, saccades are further distinguished by their length. The file "Encoding_Schemes.pdf" provides a visual overview of the four encoding variants, which differ in the number of directional sections.
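The general idea of a direction-plus-length encoding can be sketched as follows. This is an illustrative implementation of the Bulling-style scheme with eight directional sections, not the repository's exact alphabet, section boundaries, or threshold:

```python
import math

def encode_saccade(dx, dy, median_length, n_sections=8):
    """Bin a saccade's direction into n_sections equal angular groups
    and append a length flag (short/long relative to the median).
    Symbol scheme here is illustrative only."""
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    section_width = 360.0 / n_sections
    # Shift by half a section so section 0 is centered on 0 degrees
    section = int(((angle + section_width / 2) % 360.0) // section_width)
    length = math.hypot(dx, dy)
    length_flag = "L" if length > median_length else "S"
    return f"{section}{length_flag}"

# Rightward short saccade vs. upward long saccade (8 sections)
print(encode_saccade(10, 0, median_length=50))   # "0S"
print(encode_saccade(0, 120, median_length=50))  # "2L"
```

Sequences of such symbols can then be counted as n-grams to form the stimulus-independent feature vectors used for the machine learning models.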
Machine Learning Labels and Metadata
Each dataset contains the following columns:
- ParticipantID: Identifier of the participant (e.g., P001)
- Stimulus: Indicates the website associated with the recorded saccade transitions. Possible values include DAV1, DAV2, DAV3, Water1, Water2, and Water3, referring to three websites each from the German Alpine Association (DAV) and German water providers.
Labels:
The labels are derived from short versions of the User Experience Questionnaire (UEQ) and AttrakDiff, which measure perceived usability and user experience. Specifically:
- Pragmatic Quality (PQ), which reflects perceived usability, and
- Hedonic Quality (HQ), which reflects perceived user experience.
These are provided in four separate columns:
- LABEL: UEQ PQ
- LABEL: UEQ HQ
- LABEL: AttrakDiff PQ
- LABEL: AttrakDiff HQ
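Assuming the datasets are delimited text files with the columns listed above (the two data rows below are invented for illustration; real files additionally contain the encoded saccade-transition features), the metadata and label columns can be read with the standard library:

```python
import csv
import io

# Synthetic two-row sample using the documented column names
sample = io.StringIO(
    "ParticipantID,Stimulus,LABEL: UEQ PQ,LABEL: UEQ HQ,"
    "LABEL: AttrakDiff PQ,LABEL: AttrakDiff HQ\n"
    "P001,DAV1,1.5,0.75,1.2,0.9\n"
    "P001,Water2,-0.25,0.5,0.1,0.4\n"
)

rows = list(csv.DictReader(sample))
# Select the usability (PQ) target for one stimulus
pq = [float(r["LABEL: UEQ PQ"]) for r in rows if r["Stimulus"] == "DAV1"]
print(pq)  # [1.5]
```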
Acknowledgments
This study was conducted as part of the EU-funded project "Digital Innovation Ostbayern (DInO)". DInO is part of the European Digital Innovation Hub (EDIH) program and is funded by
- the European Union (Project Reference 101083427) and
- the European Regional Development Fund (EFRE) (Project Reference 20-3092.10-THD-105).
This eye-tracking study was approved by the Joint Ethics Committee of the Bavarian Universities (GEHBa) with the reference number GEHBa-202312-V-155-R.
The Tobii Pro Fusion eye trackers were funded by the German Federal Ministry of Research, Technology and Space (BMFTR) through a grant from the Bavarian State Budget (ZD.B) (FKZ: 16-1541).
Contact Information
If you have any questions, feel free to reach out to the owner of this repository by e-mail: fabian.engl(a)oth-regensburg.de
Files (29.2 MB)

Encoding_Schemes.pdf

| MD5 checksum | Size |
|---|---|
| md5:9c8c38bf59f6056fa4ce6f2dc174d922 | 1.8 MB |
| md5:2ea562d41f8fa492e60a896b887d47bf | 1.8 MB |
| md5:45984fa3aabe8d2f75b76db93ebc52c3 | 1.8 MB |
| md5:49c2c228119ba51b1e961c1369acf1ca | 2.6 MB |
| md5:a6af1ba03986aaf35f422f62df92f417 | 2.7 MB |
| md5:4acc3643f73a52833210b2cd3b1a8127 | 2.5 MB |
| md5:a65b74c2939b4bb0c44ac1ef853e123e | 3.4 MB |
| md5:5536a72b987665d117566dc65bef5587 | 3.6 MB |
| md5:58ea58a2e303b5f26abc0cf05dd3082a | 3.4 MB |
| md5:ef2aab0c060034d8f4bb5dcd7b8925be | 431.6 kB |
| md5:e8aa4cbc9ea89e818ee071456e7e7462 | 459.3 kB |
| md5:39f88a6c1397a750127f4d9e64ec7d4e | 1.3 MB |
| md5:fb228f87495a13297d57d28015b170ac | 1.1 MB |
| md5:1eb75746210053d433c9b3ea8289ff70 | 1.2 MB |
| md5:87338e3f09bde7aaa184650ab7a7a1f1 | 1.0 MB |
| md5:b9666640ffa4afa6eb83a328edcb97e4 | 61.8 kB |
| md5:eefe745558a290229735a2c5fdb4e008 | 76.8 kB |
Additional details
Software
- Repository URL
- https://github.com/las3othr/COGAIN2026
- Programming language
- Python
- Development Status
- Active