Conference paper Open Access

How you type is what you type: Keystroke dynamics correlate with affective content

López-Carral, Héctor; Santos-Pata, Diogo; Zucca, Riccardo; Verschure, Paul F.M.J.

Estimating the affective state of a user during a computer task traditionally relies on subjective reports or on the analysis of physiological signals, facial expressions, and other measures. These methods have known limitations, can be intrusive, and may require specialized equipment. An alternative is to employ a ubiquitous everyday device such as a standard keyboard. Here we investigate whether we can infer the emotional state of a user by analyzing their typing patterns. To test this hypothesis, we asked 400 participants to caption a set of emotionally charged images taken from a standard database with known arousal and valence ratings. We computed several keystroke dynamics features, including keystroke duration (dwell time) and latency (flight time). By averaging these features for each image, we found a statistically significant negative correlation between dwell times and valence, and between flight times and arousal. These results highlight the potential of keystroke dynamics for estimating the affective state of a user in a non-obtrusive way and without the need for specialized devices.
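
For readers who want to reproduce the basic feature extraction, the sketch below illustrates the pipeline the abstract describes: compute each keystroke's dwell time (press to release) and flight time (release to next press), average them per image, and correlate the per-image means with that image's valence and arousal ratings. The event format, the dictionaries per_image and ratings, and the choice of Pearson correlation via SciPy are assumptions made for illustration; this is not the authors' code.

# Minimal sketch (not the authors' implementation) of dwell/flight feature
# extraction and the correlation analysis outlined in the abstract.
from statistics import mean
from scipy.stats import pearsonr  # assumed choice of test; the paper's statistics may differ

def keystroke_features(events):
    """events: chronological (key, press_time, release_time) tuples, in seconds,
    recorded while one participant typed the caption for one image."""
    dwell = [release - press for _key, press, release in events]   # dwell time: key held down
    flight = [events[i + 1][1] - events[i][2]                      # flight time: release to next press
              for i in range(len(events) - 1)]
    return mean(dwell), mean(flight)

def correlate(per_image, ratings):
    """per_image: {image_id: [event list per captioning session]};
    ratings: {image_id: (valence, arousal)} from the image database."""
    dwell_means, flight_means, valence, arousal = [], [], [], []
    for image_id, sessions in per_image.items():
        feats = [keystroke_features(ev) for ev in sessions if len(ev) > 1]
        dwell_means.append(mean(f[0] for f in feats))
        flight_means.append(mean(f[1] for f in feats))
        v, a = ratings[image_id]
        valence.append(v)
        arousal.append(a)
    # The paper reports negative correlations for dwell vs. valence and flight vs. arousal.
    return {"dwell_vs_valence": pearsonr(dwell_means, valence),
            "flight_vs_arousal": pearsonr(flight_means, arousal)}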

Files (1.1 MB)
10.1109@ACII.2019.8925460.pdf (1.1 MB, md5:ab6b38e83403a29fe13cac1914c4e3c1)
Views: 108 (99 unique)
Downloads: 104 (100 unique)
Data volume: 113.4 MB
