Conference paper Open Access

# How you type is what you type: Keystroke dynamics correlate with affective content

López-Carral, Héctor; Santos-Pata, Diogo; Zucca, Riccardo; Verschure, Paul F.M.J.

### Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
<dc:creator>López-Carral, Héctor</dc:creator>
<dc:creator>Santos-Pata, Diogo</dc:creator>
<dc:creator>Zucca, Riccardo</dc:creator>
<dc:creator>Verschure, Paul F.M.J.</dc:creator>
<dc:date>2019-12-09</dc:date>
<dc:description>Estimating the affective state of a user during a computer task traditionally relies on either subjective reports or analysis of physiological signals, facial expressions, and other measures. These methods have known limitations, can be intrusive and may require specialized equipment. An alternative would be employing a ubiquitous device of everyday use such as a standard keyboard. Here we investigate if we can infer the emotional state of a user by analyzing their typing patterns. To test this hypothesis, we asked 400 participants to caption a set of emotionally charged images taken from a standard database with known ratings of arousal and valence. We computed different keystroke pattern dynamics, including keystroke duration (dwell time) and latency (flight time). By computing the mean value of all of these features for each image, we found a statistically significant negative correlation between dwell times and valence, and between flight times and arousal. These results highlight the potential of using keystroke dynamics to estimate the affective state of a user in a non-obtrusive way and without the need for specialized devices.</dc:description>
<dc:identifier>https://zenodo.org/record/4534138</dc:identifier>
<dc:identifier>10.1109/ACII.2019.8925460</dc:identifier>
<dc:identifier>oai:zenodo.org:4534138</dc:identifier>
<dc:language>eng</dc:language>
<dc:relation>info:eu-repo/grantAgreement/EC/H2020/787061/</dc:relation>
<dc:relation>url:https://zenodo.org/communities/787061</dc:relation>
<dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
<dc:rights>https://creativecommons.org/licenses/by/4.0/legalcode</dc:rights>
<dc:subject>keystroke</dc:subject>
<dc:subject>keyboard</dc:subject>
<dc:subject>typing</dc:subject>
<dc:subject>arousal</dc:subject>
<dc:subject>valence</dc:subject>
<dc:subject>affect</dc:subject>
<dc:title>How you type is what you type: Keystroke dynamics correlate with affective content</dc:title>
<dc:type>info:eu-repo/semantics/conferencePaper</dc:type>
<dc:type>publication-conferencepaper</dc:type>
</oai_dc:dc>
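The abstract describes two keystroke features — dwell time (key press to key release) and flight time (release of one key to press of the next) — averaged per image and correlated with the image's arousal and valence ratings. A minimal sketch of those computations follows; the event representation, timings, and the plain Pearson correlation are illustrative assumptions, not the authors' code.

```python
def dwell_times(events):
    """Dwell time per keystroke, given (press_time, release_time) pairs in seconds."""
    return [release - press for press, release in events]

def flight_times(events):
    """Flight time: latency between releasing one key and pressing the next."""
    return [events[i + 1][0] - events[i][1] for i in range(len(events) - 1)]

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical keystroke log for one participant captioning one image
events = [(0.00, 0.09), (0.25, 0.32), (0.51, 0.60), (0.78, 0.85)]
mean_dwell = sum(dwell_times(events)) / len(dwell_times(events))
mean_flight = sum(flight_times(events)) / len(flight_times(events))
```

In the study's terms, per-image mean dwell times would then be correlated against the images' valence ratings (and mean flight times against arousal) with `pearson_r`; the reported result is a significant negative correlation in both cases.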
