Conference paper · Open Access

# How you type is what you type: Keystroke dynamics correlate with affective content

López-Carral, Héctor; Santos-Pata, Diogo; Zucca, Riccardo; Verschure, Paul F.M.J.

### DataCite XML Export

```xml
<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="URL">https://zenodo.org/record/4534138</identifier>
  <creators>
    <creator>
      <creatorName>López-Carral, Héctor</creatorName>
      <givenName>Héctor</givenName>
      <familyName>López-Carral</familyName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0002-4423-7179</nameIdentifier>
      <affiliation>SPECS-IBEC</affiliation>
    </creator>
    <creator>
      <creatorName>Santos-Pata, Diogo</creatorName>
      <givenName>Diogo</givenName>
      <familyName>Santos-Pata</familyName>
      <affiliation>SPECS-IBEC</affiliation>
    </creator>
    <creator>
      <creatorName>Zucca, Riccardo</creatorName>
      <givenName>Riccardo</givenName>
      <familyName>Zucca</familyName>
      <affiliation>SPECS-IBEC</affiliation>
    </creator>
    <creator>
      <creatorName>Verschure, Paul F.M.J.</creatorName>
      <givenName>Paul F.M.J.</givenName>
      <familyName>Verschure</familyName>
      <affiliation>SPECS-IBEC</affiliation>
    </creator>
  </creators>
  <titles>
    <title>How you type is what you type: Keystroke dynamics correlate with affective content</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2019</publicationYear>
  <subjects>
    <subject>keystroke</subject>
    <subject>keyboard</subject>
    <subject>typing</subject>
    <subject>arousal</subject>
    <subject>valence</subject>
    <subject>affect</subject>
  </subjects>
  <dates>
    <date dateType="Issued">2019-12-09</date>
  </dates>
  <language>en</language>
  <resourceType resourceTypeGeneral="ConferencePaper"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/4534138</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsIdenticalTo">10.1109/ACII.2019.8925460</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf">https://zenodo.org/communities/787061</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;Estimating the affective state of a user during a computer task traditionally relies on either subjective reports or analysis of physiological signals, facial expressions, and other measures. These methods have known limitations, can be intrusive, and may require specialized equipment. An alternative would be employing a ubiquitous everyday device such as a standard keyboard. Here we investigate whether we can infer the emotional state of a user by analyzing their typing patterns. To test this hypothesis, we asked 400 participants to caption a set of emotionally charged images taken from a standard database with known ratings of arousal and valence. We computed different keystroke pattern dynamics, including keystroke duration (dwell time) and latency (flight time). By computing the mean value of all of these features for each image, we found a statistically significant negative correlation between dwell times and valence, and between flight times and arousal. These results highlight the potential of using keystroke dynamics to estimate the affective state of a user in a non-obtrusive way and without the need for specialized devices.&lt;/p&gt;</description>
  </descriptions>
  <fundingReferences>
    <fundingReference>
      <funderName>European Commission</funderName>
      <funderIdentifier funderIdentifierType="Crossref Funder ID">10.13039/501100000780</funderIdentifier>
      <awardNumber awardURI="info:eu-repo/grantAgreement/EC/H2020/787061/">787061</awardNumber>
      <awardTitle>Advanced tools for fighting oNline Illegal TrAfficking</awardTitle>
    </fundingReference>
  </fundingReferences>
</resource>
```
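
The abstract describes two keystroke features, dwell time (the press-to-release duration of a key) and flight time (the latency between consecutive keys), averaged per image and then correlated against the images' normative valence and arousal ratings. The sketch below illustrates that analysis pipeline in Python. It is not the authors' code: the release-to-press definition of flight time, the use of Pearson correlation, and all data (the `synthetic_events` generator and the random ratings) are assumptions made for illustration.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

def dwell_times(events):
    """Dwell time: press-to-release interval of each key, in seconds."""
    return [release - press for _key, press, release in events]

def flight_times(events):
    """Flight time: release-to-next-press latency between consecutive keys.

    This release-to-press convention is an assumption; flight time is
    sometimes defined press-to-press instead.
    """
    return [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]

def synthetic_events(n_keys):
    """Generate fake (key, press_time, release_time) tuples, illustration only."""
    t, events = 0.0, []
    for i in range(n_keys):
        press = t + rng.uniform(0.05, 0.30)        # gap before the next press
        release = press + rng.uniform(0.05, 0.15)  # how long the key is held
        events.append((f"k{i}", press, release))
        t = release
    return events

# One caption's worth of key events per image (synthetic stand-in data),
# plus stand-in normative ratings on the usual 1-9 scale.
images = [synthetic_events(40) for _ in range(60)]
valence = rng.uniform(1, 9, size=len(images))
arousal = rng.uniform(1, 9, size=len(images))

# Per-image mean features, as in the abstract.
mean_dwell = np.array([np.mean(dwell_times(ev)) for ev in images])
mean_flight = np.array([np.mean(flight_times(ev)) for ev in images])

# The paper reports negative correlations for dwell vs. valence and
# flight vs. arousal; with random data these will hover near zero.
print(pearsonr(mean_dwell, valence))
print(pearsonr(mean_flight, arousal))
```

With real captioning logs in place of `synthetic_events` and the actual database ratings in place of the random ones, the paper's finding would appear as significantly negative coefficients in both tests.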
