Conference paper | Open Access

How you type is what you type: Keystroke dynamics correlate with affective content

López-Carral, Héctor; Santos-Pata, Diogo; Zucca, Riccardo; Verschure, Paul F.M.J.


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam##2200000uu#4500</leader>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">keystroke</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">keyboard</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">typing</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">arousal</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">valence</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">affect</subfield>
  </datafield>
  <controlfield tag="005">20210212153052.0</controlfield>
  <controlfield tag="001">4534138</controlfield>
  <datafield tag="711" ind1=" " ind2=" ">
    <subfield code="d">3-6 September 2019</subfield>
    <subfield code="g">ACII</subfield>
    <subfield code="a">2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII)</subfield>
    <subfield code="c">Cambridge, United Kingdom</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">SPECS-IBEC</subfield>
    <subfield code="a">Santos-Pata, Diogo</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">SPECS-IBEC</subfield>
    <subfield code="a">Zucca, Riccardo</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">SPECS-IBEC</subfield>
    <subfield code="a">Verschure, Paul F.M.J.</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">1090488</subfield>
    <subfield code="z">md5:ab6b38e83403a29fe13cac1914c4e3c1</subfield>
    <subfield code="u">https://zenodo.org/record/4534138/files/10.1109@ACII.2019.8925460.pdf</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="y">Conference website</subfield>
    <subfield code="u">http://acii-conf.org/2019/</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2019-12-09</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire</subfield>
    <subfield code="p">user-787061</subfield>
    <subfield code="o">oai:zenodo.org:4534138</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">SPECS-IBEC</subfield>
    <subfield code="0">(orcid)0000-0002-4423-7179</subfield>
    <subfield code="a">López-Carral, Héctor</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">How you type is what you type: Keystroke dynamics correlate with affective content</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">user-787061</subfield>
  </datafield>
  <datafield tag="536" ind1=" " ind2=" ">
    <subfield code="c">787061</subfield>
    <subfield code="a">Advanced tools for fighting oNline Illegal TrAfficking</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">https://creativecommons.org/licenses/by/4.0/legalcode</subfield>
    <subfield code="a">Creative Commons Attribution 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;Estimating the affective state of a user during a computer task traditionally relies on either subjective reports or analysis of physiological signals, facial expressions, and other measures. These methods have known limitations, can be intrusive and may require specialized equipment. An alternative would be employing a ubiquitous device of everyday use such as a standard keyboard. Here we investigate if we can infer the emotional state of a user by analyzing their typing patterns. To test this hypothesis, we asked 400 participants to caption a set of emotionally charged images taken from a standard database with known ratings of arousal and valence. We computed different keystroke pattern dynamics, including keystroke duration (dwell time) and latency (flight time). By computing the mean value of all of these features for each image, we found a statistically significant negative correlation between dwell times and valence, and between flight times and arousal. These results highlight the potential of using keystroke dynamics to estimate the affective state of a user in a non-obtrusive way and without the need for specialized devices.&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.1109/ACII.2019.8925460</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">publication</subfield>
    <subfield code="b">conferencepaper</subfield>
  </datafield>
</record>
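
The abstract in the record above describes the analysis pipeline: record keystroke events while participants caption images, derive dwell times (interval from key press to key release) and flight times (interval from one key's release to the next key's press), average each feature per image, and correlate those per-image means with the images' normative valence and arousal ratings. The sketch below illustrates that kind of computation in Python. It is not the authors' code; the event format, function names, and the use of Pearson correlation are assumptions made purely for illustration.

# Illustrative sketch only -- not the code used in the paper.
# Assumed keystroke event format: (timestamp_seconds, key, "down" | "up").
from statistics import mean
from scipy.stats import pearsonr


def dwell_times(events):
    """Dwell time: interval between a key press and its release."""
    pressed = {}
    dwells = []
    for t, key, kind in sorted(events):
        if kind == "down":
            pressed[key] = t
        elif kind == "up" and key in pressed:
            dwells.append(t - pressed.pop(key))
    return dwells


def flight_times(events):
    """Flight time: interval between releasing one key and pressing the next."""
    flights = []
    last_release = None
    for t, key, kind in sorted(events):
        if kind == "up":
            last_release = t
        elif kind == "down" and last_release is not None:
            flights.append(t - last_release)
            last_release = None
    return flights


def correlate_with_ratings(captions, ratings):
    """Correlate per-image mean dwell/flight times with affective ratings.

    captions: {image_id: [keystroke events recorded while captioning that image]}
    ratings:  {image_id: (valence, arousal)} from the normative image database
    """
    image_ids = sorted(set(captions) & set(ratings))
    mean_dwell = [mean(dwell_times(captions[i])) for i in image_ids]
    mean_flight = [mean(flight_times(captions[i])) for i in image_ids]
    valence = [ratings[i][0] for i in image_ids]
    arousal = [ratings[i][1] for i in image_ids]

    # The abstract reports negative correlations between dwell time and
    # valence, and between flight time and arousal; pearsonr returns the
    # correlation coefficient and its p-value for each pairing.
    return {
        "dwell_vs_valence": pearsonr(mean_dwell, valence),
        "flight_vs_arousal": pearsonr(mean_flight, arousal),
    }
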
Views 38
Downloads 37
Data volume 40.3 MB
Unique views 32
Unique downloads 36
