Conference paper Open Access

Pairwise Ranking Network for Affect Recognition

Georgios Zoumpourlis; Ioannis Patras

DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="" xmlns="" xsi:schemaLocation="">
  <identifier identifierType="DOI">10.5281/zenodo.5550449</identifier>
  <creators>
    <creator>
      <creatorName>Georgios Zoumpourlis</creatorName>
      <affiliation>Queen Mary University of London</affiliation>
    </creator>
    <creator>
      <creatorName>Ioannis Patras</creatorName>
      <affiliation>Queen Mary University of London</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Pairwise Ranking Network for Affect Recognition</title>
  </titles>
  <dates>
    <date dateType="Issued">2021-06-26</date>
  </dates>
  <resourceType resourceTypeGeneral="ConferencePaper"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url"></alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.5281/zenodo.5550448</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf"></relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;In this work we study the problem of emotion recognition under the prism of preference learning. Affective datasets are typically annotated by assigning a single absolute label, i.e. a numerical value that describes the intensity of an emotional attribute, to each sample. Then, the majority of existing works on affect recognition employ sample-wise classification/regression methods to predict affective states, using those annotations. We take a different approach and use a deep network architecture that performs joint training on the tasks of classification/regression of samples and ordinal ranking between pairs of samples. By treating input samples in a pairwise manner, we leverage the auxiliary task of inferring the ordinal relation between their corresponding affective states. Incorporating the ranking objective allows capturing the inherently ordinal structure of emotions and learning the inter-sample relations, resulting in better generalization. Our method is incorporated into existing affect recognition architectures and evaluated on datasets of electroencephalograms (EEG) and images. We show that the approach proposed in this work leads to consistent performance gains when incorporated in classification/regression networks.&lt;/p&gt;</description>
  </descriptions>
  <fundingReferences>
    <fundingReference>
      <funderName>European Commission</funderName>
      <funderIdentifier funderIdentifierType="Crossref Funder ID">10.13039/501100000780</funderIdentifier>
      <awardNumber awardURI="info:eu-repo/grantAgreement/EC/Horizon 2020 Framework Programme - Research and Innovation action/951911/">951911</awardNumber>
      <awardTitle>A European Excellence Centre for Media, Society and Democracy</awardTitle>
    </fundingReference>
  </fundingReferences>
</resource>
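The abstract describes joint training on sample-wise classification/regression and ordinal ranking between pairs of samples. A minimal sketch of what such a combined objective can look like, in plain Python; the logistic pairwise loss, the `alpha` weighting, and all function names here are illustrative assumptions, not the paper's exact formulation:

```python
import math

def regression_loss(pred, target):
    # Sample-wise squared error on the absolute affect intensity label.
    return (pred - target) ** 2

def pairwise_ranking_loss(pred_i, pred_j, target_i, target_j):
    # Logistic loss on the ordinal relation of a pair (an assumption here):
    # penalizes predictions whose ordering disagrees with the labels' ordering.
    sign = 1.0 if target_i > target_j else -1.0
    return math.log(1.0 + math.exp(-sign * (pred_i - pred_j)))

def joint_loss(preds, targets, alpha=0.5):
    # Weighted sum of the sample-wise objective (averaged over samples) and
    # the pairwise objective (averaged over ordered pairs with distinct labels).
    n = len(preds)
    reg = sum(regression_loss(p, t) for p, t in zip(preds, targets)) / n
    pairs = [(i, j) for i in range(n) for j in range(n)
             if i != j and targets[i] != targets[j]]
    rank = sum(pairwise_ranking_loss(preds[i], preds[j], targets[i], targets[j])
               for i, j in pairs) / max(len(pairs), 1)
    return alpha * reg + (1 - alpha) * rank
```

Predictions that preserve the labels' ordering incur a lower joint loss than predictions that invert it, which is the mechanism the abstract credits for capturing the ordinal structure of emotions.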
