Conference paper Open Access

# Pairwise Ranking Network for Affect Recognition

Georgios Zoumpourlis; Ioannis Patras

### Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
<dc:creator>Georgios Zoumpourlis</dc:creator>
<dc:creator>Ioannis Patras</dc:creator>
<dc:date>2021-06-26</dc:date>
<dc:description>In this work we study the problem of emotion recognition through the prism of preference learning. Affective datasets are typically annotated by assigning a single absolute label, i.e., a numerical value that describes the intensity of an emotional attribute, to each sample. The majority of existing works on affect recognition then employ sample-wise classification/regression methods to predict affective states from those annotations. We take a different approach and use a deep network architecture that performs joint training on the tasks of classification/regression of samples and ordinal ranking between pairs of samples. By treating input samples in a pairwise manner, we leverage the auxiliary task of inferring the ordinal relation between their corresponding affective states. Incorporating the ranking objective allows capturing the inherently ordinal structure of emotions and learning inter-sample relations, resulting in better generalization. Our method is incorporated into existing affect recognition architectures and evaluated on datasets of electroencephalograms (EEG) and images. We show that the proposed approach leads to consistent performance gains when incorporated in classification/regression networks.</dc:description>
<dc:identifier>https://zenodo.org/record/5550449</dc:identifier>
<dc:identifier>10.5281/zenodo.5550449</dc:identifier>
<dc:identifier>oai:zenodo.org:5550449</dc:identifier>
<dc:relation>info:eu-repo/grantAgreement/EC/Horizon 2020 Framework Programme - Research and Innovation action/951911/</dc:relation>
<dc:relation>doi:10.5281/zenodo.5550448</dc:relation>
<dc:relation>url:https://zenodo.org/communities/ai4media</dc:relation>
<dc:rights>info:eu-repo/semantics/openAccess</dc:rights>
<dc:title>Pairwise Ranking Network for Affect Recognition</dc:title>
<dc:type>info:eu-repo/semantics/conferencePaper</dc:type>
<dc:type>publication-conferencepaper</dc:type>
</oai_dc:dc>
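The joint objective described in the abstract (a sample-wise classification/regression loss combined with a pairwise ordinal ranking loss) can be sketched as below. This is a minimal illustration, not the paper's exact formulation: the margin value, the weighting factor `lam`, and the use of a hinge-style ranking penalty are assumptions made for the example.

```python
import math

def joint_loss(preds, labels, pairs, margin=0.1, lam=1.0):
    """Sketch of a joint objective: sample-wise regression loss plus a
    pairwise margin ranking loss over ordered pairs.

    preds, labels : lists of floats (predicted and ground-truth intensities)
    pairs         : list of (i, j) index pairs where labels[i] > labels[j]
                    is the target ordinal relation
    margin, lam   : illustrative hyperparameters (assumed, not from the paper)
    """
    # Sample-wise term: mean squared error between predictions and labels.
    mse = sum((p - y) ** 2 for p, y in zip(preds, labels)) / len(preds)
    # Pairwise term: hinge penalty incurred when the predicted ordering
    # violates the ground-truth ordering by more than the margin.
    rank = sum(max(0.0, margin - (preds[i] - preds[j])) for i, j in pairs)
    rank /= max(1, len(pairs))
    return mse + lam * rank
```

For a correctly ordered pair with a comfortable gap, only the regression term contributes; when the predicted ordering is violated, the ranking term adds a penalty proportional to the violation, which is what encourages the network to respect inter-sample ordinal relations.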
