Journal article Open Access

Structured Learning and Prediction in Facial Emotion Classification and Recognition

Khalid Ounachad; Mohamed Oualla; Abdelghani Souhar; Abdelalim Sadiq


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="URL">https://zenodo.org/record/5552185</identifier>
  <creators>
    <creator>
      <creatorName>Khalid Ounachad</creatorName>
      <affiliation>Ibn Tofail University, Faculty of sciences, Kenitra, Morocco.</affiliation>
    </creator>
    <creator>
      <creatorName>Mohamed Oualla</creatorName>
      <affiliation>Faculty of sciences and technology, Moulay Ismail University, Errachidia, Morocco.</affiliation>
    </creator>
    <creator>
      <creatorName>Abdelghani Souhar</creatorName>
      <affiliation>Ibn Tofail University, Faculty of sciences, Kenitra, Morocco.</affiliation>
    </creator>
    <creator>
      <creatorName>Abdelalim Sadiq</creatorName>
      <affiliation>Ibn Tofail University, Faculty of sciences, Kenitra, Morocco.</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Structured Learning and Prediction in Facial Emotion Classification and Recognition</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2020</publicationYear>
  <subjects>
    <subject>Supervised learning, Structured Learning, prediction, Facial Emotion Recognition, Perfect Face Ratios, Emotional Facial Expression, WSEFEP dataset.</subject>
    <subject subjectScheme="issn">2249-8958</subject>
    <subject subjectScheme="handle">C6421029320/2020©BEIESP</subject>
  </subjects>
  <contributors>
    <contributor contributorType="Sponsor">
      <contributorName>Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP)</contributorName>
      <affiliation>Publisher</affiliation>
    </contributor>
  </contributors>
  <dates>
    <date dateType="Issued">2020-04-30</date>
  </dates>
  <language>en</language>
  <resourceType resourceTypeGeneral="JournalArticle"/>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/5552185</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="ISSN" relationType="IsCitedBy" resourceTypeGeneral="JournalArticle">2249-8958</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsIdenticalTo">10.35940/ijeat.C6421.049420</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="https://creativecommons.org/licenses/by/4.0/legalcode">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;Structured prediction methods have become, in recent years, an attractive tool for many machine-learning applications, especially in image processing: predicting customer satisfaction with facial recognition systems, recognizing face sketches in criminal investigations, assisting autistic children, and so on. The main objective of this paper is to identify human emotions from facial expressions by applying structured learning and perfect face ratios. The basic idea of our approach is to extract the perfect face ratios from a facial emotion image as features; these facial emotion images are labeled with their emotion (the seven emotions defined in the literature). To this end, we first determine sixty-eight landmark points on each face image, then apply a new deep geometric descriptor to compute sixteen features representing the emotional face. The training and testing tasks are carried out on the Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) dataset; our proposed approach can also be applied to other facial emotion datasets. The experimental evaluation demonstrates the satisfactory performance of the method: the recognition rate exceeds 97% for all seven studied emotions and exceeds 99.20% for neutral facial images.&lt;/p&gt;</description>
  </descriptions>
</resource>
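The abstract outlines a pipeline: detect sixty-eight facial landmarks, derive scale-invariant "perfect face ratio" features from them, and classify the feature vector into one of seven emotions. The paper's exact landmark scheme, sixteen ratios, and structured predictor are not given in the metadata, so the sketch below is only illustrative: it assumes the common dlib-style 68-point indexing, computes a small subset of plausible ratio features, and stands in a minimal nearest-centroid classifier for the unspecified learner.

```python
import numpy as np

# Hypothetical landmark-index pairs, following the widely used dlib
# 68-point convention (an assumption; the paper's scheme may differ).
LEFT_EYE = (36, 39)    # left eye corners
RIGHT_EYE = (42, 45)   # right eye corners
MOUTH = (48, 54)       # mouth corners
NOSE = (27, 33)        # nose bridge to tip
JAW = (0, 16)          # jaw extremities, used as the normalizer

def dist(pts, i, j):
    """Euclidean distance between landmarks i and j."""
    return float(np.linalg.norm(pts[i] - pts[j]))

def ratio_features(pts):
    """Map 68 (x, y) landmarks to scale-invariant ratio features.
    Only six illustrative ratios here, not the paper's sixteen."""
    face_w = dist(pts, *JAW)
    return np.array([
        dist(pts, *LEFT_EYE) / face_w,
        dist(pts, *RIGHT_EYE) / face_w,
        dist(pts, *MOUTH) / face_w,
        dist(pts, *NOSE) / face_w,
        dist(pts, 48, 51) / face_w,   # mouth corner to upper lip
        dist(pts, 51, 57) / face_w,   # vertical mouth opening
    ])

class NearestCentroid:
    """Minimal stand-in for the paper's (unspecified) emotion classifier:
    predicts the label whose mean feature vector is closest."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.centroids_ = {c: X[y == c].mean(axis=0) for c in set(y)}
        return self
    def predict(self, X):
        return [min(self.centroids_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]
```

A classifier like this would be fit on labeled feature vectors from the training split of WSEFEP and evaluated on the held-out split; the ratios make the features invariant to face scale, which is one motivation for ratio-based descriptors.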