Journal article · Open Access

Real-time 3D Human Pose and Motion Reconstruction from Monocular RGB Videos

Anastasios Yiannakides; Andreas Aristidou; Yiorgos Chrysanthou


DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="URL">https://zenodo.org/record/3524137</identifier>
  <creators>
    <creator>
      <creatorName>Anastasios Yiannakides</creatorName>
      <affiliation>Department of Computer Science, University of Cyprus AND RISE Research Center, Nicosia, Cyprus</affiliation>
    </creator>
    <creator>
      <creatorName>Andreas Aristidou</creatorName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0001-7754-0791</nameIdentifier>
      <affiliation>Department of Computer Science, University of Cyprus AND RISE Research Center, Nicosia, Cyprus</affiliation>
    </creator>
    <creator>
      <creatorName>Yiorgos Chrysanthou</creatorName>
      <nameIdentifier nameIdentifierScheme="ORCID" schemeURI="http://orcid.org/">0000-0001-5136-8890</nameIdentifier>
      <affiliation>Department of Computer Science, University of Cyprus AND RISE Research Center, Nicosia, Cyprus</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Real-time 3D Human Pose and Motion Reconstruction from Monocular RGB Videos</title>
  </titles>
  <publisher>Zenodo</publisher>
  <publicationYear>2019</publicationYear>
  <dates>
    <date dateType="Issued">2019-04-29</date>
  </dates>
  <language>en</language>
  <resourceType resourceTypeGeneral="Text">Journal article</resourceType>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://zenodo.org/record/3524137</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsIdenticalTo">10.1002/cav.1887</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf">https://zenodo.org/communities/rise-teaming-cyprus</relatedIdentifier>
  </relatedIdentifiers>
  <version>Accepted pre-print</version>
  <rightsList>
    <rights rightsURI="https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode">Creative Commons Attribution Non Commercial No Derivatives 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;Real-time 3D pose estimation is of high interest in interactive applications, virtual reality, activity recognition, but most importantly, in the growing gaming industry. In this work, we present a method that captures and reconstructs the 3D skeletal pose and motion articulation of multiple characters using a monocular RGB camera. Our method deals with this challenging, but useful, task by taking advantage of the recent development in deep learning that allows 2D pose estimation of multiple characters, and the&amp;nbsp;increasing availability of motion capture data. We fit 2D estimated poses, extracted from a single camera via OpenPose, with a 2D multi-view joint projections database that is associated with their 3D motion representations. We then retrieve the 3D body pose of the tracked character, ensuring throughout that the reconstructed movements are natural, satisfy the model constraints, are within a feasible set, and are temporally smooth without jitters. We demonstrate the performance of our method in several examples, including human locomotion, simultaneously capturing of multiple characters, and motion reconstruction from different camera views.&lt;/p&gt;</description>
    <description descriptionType="Other">This work has been partly supported by the project that has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 739578 (RISE – Call: H2020-WIDESPREAD-01-2016-2017-TeamingPhase2) and the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development.</description>
  </descriptions>
  <fundingReferences>
    <fundingReference>
      <funderName>European Commission</funderName>
      <funderIdentifier funderIdentifierType="Crossref Funder ID">10.13039/501100000780</funderIdentifier>
      <awardNumber awardURI="info:eu-repo/grantAgreement/EC/H2020/739578/">739578</awardNumber>
      <awardTitle>Research Center on Interactive Media, Smart System and Emerging Technologies</awardTitle>
    </fundingReference>
  </fundingReferences>
</resource>
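
Note on the method described in the abstract: the retrieval step (matching OpenPose 2D estimates against a database of 2D joint projections paired with 3D poses, then smoothing the retrieved poses over time) can be illustrated with the minimal Python sketch below. The data layouts, joint count, normalization, and smoothing factor are assumptions made for illustration only; this is not the authors' implementation, and OpenPose input handling, the multi-view projection of the database, model constraints, and feasibility checks from the paper are omitted.

# Minimal sketch (not the authors' code): nearest-neighbour retrieval of a 3D pose
# from a database of 2D joint projections paired with 3D poses, plus simple
# exponential temporal smoothing. All data layouts here are illustrative assumptions.
import numpy as np

def normalize_2d(pose2d):
    """Center on the root joint and scale to unit norm so poses are comparable."""
    centered = pose2d - pose2d[0]            # assumes joint 0 is the root/pelvis
    scale = np.linalg.norm(centered) + 1e-8
    return centered / scale

def retrieve_3d(pose2d, db_2d, db_3d):
    """Return the 3D pose whose stored 2D projection best matches the query."""
    q = normalize_2d(pose2d).ravel()
    dists = np.linalg.norm(db_2d.reshape(len(db_2d), -1) - q, axis=1)
    return db_3d[np.argmin(dists)]

def smooth_sequence(poses3d, alpha=0.7):
    """Exponential smoothing over time to suppress frame-to-frame jitter."""
    smoothed = [poses3d[0]]
    for p in poses3d[1:]:
        smoothed.append(alpha * smoothed[-1] + (1 - alpha) * p)
    return np.stack(smoothed)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    J = 15                                                      # assumed joint count
    db_3d = rng.normal(size=(1000, J, 3))                       # synthetic stand-in for mocap data
    db_2d = np.stack([normalize_2d(p[:, :2]) for p in db_3d])   # toy orthographic projection
    query = db_3d[42, :, :2] + 0.01 * rng.normal(size=(J, 2))   # noisy 2D observation
    track = smooth_sequence(np.stack([retrieve_3d(query, db_2d, db_3d)] * 5))
    print(track.shape)                                          # (5, 15, 3)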
Views 51
Downloads 37
Data volume 155.2 MB
Unique views 48
Unique downloads 36
