Enhanced real-time motion transfer to 3D avatars using RGB-based human 3D pose estimation
Description
Human motion transfer to 3D avatars has seen substantial progress, driven by advances in 3D pose estimation from RGB data. This technology analyzes human movements captured with RGB cameras, enabling the tracking of 3D body landmarks and, in turn, the animation of 3D avatars. Relying on RGB input offers a range of advantages and democratizes avatar creation by eliminating the need for specialized equipment such as sensors, markers, or dedicated studios. Recent years have seen remarkable strides in this field, leveraging deep learning models and sophisticated computer vision algorithms to capture intricate movements and gestures from RGB video footage. This study introduces a novel real-time approach that leverages RGB input to generate realistic 3D animations. It comprises three phases: i) 3D human pose estimation using MediaPipe, ii) correction of inaccuracies in MediaPipe's landmarks, particularly in the depth dimension, together with the incorporation of bone rotation information, and iii) transfer of the resulting motion to the target 3D avatar.
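As an illustration of the first phase, a minimal sketch of real-time 3D landmark extraction with MediaPipe Pose is shown below. This is not the authors' implementation; it only assumes the standard mediapipe and opencv-python packages and reads MediaPipe's pose_world_landmarks, whose z coordinate is the depth estimate that the second phase would need to correct. The rotation_between helper is a hypothetical example of how a single bone rotation could be derived from two landmarks.

```python
# Minimal sketch (not the authors' code): MediaPipe 3D pose landmarks from a webcam,
# plus a hypothetical helper for deriving a bone rotation from two landmarks.
import cv2
import numpy as np
import mediapipe as mp

mp_pose = mp.solutions.pose


def rotation_between(u, v):
    """Unit quaternion (x, y, z, w) rotating direction u onto direction v."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    q = np.array([*np.cross(u, v), 1.0 + float(np.dot(u, v))])
    n = np.linalg.norm(q)
    return q / n if n > 1e-8 else np.array([1.0, 0.0, 0.0, 0.0])  # opposite vectors


cap = cv2.VideoCapture(0)
with mp_pose.Pose(model_complexity=1) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_world_landmarks is None:
            continue
        # 33 landmarks in metres, origin at the hip midpoint; z is the (noisy) depth.
        lm = results.pose_world_landmarks.landmark
        shoulder = np.array([lm[mp_pose.PoseLandmark.LEFT_SHOULDER].x,
                             lm[mp_pose.PoseLandmark.LEFT_SHOULDER].y,
                             lm[mp_pose.PoseLandmark.LEFT_SHOULDER].z])
        elbow = np.array([lm[mp_pose.PoseLandmark.LEFT_ELBOW].x,
                          lm[mp_pose.PoseLandmark.LEFT_ELBOW].y,
                          lm[mp_pose.PoseLandmark.LEFT_ELBOW].z])
        # Rotation taking an assumed rest-pose upper-arm direction (pointing down, -Y)
        # onto the tracked upper-arm direction; such rotations would drive the avatar rig.
        q = rotation_between(np.array([0.0, -1.0, 0.0]), elbow - shoulder)
        print("left upper-arm quaternion:", np.round(q, 3))
cap.release()
```

The sketch prints one quaternion per frame; an actual retargeting step would map such rotations onto the corresponding bones of the target avatar's skeleton.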
© Ilias Poulios | ACM 2024. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in ACM Digital Library, https://doi.org/10.1145/3672406.3672427.
Files (18.3 MB)

| Name | Size |
|---|---|
| Enhanced real-time motion transfer to 3D avatars using RGB-based human 3D pose estimation ~ Authors version Zenodo uploaded.pdf (md5:ce230c04e75795932cb149c7f341779c) | 18.3 MB |