A flexible toolkit for real-time action recognition of virtual humans in XR/AR environments
Description
Real-time action recognition for virtual humans in XR/AR scenes is essential for creating natural and interactive experiences, allowing avatars or holoported humans to respond dynamically to user movements and gestures. This technology enhances applications such as immersive gaming, virtual training simulations, and remote collaboration by enabling realistic character interactions and intuitive control. It is also valuable in healthcare and rehabilitation, where motion tracking can support physical therapy exercises and progress monitoring. Recognizing the need for robust action recognition in XR environments, we present a flexible toolkit for real-time action recognition of virtual humans in XR environments. The toolkit is built with Unity because of its widespread adoption and adaptability in XR/AR application development. The action recognition component leverages cutting-edge deep learning models to ensure high accuracy and performance, and it runs the analysis on a dedicated processing PC so as not to disrupt the smoothness of the XR experience. Preliminary experiments and evaluations show that the toolkit recognizes a variety of actions (such as raising the hand, waving the hand, or jumping) in a robust way.
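To illustrate the described split between the Unity XR client and the dedicated processing PC, the following is a minimal Python sketch of the processing-PC side only. It assumes the Unity client streams newline-delimited JSON pose frames over TCP and expects a JSON action label back; the port, message format, joint names, and the placeholder classifier are illustrative assumptions, not the toolkit's actual API or model.

```python
# Hypothetical sketch of the processing-PC side of the toolkit's architecture.
# Assumption: the Unity client sends one JSON pose frame per line over TCP and
# reads back JSON action labels. All names and the classifier are placeholders.
import json
import socket
from collections import deque

WINDOW_SIZE = 30               # assumed sliding window of pose frames fed to the model
HOST, PORT = "0.0.0.0", 9099   # placeholder endpoint for the dedicated processing PC


def classify_window(frames):
    """Placeholder for the deep-learning action classifier.

    `frames` is a list of dicts mapping joint names to [x, y, z] positions.
    A real implementation would run a trained model; this toy heuristic only
    labels "raise_hand" when the right hand ends up above the head.
    """
    last = frames[-1]
    if "right_hand" in last and "head" in last and last["right_hand"][1] > last["head"][1]:
        return "raise_hand"
    return "idle"


def serve():
    window = deque(maxlen=WINDOW_SIZE)
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("rw") as stream:
            for line in stream:                  # one JSON pose frame per line
                window.append(json.loads(line))
                if len(window) == WINDOW_SIZE:   # enough temporal context to classify
                    label = classify_window(list(window))
                    stream.write(json.dumps({"action": label}) + "\n")
                    stream.flush()               # send the label back to the Unity client


if __name__ == "__main__":
    serve()
```

Keeping the recognition loop on a separate machine, as in this sketch, matches the design rationale stated above: the XR client only serializes poses and reads labels, so frame rendering is not blocked by model inference.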
Files
| Name | Size |
|---|---|
| EAIM_2025_submission_1_a_toolkit_for_real_time_action_recognition_in_xr_fassold.pdf (md5:cf2026ddd20c5dd26660819328f0fdb9) | 1.5 MB |
Additional details
Funding
- European Commission
- A toolset for hyper-realistic and XR-based human-human and human-machine interactions (grant no. 101135025)