Enabling natural interactions with virtual humans in XR via real-time human action recognition
Description
As Extended Reality (XR) applications evolve from passive viewing experiences to interactive simulations, the behavior of virtual humans becomes a critical factor for immersion. Users expect virtual humans to react naturally to their movements, yet implementing robust action recognition and integrating it into a VR application remains challenging. This paper introduces a modular toolkit designed to bridge this gap. By decoupling the rendering loop from the inference process, our system enables sophisticated deep-learning-based detection of user actions (such as hand waving or crossing the hands) without compromising the frame rate of the XR experience. We present the system architecture, describe its integration into the Unity engine, and outline the demo applications we developed.
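The core architectural idea, running recognition off the render loop and polling for results without blocking, can be illustrated with a minimal sketch. This is not the toolkit's actual API (the paper targets Unity/C#); all names here are hypothetical, and a sleep stands in for the deep-learning model.

```python
# Sketch of decoupling the render loop from inference: a worker thread
# consumes frames and produces recognized actions; the render loop only
# does non-blocking puts/gets, so a slow model never stalls a frame.
# All names and timings are illustrative, not from the toolkit.
import queue
import threading
import time

frames = queue.Queue(maxsize=1)   # latest frame/pose data for the recognizer
results = queue.Queue()           # recognized actions back to the application

def inference_worker():
    """Stand-in for the action-recognition model running off the render thread."""
    while True:
        frame = frames.get()
        if frame is None:                 # shutdown sentinel
            break
        time.sleep(0.05)                  # simulate slow model inference
        results.put(("hand_wave", frame["t"]))

worker = threading.Thread(target=inference_worker, daemon=True)
worker.start()

recognized = []
for t in range(3):                        # stand-in for the XR render loop
    try:
        frames.put_nowait({"t": t})       # drop the frame if the model is busy
    except queue.Full:
        pass
    try:
        action, stamp = results.get_nowait()   # never blocks the frame
        recognized.append(action)
    except queue.Empty:
        pass
    time.sleep(0.016)                     # ~60 FPS frame budget

frames.put(None)                          # signal shutdown, then drain
worker.join()
while not results.empty():
    recognized.append(results.get()[0])
print(recognized)
```

Bounded queues with non-blocking puts are the key design choice: when inference falls behind, stale frames are dropped rather than queued up, so recognition latency stays bounded and the frame rate is unaffected.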
Files
- ieee_vr_2026_demo_fassold_realtime_human_action_recognition.pdf (2.8 MB)
- md5:4b2b370fd2c6569e63315efcffdb7e3d
Additional details
Funding
- European Commission
- Presence XR 101135025