Published October 23, 2024 | Version v1
Project deliverable | Open access

THEIA-XR: Interaction Sequence Models (First Version) (D4.3)

Description

In Task T4.3, the findings of the user research and the concepts developed within the transdisciplinary co-design approach (WP3) are used to create technical models for a cyber-physical system (CPS) and its human-machine interface (HMI). Visions and planned interaction flows are translated into CPS specifications. The presented interaction sequence models, based on Norman's Human Action Cycle, focus on information retrieval and on integrating XR technology to enhance vehicle operators' awareness and performance. The models emphasize improving how information is designed and retrieved, moving away from traditional small in-cab screens. They represent early conceptual solutions, pending validation in co-design workshops.
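To make the structure of such a model concrete, the following minimal sketch (in Python) shows one possible way to encode an interaction sequence step against the stages of Norman's Human Action Cycle. All class, field, and channel names are illustrative assumptions and are not taken from the deliverable.

# Illustrative sketch only: one possible encoding of an interaction sequence
# model. Class, field and channel names are assumptions, not project terms.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class Stage(Enum):
    """The seven stages of Norman's Human Action Cycle."""
    FORM_GOAL = auto()
    FORM_INTENTION = auto()
    SPECIFY_ACTION = auto()
    EXECUTE_ACTION = auto()
    PERCEIVE_STATE = auto()
    INTERPRET_STATE = auto()
    EVALUATE_OUTCOME = auto()

class Channel(Enum):
    """Where the operator retrieves or receives information."""
    IN_CAB_SCREEN = auto()   # traditional small display
    XR_PROJECTION = auto()   # e.g. projection onto snow, terrain or windshield
    AMBIENT_LIGHT = auto()
    HAPTIC = auto()
    DIRECT_VIEW = auto()     # looking out of the cabin

@dataclass
class SequenceStep:
    stage: Stage
    operator_action: str
    information_needed: str
    channel: Channel

@dataclass
class InteractionSequenceModel:
    use_case: str
    scenario: str
    steps: List[SequenceStep] = field(default_factory=list)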

For UC1, snow grooming, three key scenarios are discussed. They address challenges such as the operator's attention being diverted from the slope during adjustments, constant visual checks for obstacles that strain attention, and the safety risks associated with reduced visibility in adverse weather. The proposed visionary solutions involve projecting information onto the snow surface in real time, visualizing obstacles with lights, and projecting map data onto the windshield to support navigation in challenging conditions.
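As a hypothetical usage example, reusing the illustrative classes sketched above, the first scenario could be modelled by contrasting the current screen-based information retrieval with the envisioned on-snow projection. The specific adjustment (the tiller) and the projected content are assumptions, not details taken from the deliverable.

# Hypothetical example built on the sketch above; scenario details are assumed.
current = InteractionSequenceModel(
    use_case="UC1 snow grooming",
    scenario="Adjusting the tiller while driving",
    steps=[
        SequenceStep(Stage.FORM_INTENTION, "decide to check snow depth",
                     "snow depth under the vehicle", Channel.IN_CAB_SCREEN),
        SequenceStep(Stage.PERCEIVE_STATE, "glance down at the in-cab display",
                     "depth reading", Channel.IN_CAB_SCREEN),
        SequenceStep(Stage.EVALUATE_OUTCOME, "look back up at the slope",
                     "terrain and obstacles ahead", Channel.DIRECT_VIEW),
    ],
)

envisioned = InteractionSequenceModel(
    use_case="UC1 snow grooming",
    scenario="Adjusting the tiller while driving",
    steps=[
        SequenceStep(Stage.PERCEIVE_STATE, "keep eyes on the slope",
                     "snow depth projected onto the snow surface", Channel.XR_PROJECTION),
        SequenceStep(Stage.EVALUATE_OUTCOME, "adjust the tiller without looking away",
                     "updated projection", Channel.XR_PROJECTION),
    ],
)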

For UC2, logistics, three key scenarios are modelled: picking up containers, placing containers, and collision avoidance during transport. Challenges include visually estimating the precise spreader position and frequently checking for obstacles manually. The proposed solutions involve using XR projections to aid spreader positioning and employing thermal cameras to detect and highlight objects in the operator's field of view.
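The deliverable does not prescribe an implementation for the thermal-camera highlighting, but as a rough sketch of the idea, warm regions in a thermal frame can be segmented and outlined before being shown to the operator. The use of OpenCV, the threshold value, and the file names below are assumptions for illustration only.

# Illustrative sketch only: outlining warm objects (e.g. persons) in a
# white-hot thermal image. Threshold, minimum area and file names are assumed.
import cv2

frame = cv2.imread("thermal_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input
assert frame is not None, "could not read input frame"

# Warm regions appear bright in a white-hot thermal image.
_, mask = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

overlay = cv2.cvtColor(frame, cv2.COLOR_GRAY2BGR)
for contour in contours:
    if cv2.contourArea(contour) < 100:  # skip small noise blobs
        continue
    x, y, w, h = cv2.boundingRect(contour)
    cv2.rectangle(overlay, (x, y), (x + w, y + h), (0, 0, 255), 2)  # red outline

cv2.imwrite("highlighted_frame.png", overlay)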

For UC3, construction, three key scenarios are discussed: achieving precise depth control during grading, detecting obstacles and persons, and streamlining the infield design process for excavation pits. Challenges include constant depth checks during grading, limited visibility behind the boom, and a time-consuming calibration process in infield design. Proposed solutions involve visualizing bucket depth with LED or laser projection, using ambient lights to support the detection of obstacles and persons, and combining LiDAR with laser projections to outline excavation pits accurately.
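For the depth-control scenario, the deliverable leaves the concrete cue design open; the small sketch below merely illustrates how a measured bucket depth could be mapped to a colour cue for an LED or laser projection. Thresholds and colour choices are assumptions.

# Illustrative sketch only: mapping the deviation between target grading depth
# and measured bucket depth to a colour cue. All values are assumed.
def depth_cue(target_depth_m: float, bucket_depth_m: float,
              tolerance_m: float = 0.02) -> str:
    """Return a colour indicating whether the bucket is on grade."""
    deviation = bucket_depth_m - target_depth_m
    if abs(deviation) <= tolerance_m:
        return "green"                         # on grade
    return "blue" if deviation < 0 else "red"  # too shallow / too deep

assert depth_cue(1.50, 1.51) == "green"   # within tolerance
assert depth_cue(1.50, 1.40) == "blue"    # bucket still above the target depth
assert depth_cue(1.50, 1.62) == "red"     # dug too deep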

In the final chapter, two XR interface design approaches currently in development for controlling off-highway machinery are presented: one utilizing LiDAR sensors and the Varjo XR-3 headset, and another employing 360° cameras with augmented content. Furthermore, two roles of haptic feedback, informative and assistive, are discussed with regard to optimizing operator tasks, and possible applications in all use cases are outlined.
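As a sketch of how the two haptic roles could be kept apart in software, the mapping below pairs hypothetical machine events with a role and a vibration pattern; none of the event names or patterns are taken from the deliverable.

# Illustrative sketch only: informative vs. assistive haptic feedback.
# Event names and vibration patterns are hypothetical.
from enum import Enum

class HapticRole(Enum):
    INFORMATIVE = "informative"  # conveys state, e.g. a lock confirmation
    ASSISTIVE = "assistive"      # guides an ongoing action, e.g. obstacle proximity

FEEDBACK_MAP = {
    "spreader_locked":      (HapticRole.INFORMATIVE, "single short pulse"),
    "target_depth_reached": (HapticRole.INFORMATIVE, "double pulse"),
    "obstacle_proximity":   (HapticRole.ASSISTIVE, "vibration intensity rises with proximity"),
}

def feedback_for(event: str):
    """Look up the haptic role and pattern for a machine event, if any."""
    return FEEDBACK_MAP.get(event)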

The interaction sequence models provide a formalized framework for describing XR interaction within machine operation tasks while taking human perception and processing capabilities into account. They serve as a tool for multi-disciplinary discussion and lay the groundwork for subsequent technical development and integration within the project.

Files

THEIA-XR_Deliverable D4.3.pdf (1.9 MB)
md5:dc1fe2ade4c5bb2c4533a7e49df92c55

Additional details

Funding

European Commission
THEIA-XR – Making The Invisible Visible for Off-Highway Machinery by Conveying Extended Reality Technologies (101092861)