Published December 18, 2025 | Version v1
Report | Open Access

Model inference over 5G networks for robotics

  • 1. Centro Tecnológico de Investigación, Desarrollo e Innovación en tecnologías de la Información y las Comunicaciones (TIC)

Contributors

  • 1. Instituto Tecnológico de Informática

Description

This work studies compute offloading for mobile-robotics perception by moving CNN inference from the robot to edge and remote servers within a ROS 2 pipeline. The OMZ person-detection-0200 model is executed with OpenVINO, and video is streamed via GStreamer over a private 5G network. Three deployments are compared: on-board, edge, and remote-server inference. Average inference time decreases from 5.177 ms (on-board) to 4.998 ms (edge) and 1.291 ms (remote), while recommended throughput rises from 35.088 FPS to 41.858 FPS and 83.612 FPS, respectively. The results confirm that compute placement strongly affects both latency and achievable frame rate, and that tail behaviour must be considered alongside central tendency when selecting an execution locus.
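The comparison above rests on summarising per-frame inference latencies by their central tendency (mean) and their tail, and converting latency into a sustainable frame rate. The sketch below illustrates that kind of summary with the Python standard library; the function name `summarize_latencies`, the sample values, and the crude nearest-rank p95 are illustrative assumptions, not the paper's actual measurement code or data.

```python
import statistics

def summarize_latencies(samples_ms):
    """Summarise per-frame inference times (ms): mean, a crude p95 tail
    estimate (nearest-rank on the sorted samples), and the frame rate
    the mean latency alone would sustain."""
    mean = statistics.mean(samples_ms)
    ordered = sorted(samples_ms)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return {"mean_ms": mean, "p95_ms": p95, "max_fps": 1000.0 / mean}

# Hypothetical on-board samples (not the report's raw data): a latency
# spike inflates the tail far more than the mean, which is why the
# report argues tail behaviour matters when choosing where to run.
onboard = [5.0, 5.2, 5.1, 5.3, 9.8]
print(summarize_latencies(onboard))
```

In a real deployment the end-to-end frame rate also depends on capture, encoding, and network transport, so `max_fps` here is only the ceiling imposed by inference itself.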

Files

Model inference over 5G networks for robotics.pdf

Files (759.2 kB)

md5:bb8e2ac30a3d864d5f74ac216be5936d

Additional details

Funding

European Commission
AI4EUROPE - AN AI ON-DEMAND PLATFORM TO SUPPORT RESEARCH EXCELLENCE IN EUROPE (Grant 101070000)