Published September 22, 2025 | Version v1
Conference paper | Open Access

TinyKubeML: Orchestrating TinyML Models on Far-Edge Clusters

  • 1. Fraunhofer AICOS
  • 2. Fraunhofer Portugal Research
  • 3. University of Aveiro
  • 4. Universidade do Porto, Porto Business School
  • 5. Fraunhofer Portugal
  • 6. Universidade do Porto, Faculdade de Engenharia

Description

The Internet of Things (IoT) is rapidly materializing, but the growing volume of data generated by Far-Edge devices, often microcontroller-based, poses challenges for cloud-centric processing. TinyML addresses this challenge by enabling on-device ML inference, thereby reducing communication latency and cost. However, current solutions largely overlook deployment and management challenges, especially in heterogeneous, resource-constrained environments. This paper introduces TinyKubeML, a Kubernetes-based framework that enables resource-aware deployment of TinyML models on Far-Edge clusters. It abstracts device heterogeneity and automates model partitioning, artifact generation, and deployment using a custom Kubernetes Operator. TinyKubeML supports distributed inference and includes recovery mechanisms to ensure service continuity. Our evaluation shows that TinyKubeML can deploy distributed models efficiently with minimal impact on accuracy, while supporting automatic recovery in the case of device failures, demonstrating its potential to bridge the gap between scalable orchestration and TinyML deployment in IoT scenarios.

Files (1.4 MB)

TinyKubeML-final-CR-v2.pdf (1.4 MB, md5:7d0a328c4f83dcc35b3cfc0f613649c9)

Additional details

Funding

European Commission
MLSysOps - Machine Learning for Autonomic System Operation in the Heterogeneous Edge-Cloud Continuum (Grant 101092912)