Published March 1, 2017 | Version v1
Conference paper (Open Access)

Large-scale offloading in the Internet of Things

Description

Large-scale deployments of IoT devices are subject to energy and performance issues. Fortunately, offloading is a promising technique to mitigate these issues. However, several problems remain open regarding cloud deployment and provisioning. In this paper, we address the problem of provisioning offloading as a service in large-scale IoT deployments. We design and develop an AutoScaler, an essential component of our offloading architecture for handling offloading workload. In addition, we develop an offloading simulator that generates dynamic offloading workload from multiple devices. With this toolkit, we study the effect of task acceleration on different cloud servers and analyze the capacity of several cloud servers to handle multiple concurrent requests. We conduct multiple experiments on a real testbed to evaluate the system and present our experiences and lessons learned. The results show that the AutoScaler component adds only a small overhead of ≈150 milliseconds to the total response time of a request, a fair price to pay for equipping the offloading architecture with multi-tenancy and dynamic horizontal scaling in IoT scenarios.
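The paper's AutoScaler implementation is not reproduced on this page, but the dynamic horizontal scaling it describes can be sketched as a load-threshold control loop. The class names, capacities, and threshold values below are illustrative assumptions, not the authors' actual design:

```python
# Hypothetical sketch of a threshold-based horizontal autoscaler for an
# offloading server pool. All names and numbers are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Server:
    capacity: int      # max concurrent offloading requests it can serve
    active: int = 0    # requests currently being served

    @property
    def load(self) -> float:
        return self.active / self.capacity


@dataclass
class AutoScaler:
    """Grow or shrink the server pool based on average load."""
    pool: list = field(default_factory=lambda: [Server(capacity=10)])
    scale_up_at: float = 0.8    # average load that triggers a new server
    scale_down_at: float = 0.2  # average load that allows removing one

    def dispatch(self, requests: int) -> None:
        # Spread incoming offloading requests round-robin over the pool.
        for i in range(requests):
            self.pool[i % len(self.pool)].active += 1

    def rebalance(self) -> None:
        avg = sum(s.load for s in self.pool) / len(self.pool)
        if avg > self.scale_up_at:
            self.pool.append(Server(capacity=10))      # scale out
        elif avg < self.scale_down_at and len(self.pool) > 1:
            self.pool.pop()                            # scale in
```

For example, dispatching 9 concurrent requests to a single 10-slot server pushes the average load to 0.9, so the next `rebalance()` call would add a second server. A real deployment would also drain requests from a server before removing it; that is omitted here for brevity.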

Files

article.pdf (692.9 kB)