Abstract

Edge microservice applications are becoming a viable solution for executing real-time IoT analytics, thanks to their rapid response and reduced latency. Unlike the central Cloud, Edge Computing offers a constrained amount of resources, which limits the computation that can be undertaken. Microservices are not standalone: they are devised as a set of cooperating tasks that are fed data over the network through specific APIs. The cost of processing these data feeds in real time, especially in massive IoT configurations, is nevertheless generally overlooked. In this work we evaluate the cost of dealing with thousands of sensors sending data to the edge using the commonly adopted encoding of JSON over REST interfaces, and compare this with other mechanisms that use binary encodings as well as streaming interfaces. The choice has a big impact on the microservice implementation: a less efficient encoding and transport mechanism requires far higher resources to do an identical job, so a wrong selection leads to excessive resource consumption.
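The per-message overhead difference between the two encodings the abstract contrasts can be illustrated with a minimal sketch. The field names, values, and the fixed binary layout below are illustrative assumptions, not taken from the paper's experiments:

```python
import json
import struct

# A hypothetical sensor reading: device id, Unix timestamp, temperature.
reading = {"id": 1042, "ts": 1700000000, "temp": 21.5}

# Text encoding: JSON, as typically carried over a REST interface.
json_bytes = json.dumps(reading).encode("utf-8")

# Binary encoding: a fixed little-endian layout
# (uint32 id, uint32 timestamp, float32 temperature).
bin_bytes = struct.pack("<IIf", reading["id"], reading["ts"], reading["temp"])

print(len(json_bytes), len(bin_bytes))  # 44 12
```

Even before adding HTTP header overhead, the self-describing JSON form is several times larger than the fixed binary record, and the gap widens as readings accumulate into the massive feeds the paper targets.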

Highlights

  • The Internet of Things (IoT) is a well-known paradigm that envisions the interconnection and the exchange of data between many of the physical objects that surround us [13]

  • Although different studies have been conducted in different areas, we argue that a systematic, unified evaluation of the impact of data encodings and transport mechanisms end-to-end, from the IoT devices to the edge processing functions, is missing

  • We provide the results of several experiments performed to evaluate the cost of various encoding and transport mechanisms, utilised by edge applications and APIs, while dealing with the massive data generated by thousands of IoT sensors
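To give a sense of the scale of the problem the highlights describe, a back-of-envelope estimate of the aggregate ingest bandwidth at the edge can be sketched as follows. The sensor count, reporting rate, and per-message sizes are assumptions chosen for illustration, not figures from the paper's experiments:

```python
# Hypothetical deployment: 10,000 sensors, each reporting once per second.
sensors = 10_000
rate_hz = 1.0

# Assumed per-message sizes on the wire:
rest_json_bytes = 500   # JSON body plus HTTP/1.1 request headers
stream_bin_bytes = 16   # length-prefixed binary record on a persistent stream


def mbit_per_s(per_msg_bytes: int) -> float:
    """Aggregate ingest bandwidth at the edge node, in Mbit/s."""
    return sensors * rate_hz * per_msg_bytes * 8 / 1e6


print(mbit_per_s(rest_json_bytes))    # 40.0
print(mbit_per_s(stream_bin_bytes))   # 1.28
```

Under these assumptions the REST/JSON path consumes more than thirty times the bandwidth of the binary stream for an identical job, before counting the CPU cost of parsing text and handling one HTTP request per reading.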


Summary

Introduction

The Internet of Things (IoT) is a well-known paradigm that envisions the interconnection and the exchange of data between many of the physical objects that surround us [13]. The concepts of edge and fog can dramatically reduce the latency by bringing the computing nodes closer to the data sources [41]. Whilst this approach perfectly matches the demands for rapid processing of large amounts of data of some emerging use cases, such as smart health, co-operative intelligent transport systems and Industry 4.0 [38], it introduces a new set of challenges related to the end-to-end orchestration of resources, i.e., from the "things" to the cloud. When microservice architectures are considered, it is fundamental to ensure adequate quality aspects for their APIs in terms

Results
Discussion
Conclusion

