Abstract

Background: Adding buffers to networks has been one of the fundamental advances in data communication. Since edge cloud computing is based on a heterogeneous collaboration network model in a federated environment, it is natural to consider buffer-aided data communication for edge cloud applications. However, existing studies generally pursue the beneficial features of buffering at a cost of time, and many investigations focus on lower-layer data packets rather than application-level communication transactions.

Aims: Driven by our argument against the claim that buffers “can introduce additional delay to the communication between the source and destination”, this research aims to investigate whether (and if so, to what extent) an application-level buffering mechanism can improve the time efficiency of edge-cloud data transmissions.

Method: To collect empirical evidence for the theoretical discussion, we built a testbed to simulate a remote health monitoring system, and conducted both experimental and modeling investigations into first-in-first-served (FIFS) and buffer-aided data transmissions at a relay node in the system.

Results: An empirical inequality system is established to reveal the time efficiency of buffer-aided edge cloud communication. For example, given the reference of transmitting the 11th data entity in the FIFS manner, the inequality system suggests buffering up to 50 data entities into one transmission transaction on our testbed.

Conclusions: Despite the trade-off benefits (e.g., energy efficiency and fault tolerance) of buffering data, our investigation argues that the buffering mechanism can also speed up data transmission under certain circumstances, and thus it would be worth taking data buffering into account when designing and developing edge cloud applications, even in time-critical contexts.
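
To make the compared mechanisms concrete, the following minimal Python sketch contrasts FIFS forwarding with buffer-aided relaying at a relay node. All class names, methods, and timing parameters here are hypothetical illustrations chosen for this sketch; they are not the paper's actual testbed code or measured values.

```python
# Hypothetical illustration of the two relaying strategies under comparison:
# FIFS (first-in-first-served) forwards each data entity as soon as it arrives,
# while buffer-aided relaying accumulates entities and sends them in one transaction.
# The timing parameters below are assumptions for illustration only.

from dataclasses import dataclass, field
from typing import List


@dataclass
class RelayNode:
    per_transaction_overhead: float = 0.05   # assumed fixed cost (s) of one transmission transaction
    per_entity_transfer_time: float = 0.01   # assumed payload time (s) per data entity
    buffer: List[bytes] = field(default_factory=list)

    def forward_fifs(self, entity: bytes) -> float:
        """Forward a single entity immediately; every entity pays the full transaction overhead."""
        return self.per_transaction_overhead + self.per_entity_transfer_time

    def buffer_entity(self, entity: bytes) -> None:
        """Hold the entity at the relay instead of forwarding it right away."""
        self.buffer.append(entity)

    def flush_buffer(self) -> float:
        """Send all buffered entities in one transaction; the overhead is paid only once."""
        cost = self.per_transaction_overhead + self.per_entity_transfer_time * len(self.buffer)
        self.buffer.clear()
        return cost


if __name__ == "__main__":
    relay = RelayNode()
    entities = [b"reading"] * 50

    fifs_total = sum(relay.forward_fifs(e) for e in entities)

    for e in entities:
        relay.buffer_entity(e)
    buffered_total = relay.flush_buffer()

    print(f"FIFS total time:     {fifs_total:.2f} s")
    print(f"Buffered total time: {buffered_total:.2f} s")
```

Under these assumed parameters, batching the entities amortizes the per-transaction overhead, which is the intuition the paper's empirical inequality system quantifies for its own testbed.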

Highlights

  • It has been identified that “interprocess communication is at the heart of all distributed systems” [1], and edge cloud applications are typical distributed software systems involving interprocess communication between edge devices and the cloud

  • An empirical inequality system is established to reveal the time efficiency of buffer-aided edge cloud communication

  • Despite the trade-off benefits of buffering data, our investigation argues that the buffering mechanism can speed up data transmission under certain circumstances, and it would be worth taking data buffering into account when designing and developing edge cloud applications, even in time-critical contexts
