Abstract

The deployment of virtual network functions (VNFs) at edge servers can impair the performance of latency-sensitive applications because of their computational cost. This work considers a new approach to this problem that provides line-rate acceleration of VNFs by employing field-programmable gate array (FPGA)-equipped edge servers. The approach is validated through practical use cases, with both TCP and UDP as the underlying transport protocols, on a physical testbed. We examine the performance implications of executing a security VNF at an FPGA-equipped edge server and experimentally demonstrate reduced VNF execution latency and energy consumption for a real-time video streaming application compared with a software-only baseline. In particular, the results show that the approach lowers VNF execution latency and power consumption at the edge by up to 44% and 76%, respectively, in our experiments, while satisfying time constraints and maintaining confidentiality with high scalability. We also highlight the research challenges that remain to make this approach viable in practice.
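As a rough illustration of the software-only baseline against which the FPGA offload is compared, the sketch below shows a purely software encryption VNF for the UDP case: it receives video packets from the devices, encrypts each payload with AES-GCM, and forwards the ciphertext toward the cloud. The library (cryptography), addresses, ports, and key handling are illustrative assumptions rather than details taken from the paper.

    # Hypothetical software-only encryption VNF for a UDP video stream.
    # Assumes the third-party "cryptography" package is installed; addresses,
    # ports, and key provisioning are placeholders, not the paper's setup.
    import os
    import socket

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    LISTEN_ADDR = ("0.0.0.0", 5004)      # ingress from the IoT video source (assumed port)
    CLOUD_ADDR = ("203.0.113.10", 5004)  # egress toward the remote cloud (example address)

    key = AESGCM.generate_key(bit_length=128)  # in practice, provisioned out of band
    aead = AESGCM(key)

    ingress = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    ingress.bind(LISTEN_ADDR)
    egress = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    while True:
        payload, _src = ingress.recvfrom(65535)
        nonce = os.urandom(12)                         # fresh 96-bit nonce per packet
        ciphertext = aead.encrypt(nonce, payload, None)
        egress.sendto(nonce + ciphertext, CLOUD_ADDR)  # prepend nonce so the receiver can decrypt

An FPGA-resident VNF would apply the same per-packet transformation directly in the data path at line rate, avoiding the per-packet system-call and copy overhead of a host process, which is where the latency and power reductions reported above come from.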

Highlights

  • With the exponential increase in the number of devices connected to the Internet of Things (IoT), the fog and edge [1] computing paradigms are gaining momentum, for example, in 5G technology specifications [2, 3]

  • The experiments replicate a setup of an end-to-end video stream from IoT devices to a remote cloud through a field-programmable gate array (FPGA)-enabled edge environment

  • The purpose of enabling encryption virtual network functions (VNFs) at the edge is to provide the applications/devices with privacy, confidentiality and trust, while data are in transit through public networks


Summary

Introduction

With the exponential increase in the number of devices connected to the Internet of Things (IoT), the fog and edge [1] computing paradigms are gaining momentum, for example, in 5G technology specifications [2, 3]. New functions (services) impose challenging performance requirements for latency and security that cannot be met by current cloud solutions running on remote datacenter sites [4]. New applications, such as the Tactile Internet, real-time autonomic control, augmented reality, Industry 4.0, e-Health, smart cities/grids, or autonomous vehicle platooning, are key representative use cases to be supported by these emerging IoT architectures [5]. Fog and edge computing have been proposed to address challenging latency and security requirements in different application contexts, but both share the common idea of reducing the dependency on the remote cloud. The key advantage of FPGA platforms over application-specific integrated circuits (ASICs) is programming and design flexibility, which makes the FPGA a recognised and popular hardware platform for IoT devices [6]. This flexibility also makes the FPGA an easy-to-interface platform, integrating temperature, pressure, position and acceleration sensors as well as ADC and DAC converters, since these interfaces are required by new, smart “Things”.

Motivating offloading and acceleration at the edge for IoT use cases
Why offload to the edge?
Why use FPGAs for acceleration?
Why network functions on programmable hardware?
Use cases and requirements
Offloading VNFs at the edge
Video streaming testbed
10 Gbps Ethernet
Physical architecture and components
User‐end
Edge layer
Cloud layer
Evaluation scenarios and metrics
Evaluation results
Related work
Conclusion and research challenges