Abstract

In the last decade, deep neural network (DNN)-based object detection technologies have received significant attention as a promising solution for implementing a variety of image understanding and video analysis applications on mobile edge devices. However, owing to their limited computation capacity, mobile edge devices cannot execute computationally intensive DNN-based object detection workloads while fulfilling high-accuracy and low-latency requirements. In this paper, we implement and evaluate a DNN-based object detection offloading framework that improves the object detection performance of mobile edge devices by offloading computation-intensive workloads to a remote edge server. However, preliminary experimental results show that offloading all object detection workloads of mobile edge devices may lead to worse performance than executing the workloads locally. This degradation stems from inefficient resource utilization in the edge computing architecture, at both the edge server and the mobile edge devices. To resolve this problem, we devise a device-aware DNN offloading decision algorithm that aims to maximize resource utilization in the edge computing architecture. The proposed algorithm decides whether or not to offload the object detection workload of each edge device by considering its computing power and network bandwidth, thereby maximizing the average object detection throughput in frames per second. Through various experiments conducted in a real-life wireless local area network (WLAN) environment, we verified the effectiveness of the proposed DNN-based object detection offloading framework.
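A minimal sketch of such a device-aware binary offloading decision is shown below. This is not the authors' implementation: the FPS models, the even split of server throughput among offloading devices, the frame size, and all names (`Device`, `estimated_offload_fps`, `decide_offloading`) are illustrative assumptions. The rule it encodes is the one the abstract describes: a device offloads only if its estimated offloaded FPS exceeds its local FPS.

```python
# Hypothetical sketch of a device-aware binary offloading decision.
# Not the paper's algorithm; the throughput models are assumptions.

from dataclasses import dataclass

@dataclass
class Device:
    local_fps: float        # measured on-device detection throughput
    bandwidth_mbps: float   # measured uplink bandwidth to the edge server

def estimated_offload_fps(dev: Device, server_fps: float, n_offloading: int,
                          frame_size_mbits: float = 1.5) -> float:
    """Offloaded FPS is capped by the slower of (a) the uplink, which must
    carry one frame per inference, and (b) the device's share of the edge
    server's detection throughput (assumed to be split evenly)."""
    network_fps = dev.bandwidth_mbps / frame_size_mbits
    server_share = server_fps / max(n_offloading, 1)
    return min(network_fps, server_share)

def decide_offloading(devices: list[Device], server_fps: float) -> list[bool]:
    """Greedy heuristic: consider devices in increasing order of local FPS
    (weakest first) and offload each one only while offloading still beats
    local execution, given the server share it would receive."""
    order = sorted(range(len(devices)), key=lambda i: devices[i].local_fps)
    offload = [False] * len(devices)
    for k, i in enumerate(order, start=1):   # k = offloaders if device i joins
        dev = devices[i]
        if estimated_offload_fps(dev, server_fps, k) > dev.local_fps:
            offload[i] = True
        else:
            break  # remaining devices have even higher local FPS
    return offload
```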

Highlights

  • With the explosive growth of deep learning technologies in the last decade, object detection with deep neural networks (DNNs) has greatly improved detection accuracy and response time [1]

  • We focus on a binary object detection offloading method, in which the object detection workloads are either fully offloaded to the edge server or executed locally on the mobile edge devices

  • In this paper, we introduce a DNN-based object detection offloading framework in an edge computing infrastructure consisting of multiple mobile edge devices and a remote edge server


Summary

INTRODUCTION

With the explosive growth of deep learning technologies in the last decade, object detection with deep neural networks (DNNs) has greatly improved detection accuracy and response time [1]. The traditional approach for providing powerful computing capabilities to mobile edge devices is to utilize cloud computing services through wireless networks. However, this requires a large volume of data transmission over a long wide-area network path, resulting in long and volatile end-to-end latency [6]. In the worst-case scenario, when the available data rate is relatively small, executing object detection workloads locally may yield better detection performance than offloading them, indicating that offloading the object detection workloads of all mobile edge devices does not guarantee the best performance in terms of average frames per second (FPS). These problems are incurred by inefficient resource utilization in the edge computing architecture, at both the edge server and the mobile edge devices.
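To make the worst-case scenario concrete, here is a hypothetical run of the decision sketch given after the abstract; all numbers are invented for illustration. The device with the small uplink data rate stays local because the network caps its offloaded FPS below what it already achieves on-device, while the compute-weak device benefits from offloading.

```python
# Hypothetical numbers: a 1.5 Mbit frame over a 6 Mbps uplink allows at most
# 4 frames/s, so a device already running 5 FPS locally should not offload.
weak = Device(local_fps=1.0, bandwidth_mbps=30.0)
strong = Device(local_fps=5.0, bandwidth_mbps=6.0)
print(decide_offloading([weak, strong], server_fps=40.0))  # -> [True, False]
```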
