Abstract

With the exponential growth of IoT terminals, smartphones, and wearable devices, traditional centralized cloud computing can no longer efficiently process the data generated by edge devices. Edge computing has been proposed to address this challenge: by placing computation close to terminal devices, it satisfies the high computational demand, low latency, privacy, and bandwidth requirements of deep learning on edge devices while improving efficiency and scalability. We first introduce the background and motivation for running AI at the network edge and review the basic concepts of deep learning. We then present the overall architecture of deep-learning-based edge computing, discuss three inference models that run on terminal devices, on edge servers, and across edge devices, and describe methods for improving and optimizing deep learning models at the edge. Finally, we discuss application scenarios and future opportunities for edge deep learning.
