Abstract
Compute-intensive tasks can be distributed to edge devices in edge computing, which offers benefits such as distributed processing, low latency, and improved security and privacy. In traditional video surveillance, video is cached in a data center for post-hoc analysis and processing, so the behavior of specific objects cannot be analyzed intelligently in real time. Taking this into consideration, we propose an efficient object detection and tracking framework based on edge computing, which distributes the detection and tracking tasks to edge cameras within the network instead of sending all the data to a centralized server. Its advantages can be summarized as follows. (1) Instead of analyzing and processing video in the cloud, we analyze it locally, which greatly reduces the bandwidth consumption and delay caused by transmitting large amounts of data and keeps the data secure. (2) We propose a neural network model compression framework for object detection with lower storage and computing cost, which reduces the number of parameters by 39.3% and the number of floating-point operations by 17.28%. The detection results are fed to a simple tracking method, yielding a real-time object detection and tracking framework that determines the class and trajectory of moving objects more quickly. (3) We propose a communication module that enables data interaction among cameras, making it easier to determine the moving range of an object across camera views. In the experiments, network cameras and edge devices were used to build the framework, and an offline inference method was used to detect and track objects and estimate their motion trajectories, verifying the feasibility of the framework.
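To make the detect-then-track idea summarized above concrete, the sketch below (not the paper's implementation) links per-frame detections into trajectories with a greedy IoU matcher, one plausible instance of a "simple tracking method" running on an edge device. The detector call run_compressed_detector(frame) is a hypothetical placeholder for the compressed detection model.

    # Minimal sketch, assuming a hypothetical edge detector run_compressed_detector(frame)
    # that returns a list of boxes (x1, y1, x2, y2) per frame. Detections are linked
    # across frames by greedy IoU association to recover per-object trajectories.

    def iou(a, b):
        """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
        x1, y1 = max(a[0], b[0]), max(a[1], b[1])
        x2, y2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter + 1e-9)

    class SimpleTracker:
        """Greedy IoU association: each detection extends the best-overlapping track."""
        def __init__(self, iou_threshold=0.3):
            self.iou_threshold = iou_threshold
            self.last_box = {}        # track_id -> most recent box
            self.trajectories = {}    # track_id -> list of boxes (motion trajectory)
            self.next_id = 0

        def update(self, detections):
            assigned = set()
            for box in detections:
                best_id, best_overlap = None, self.iou_threshold
                for tid, prev in self.last_box.items():
                    if tid in assigned:
                        continue
                    overlap = iou(box, prev)
                    if overlap > best_overlap:
                        best_id, best_overlap = tid, overlap
                if best_id is None:            # no match: start a new track
                    best_id = self.next_id
                    self.next_id += 1
                    self.trajectories[best_id] = []
                self.last_box[best_id] = box
                self.trajectories[best_id].append(box)
                assigned.add(best_id)
            return self.trajectories

    # Usage (hypothetical): for each frame from the edge camera,
    #   boxes = run_compressed_detector(frame)
    #   trajectories = tracker.update(boxes)
    # The resulting trajectories approximate the moving range of each object;
    # cross-camera association would additionally use the communication module.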