Abstract
Resource-constrained devices such as drones, lightweight mobile robots, XR glasses, and mobile phones are increasingly required to perform object detection efficiently and in real time. However, when executed on the device, object detection often fails to meet accuracy requirements. In addition, relying on a powerful edge server to offload computation is time-consuming, since it requires transmitting large amounts of data, and limits usability under degraded network conditions. In this paper, we propose OffloaD, a novel offloading scheduler for object detection based on the detection of failures. The method addresses the problem of offloading between a resource-constrained device and an edge server by leveraging both the accuracy of the on-device object detection model and the network conditions. Accuracy is captured through a detection-failure metric composed of introspective, golden, and latency scores. Our main contribution is an offloading scheduler driven by the detected failures of the object detector, which reduces unnecessary data transmission without degrading accuracy. Our method shows that offloading to expensive detectors does not always improve performance, and it provides detection results and failure estimates even when the network is degraded or unavailable. We implement OffloaD on a physical testbed and perform extensive comparisons against several baselines. The experiments show that our approach improves the overall precision-latency trade-off, reducing end-to-end latency by 18% to 37% under ideal network conditions and incurring only a small latency increase under degraded network conditions, while keeping accuracy degradation minimal compared to the baselines. In addition, comparisons with related methods demonstrate the state-of-the-art performance of OffloaD.
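To make the scheduling idea concrete, the following is a minimal, hypothetical sketch of how a scheduler of the kind described above might combine the three failure scores with the network condition to decide whether to offload a frame. The names, weights, and thresholds are illustrative assumptions for exposition only, not OffloaD's actual implementation.

```python
# Minimal sketch (assumed, not the paper's implementation) of a failure-driven
# offloading decision: aggregate on-device failure signals and check the
# network before offloading. All weights and thresholds are illustrative.
from dataclasses import dataclass


@dataclass
class FailureScores:
    introspective: float  # model's own uncertainty-based failure signal, in [0, 1]
    golden: float         # disagreement with a lightweight "golden" reference check, in [0, 1]
    latency: float        # normalized on-device inference latency, in [0, 1]


def failure_metric(s: FailureScores, w=(0.4, 0.4, 0.2)) -> float:
    """Aggregate the three scores into a single failure estimate (weights assumed)."""
    return w[0] * s.introspective + w[1] * s.golden + w[2] * s.latency


def should_offload(scores: FailureScores, est_rtt_ms: float,
                   failure_threshold: float = 0.5,
                   rtt_budget_ms: float = 150.0) -> bool:
    """Offload only when the on-device detector is likely failing AND the network
    round trip fits the latency budget; otherwise keep the local result, which
    remains available even if the network is degraded or unavailable."""
    likely_failing = failure_metric(scores) > failure_threshold
    network_usable = est_rtt_ms < rtt_budget_ms
    return likely_failing and network_usable


# Example: a frame where the on-device detector looks unreliable on a fast link.
if __name__ == "__main__":
    frame_scores = FailureScores(introspective=0.7, golden=0.6, latency=0.3)
    print(should_offload(frame_scores, est_rtt_ms=40.0))  # True -> offload this frame
```

The design choice the sketch illustrates is that both conditions must hold: a frame is offloaded only when local detection is estimated to be failing and the network can deliver the edge result within the latency budget, which is how unnecessary data transmission is avoided without sacrificing accuracy.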