Abstract

As an essential part of the Internet of Things, monocular depth estimation (MDE) predicts dense depth maps from a single red-green-blue (RGB) image captured by a monocular camera. Past MDE methods have mostly focused on improving accuracy at the cost of increased latency, power consumption, and computational complexity, failing to balance accuracy and efficiency. Additionally, when accelerating depth estimation algorithms, researchers commonly ignore their adaptation to the different hardware architectures found on edge devices. This article aims to address these challenges. First, we design an efficient MDE model for precise depth sensing on edge devices. Second, we employ a reinforcement learning algorithm to automatically prune redundant channels of the MDE model by searching for a near-optimal pruning policy. By meeting a target pruning ratio, the pruning approach lowers model runtime and power consumption with little loss of accuracy. Finally, we accelerate the pruned MDE model and adapt it to different hardware architectures with a compilation optimization method, which further reduces model runtime by an order of magnitude. Extensive experiments on two public datasets confirm that our methods are effective for images of different sizes. The pruned and optimized MDE model achieves promising depth sensing with a better tradeoff among runtime, accuracy, computational complexity, and power consumption than the state of the art across hardware architectures.
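The abstract describes pruning channels until a target pruning ratio is met. The paper's policy is found by a reinforcement learning agent, which is not reproduced here; as a minimal illustration of the simpler underlying step, the sketch below ranks channels by an importance score (L1 norm is a common proxy, assumed here) and marks the weakest fraction for removal. The function name and inputs are hypothetical, not from the paper.

```python
def select_channels_to_prune(channel_norms, target_ratio):
    """Mark the least-important channels for removal until the
    requested fraction of channels is pruned.

    channel_norms: per-channel importance scores (e.g., L1 norms
                   of each output channel's weights) -- an assumed
                   proxy, not the paper's learned policy.
    target_ratio:  fraction of channels to prune, in [0, 1].
    Returns the sorted indices of channels to remove.
    """
    n = len(channel_norms)
    n_prune = int(n * target_ratio)
    # Channel indices ordered by ascending importance.
    order = sorted(range(n), key=lambda i: channel_norms[i])
    return sorted(order[:n_prune])

# Example: 8 channels, 50% target ratio -> the 4 weakest go.
norms = [0.9, 0.1, 0.5, 0.05, 0.7, 0.2, 0.8, 0.3]
pruned = select_channels_to_prune(norms, 0.5)  # -> [1, 3, 5, 7]
```

In a real pipeline the surviving channels of one layer must stay consistent with the input channels of the next layer, and the pruned network is fine-tuned afterward to recover accuracy; the RL agent in the paper chooses per-layer ratios rather than a single global one.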
