Abstract

Cutting-edge imaging systems exhibit low output resolution and high power consumption, which presents challenges for RGB-D fusion algorithms. In practical scenarios, aligning the depth-map resolution with that of the RGB image sensor is a crucial requirement. In this Letter, a software-hardware co-design is employed to implement a lidar system based on a monocular RGB 3D imaging algorithm. A 6.4 × 6.4 mm² deep-learning accelerator (DLA) system-on-chip (SoC) manufactured in a 40-nm CMOS process is combined with a 3.6 mm² TX-RX integrated chip fabricated in a 180-nm CMOS process to deploy the customized single-pixel imaging neural network. Compared with the RGB-only monocular depth estimation technique, the root mean square error on the evaluated dataset is reduced from 0.48 m to 0.30 m, and the output depth-map resolution matches that of the RGB input.
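The reported accuracy gain is stated as a root mean square error (RMSE) over depth maps. As a minimal sketch (not the authors' evaluation code), RMSE between a predicted and a ground-truth depth map, restricted to valid depth pixels, can be computed as follows; the masking convention (ignoring zero-valued ground truth) is an assumption:

```python
import numpy as np

def depth_rmse(pred, gt, mask=None):
    """RMSE (in metres) between predicted and ground-truth depth maps.

    pred, gt: array-like depth maps of the same shape.
    mask: optional boolean array of valid pixels; by convention here,
    pixels where the ground truth is zero are treated as invalid.
    """
    pred = np.asarray(pred, dtype=np.float64)
    gt = np.asarray(gt, dtype=np.float64)
    if mask is None:
        mask = gt > 0  # assumed convention: zero depth = no measurement
    diff = pred[mask] - gt[mask]
    return float(np.sqrt(np.mean(diff ** 2)))
```

For example, `depth_rmse([[1.0, 2.0], [3.0, 4.0]], [[1.0, 2.0], [3.0, 5.0]])` returns 0.5, since a single 1 m error over four valid pixels gives sqrt(1/4).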

