Abstract

With the rapid development of artificial intelligence, neural networks are widely used in many fields. Object detection algorithms are mainly based on neural networks, but their accuracy depends strongly on scene complexity and texture. This paper proposes an object detection algorithm based on RGB-D images with two aims: making the detection network lightweight, and fusing depth maps with self-powered sensor information to overcome weak environmental illumination. The paper analyzes the network structures of YOLOv4 and MobileNet, compares the parameter counts of depthwise separable convolution and standard convolution, and combines the strengths of the YOLOv4 and MobileNetv3 networks. The backbone that produces YOLOv4's three effective feature layers is replaced with a MobileNetv3 network for initial feature extraction, strengthening the feature extraction network, and the standard convolutions elsewhere in the network are replaced with depthwise separable convolutions. The proposed method is compared with YOLOv4 and YOLOv4-MobileNetv3. Experimental results show that the proposed network retains the original accuracy while its model size is about 23% of YOLOv4's and its processing speed is about 42% higher, and its detection accuracy still reaches 83% under poor lighting conditions.
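The parameter comparison mentioned above, between a standard convolution and a depthwise separable convolution, can be sketched with a simple count. This is a minimal illustration, not code from the paper; the kernel size and channel counts below are assumed for the example:

```python
def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution (biases ignored)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Weights in a depthwise k x k convolution (k*k per input channel)
    followed by a 1x1 pointwise convolution (c_in * c_out)."""
    return k * k * c_in + c_in * c_out

# Illustrative layer sizes (assumed, not taken from the paper):
k, c_in, c_out = 3, 256, 256
std = standard_conv_params(k, c_in, c_out)        # 589824 weights
sep = depthwise_separable_params(k, c_in, c_out)  # 67840 weights
print(f"standard: {std}, separable: {sep}, ratio: {sep / std:.3f}")
```

For a 3x3 kernel with 256 input and output channels, the separable form needs roughly an order of magnitude fewer weights, which is the mechanism behind the model-size reduction the abstract reports.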
