Abstract

With the rapid development of the courier industry in recent years, many courier companies have introduced intelligent unmanned warehouse logistics systems built around unmanned transport vehicles or unmanned inspection to address the accumulation of goods in courier warehouses and the low efficiency of manual sorting and transportation. This study proposes a recognition method that combines Res2Net-YOLACT with HSV color-space segmentation to accurately recognize the route trajectory of unmanned vehicles in the logistics warehouse environment. After detection by a YOLACT network with an improved backbone, the track area is extracted in HSV color space, and this region is then used as the object of a second processing stage in which its centerline is extracted to form the actual track. Evaluated on a test set containing tracks of different directions and shapes, Res2Net-YOLACT achieves an accuracy of 97.37% with a single-image detection time of 30.26 ms in the same experimental environment; with the centerline extraction algorithm integrated into the Res2Net-YOLACT network, the single-image detection time is 37.26 ms. Compared with the original YOLACT network, whose accuracy is 93.28%, single-image detection speed is improved by 3.6%. In addition, the centerline extraction algorithm designed in this study requires less computation and less code engineering than prevailing algorithms, increasing the memory footprint by only 3.2%. To verify its performance, the proposed centerline extraction algorithm was compared with commonly used algorithms, showing a 1.31% improvement in accuracy and an 8.73% improvement in video detection speed; this indicates that the proposed centerline extraction algorithm achieves higher accuracy and better real-time performance without consuming significantly more memory. Finally, to test the practicality of the algorithm, its detection time on the same video was measured on the embedded Jetson Nano device; the average frame rate was 28 FPS and the maximum frame rate was 33 FPS, which is sufficient for real transportation applications.
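
The abstract describes the post-processing stage only at a high level. As a rough illustration of that stage, the Python/OpenCV sketch below thresholds a track region in HSV space and takes the row-wise midpoints of the resulting mask as a simple centerline estimate. The HSV bounds, the morphological cleanup, and the midpoint heuristic are assumptions made for illustration, not necessarily the exact procedure used in the paper; in the paper's pipeline the input to such a step would be the track region already segmented by Res2Net-YOLACT rather than the full camera frame.

```python
import cv2
import numpy as np


def extract_track_centerline(bgr_image,
                             lower_hsv=(20, 80, 80),
                             upper_hsv=(35, 255, 255)):
    """Illustrative sketch (assumed, not the paper's exact method):
    threshold the track colour in HSV space, clean the mask, and take the
    row-wise midpoint of the masked pixels as a simple centerline.
    The HSV bounds are placeholders that would need tuning to the actual
    track colour used in the warehouse."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))

    # Morphological opening to suppress speckle noise in the mask.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # One centerline point per image row that contains track pixels.
    centerline = []
    for y in range(mask.shape[0]):
        xs = np.flatnonzero(mask[y])
        if xs.size:
            centerline.append(((int(xs[0]) + int(xs[-1])) // 2, y))
    return mask, centerline
```

A per-row midpoint is one of the cheapest centerline estimators available, which is consistent with the abstract's emphasis on low computation and a small memory footprint, but the paper's actual centerline algorithm may differ.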
