Abstract
In practical application scenarios of USVs, a vessel must be identified in order to accomplish mission tasks. Among the sensors equipped on a USV, visible images provide the fastest and most efficient way of determining the hull number. Current studies divide the task of recognizing a vessel plate number into two independent subtasks: localizing the text in the image and recognizing it, and researchers then focus on improving the accuracy of localization and recognition separately. However, these methods cannot be directly applied to USVs because of the differences between the two application scenarios. In addition, because the two independent models run in series, errors inevitably propagate between them and the time cost increases, resulting in less satisfactory performance. In view of the above, we propose a method based on an object detection model for recognizing vessel plate numbers in complicated sea environments, applied to USVs. The accuracy and stability of the model are improved by a recursive gated convolution structure, a decoupled head, a reconstructed loss function, and redesigned anchor box sizes. To facilitate this research, a vessel plate number dataset is established in this paper. Furthermore, we conducted an experiment using a USV platform in the South China Sea. Compared with the original YOLOv5, the mAP (mean Average Precision) of the proposed method is increased by 6.23%. The method is deployed on the "Tian Xing" USV platform, and the experimental results indicate that both the ship and the vessel plate number can be recognized in real time. This is of great significance in both the civilian and military sectors.
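To illustrate the kind of recursive gated convolution block referenced above, the following is a minimal, simplified PyTorch sketch of a gnConv-style module with recursive element-wise gating; it is not the authors' implementation, and the class name, channel widths, recursion order, and kernel size are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class RecursiveGatedConv(nn.Module):
    """Simplified sketch of a recursive gated convolution (gnConv-style) block.

    At each recursion order, features pass through a depthwise convolution and
    are gated by element-wise multiplication, building higher-order spatial
    interactions at low computational cost. Hypothetical module, not the
    paper's exact structure.
    """

    def __init__(self, channels: int, order: int = 3):
        super().__init__()
        self.order = order
        # Project the input once into `order + 1` equal-width feature groups.
        self.proj_in = nn.Conv2d(channels, channels * (order + 1), kernel_size=1)
        # Depthwise convolution reused at every recursion step (shared for brevity).
        self.dwconv = nn.Conv2d(channels, channels, kernel_size=7, padding=3,
                                groups=channels)
        self.proj_out = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Split projected features: one seed branch plus one gate per order.
        parts = torch.chunk(self.proj_in(x), self.order + 1, dim=1)
        y = parts[0]
        for k in range(1, self.order + 1):
            # Gate the depthwise-filtered running feature with the k-th branch.
            y = self.dwconv(y) * parts[k]
        return self.proj_out(y)


if __name__ == "__main__":
    block = RecursiveGatedConv(channels=64, order=3)
    out = block(torch.randn(1, 64, 80, 80))
    print(out.shape)  # torch.Size([1, 64, 80, 80])
```

In a YOLOv5-style detector, a block of this kind would typically replace or augment standard convolution blocks in the backbone or neck so that the detection head receives features with richer spatial interactions.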