Abstract

The paper presents research related to coastal observation using a camera and LiDAR (Light Detection and Ranging) mounted on an unmanned surface vehicle (USV). Fusion of data from these two sensors can provide wider and more accurate information about shore features, exploiting the synergy between them and combining the advantages of both systems. Fusion is already used in autonomous cars and robots, despite many challenges related to spatiotemporal alignment and sensor calibration. Measurements from various sensors with different timestamps have to be aligned, and the measurement systems need to be calibrated to avoid errors related to offsets. When using data from unstable, moving platforms, such as surface vehicles, it is more difficult to match sensors in time and space, and thus data acquired from different devices will be subject to some misalignment. In this article, we try to overcome these problems by proposing the use of a point matching algorithm for coordinate transformation of data from both systems. The essence of the paper is to verify algorithms for the alignment process based on selected basic neural networks, namely the multilayer perceptron (MLP), the radial basis function network (RBF), and the general regression neural network (GRNN). They are tested with real recorded data from the USV and verified against numerical methods commonly used for coordinate transformation. The results show that the proposed approach can be an effective alternative to numerical calculations, as it improves the processing workflow. The image data can provide information for identifying characteristic objects, and the obtained accuracies for platform dynamics in the water environment are satisfactory (root mean square error, RMSE, below 1 m in many cases). The networks provided outstanding results for the training set; however, they did not perform as well as expected in terms of the model's generalization capability. This leads to the conclusion that processing algorithms cannot overcome the limited accuracy of the matched points. Further research will extend the approach to include information on the position and direction of the vessel.
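To illustrate the core idea of learning a coordinate transformation from matched point pairs and judging it by RMSE, the sketch below fits a small neural regressor to synthetic pairs and reports the error on held-out points. This is a minimal sketch, not the authors' pipeline: it assumes scikit-learn is available, uses invented synthetic data in place of real camera/LiDAR matches, and uses a single MLP as a stand-in for the MLP/RBF/GRNN variants compared in the paper.

```python
# Minimal sketch (hypothetical data, not the paper's implementation):
# learn a 2-D coordinate transformation from matched point pairs with an MLP,
# then report the hold-out RMSE.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic "source" points and a target frame related by a similarity
# transform plus noise, standing in for real matched camera/LiDAR observations.
src = rng.uniform(0.0, 100.0, size=(500, 2))
theta = np.deg2rad(12.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
dst = 1.05 * src @ R.T + np.array([15.0, -7.5]) + rng.normal(0.0, 0.3, size=src.shape)

X_train, X_test, y_train, y_test = train_test_split(src, dst, test_size=0.3, random_state=0)

# One hidden layer; the paper compares MLP, RBF, and GRNN architectures instead.
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)
mlp.fit(X_train, y_train)

pred = mlp.predict(X_test)
rmse = np.sqrt(mean_squared_error(y_test, pred))
print(f"Hold-out RMSE: {rmse:.3f} (same units as the coordinates)")
```

In this setup, a large gap between training and hold-out RMSE would mirror the generalization issue reported in the abstract: the network can fit the matched points it has seen without transferring that accuracy to new ones.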
