Abstract

This study addresses the degradation of TDOA/FDOA measurement accuracy in complex underwater environments, where multipath propagation and variations in underwater sound velocity corrupt the measurements. To this end, a cooperative localisation algorithm is proposed that combines an attention mechanism and a convolutional neural network-bidirectional gated recurrent unit (CNN-BiGRU) with TDOA/FDOA and an improved two-step weighted least squares solver (ImTSWLS), with the goal of improving TDOA/FDOA measurement accuracy in such environments. The CNN first extracts spatial and frequency-domain features from multimodal data; these features are essential for characterising underwater signal propagation in complex environments. The BiGRU then captures long-term dependencies in the time series, allowing the changing pattern of the signals over time to be analysed more fully. An attention mechanism directs the model toward the signal features that most strongly affect localisation while suppressing irrelevant information, making the identification and use of key features more efficient. Once the TDOA/FDOA values have been predicted, ImTSWLS solves for the position and velocity of the moving radiation source. The algorithm was tested in a range of simulated underwater environments covering different sea states, target speeds and base station configurations. At a noise level of 20 dB, the velocity estimate deviates by only 2.88 m/s and the position estimate by 2.58 m. The proposed algorithm outperforms the compared algorithms in both position and velocity estimation.
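
The abstract does not specify the network's layer sizes, input dimensions or training setup, so the following is only a minimal sketch of the attention-augmented CNN-BiGRU regressor described above, written in PyTorch with assumed (hypothetical) channel counts, sequence length and hidden size. It illustrates the pipeline of CNN feature extraction, bidirectional GRU temporal modelling, attention pooling and a regression head that outputs a TDOA/FDOA pair; the predicted pairs would then be passed to the ImTSWLS solver for position and velocity estimation.

import torch
import torch.nn as nn

class CNNBiGRUAttention(nn.Module):
    """Sketch: CNN -> BiGRU -> attention pooling -> TDOA/FDOA regression head."""
    def __init__(self, in_channels=4, hidden=64, n_outputs=2):
        super().__init__()
        # 1-D CNN: extracts local spatial/frequency-domain features per time step
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # BiGRU: captures long-term temporal dependencies in both directions
        self.bigru = nn.GRU(64, hidden, batch_first=True, bidirectional=True)
        # Additive attention: scores each time step by its relevance to localisation
        self.attn = nn.Linear(2 * hidden, 1)
        # Regression head: outputs the predicted [TDOA, FDOA] pair
        self.head = nn.Linear(2 * hidden, n_outputs)

    def forward(self, x):                       # x: (batch, channels, seq_len)
        h = self.cnn(x)                         # (batch, 64, seq_len)
        h = h.transpose(1, 2)                   # (batch, seq_len, 64)
        h, _ = self.bigru(h)                    # (batch, seq_len, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # (batch, seq_len, 1) attention weights
        context = (w * h).sum(dim=1)            # attention-weighted sum over time
        return self.head(context)               # (batch, 2): [TDOA, FDOA]

# Usage with dummy data (shapes are assumptions, not taken from the paper):
model = CNNBiGRUAttention()
dummy = torch.randn(8, 4, 128)                  # 8 samples, 4 channels, 128 time steps
print(model(dummy).shape)                       # torch.Size([8, 2])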
