Abstract

This paper proposes a Bayesian filtering approach for adaptively extracting the crossed time-frequency (TF) ridges of ultrasonic guided waves (GWs) and retrieving their overlapped modes. Based on a generalized non-parametric GW signal model, the phase evolution of each overlapped mode is described by a state transition equation derived from a polynomial prediction model. Treating the analyzed GW in the frequency domain as the measurement equation of the states yields a frequency-domain state space model that describes the GW and its modes. Consequently, Bayesian filtering can be used to extract the crossed TF ridges and separate the overlapped modes when the number of modes in the signal is known a priori. When this prior knowledge is unavailable, the mode number is detected adaptively by a non-parametric iterative adaptive estimation scheme, so the proposed method also applies to GWs whose mode composition is unknown. Simulation results show that the proposed method correctly extracts the crossed TF ridges and separates the overlapped modes when the signal-to-noise ratio (SNR) is above -5 dB. In a steel plate experiment, the correlation coefficients between the original and retrieved S0, A0, and A1 modes are 0.900, 0.772, and 0.915, respectively, exceeding results reported in the literature.
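The state-space formulation summarized above lends itself to standard recursive Bayesian estimators. The following is a minimal sketch, assuming a linear-Gaussian simplification: a Kalman filter with a first-order polynomial (frequency and frequency-rate) state tracking the instantaneous frequency of a single ridge from noisy frame-by-frame measurements. The paper's actual model is formulated in the frequency domain over the overlapped modes themselves and handles multiple crossing ridges; the function and parameter names below are illustrative only.

```python
import numpy as np

def kalman_ridge_track(freq_meas, dt, q=1e-3, r=1e-1):
    """Track a single TF ridge with a constant-slope (first-order
    polynomial) Kalman filter.

    freq_meas : noisy instantaneous-frequency measurements (Hz), one per frame
    dt        : frame spacing (s)
    q, r      : process / measurement noise variances (tuning parameters)
    """
    # State x = [frequency, frequency rate]; linear state transition.
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])            # only the frequency is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])   # discretized white-noise acceleration
    R = np.array([[r]])

    x = np.array([freq_meas[0], 0.0])     # initial state
    P = np.eye(2)                         # initial covariance
    track = np.empty(len(freq_meas))

    for k, z in enumerate(freq_meas):
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        y = z - H @ x                     # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        track[k] = x[0]
    return track


if __name__ == "__main__":
    # Synthetic linear chirp ridge with measurement noise, to exercise the filter.
    t = np.arange(0, 1, 1e-3)
    true_f = 100 + 80 * t
    noisy = true_f + 5 * np.random.randn(t.size)
    est = kalman_ridge_track(noisy, dt=1e-3)
    print("RMS tracking error (Hz):", np.sqrt(np.mean((est - true_f) ** 2)))
```

In this simplified setting the polynomial prediction model appears as the constant-rate transition matrix F; extending the state to include higher-order polynomial terms, or stacking one such block per mode, follows the same pattern.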
