Abstract

Fiber optic seismometers are widely used to detect ground-motion signals in seismic monitoring applications. However, their accuracy can be significantly degraded by the carrier phase delay and the modulation depth offset introduced during carrier modulation. This article addresses this challenge with a multi-stage orthogonal signal fusion (MOSF) method, which simultaneously estimates the non-linear errors introduced by the carrier phase delay (θ) and the modulation depth (C). By compensating for these errors, MOSF achieves accurate demodulation. In the first stage, MOSF estimates the phase delay θ. In the second stage, it compensates the interference signal for this phase delay and calculates the modulation depth C from the compensated signal. To evaluate the performance of MOSF, we conduct simulations and experiments on demodulated signals with different phase delays (θ) and modulation depths (C). The results show that the MOSF method can accurately calculate and compensate for θ and C, with a low noise level of 2.98 μrad/√Hz and a maximum detection range of 115.46 dB re rad/√Hz at 800 Hz. The MOSF algorithm achieves a SINAD of 95 dB and a THD of −98 dB across different modulation depths and phase delays, with the phase delay of the demodulated signal stable between 0° and 1°. This demonstrates the robustness of the MOSF method to modulation depth and phase delay, as well as its real-time performance.

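The two-stage procedure described in the abstract can be illustrated with a small numerical sketch. The Python example below is not the authors' MOSF implementation: it assumes a standard phase-generated-carrier interference model, illustrative signal parameters (sampling rate `fs`, carrier frequency `f0`, amplitudes `A`, `B`, and the test phase signal are all hypothetical), and generic estimators for θ and C (a least-squares fit of the orthogonal fundamental pair for θ, and a Bessel-function amplitude ratio for C). It only shows how a phase-delay estimate, its compensation, and a modulation-depth estimate can be chained ahead of arctangent demodulation.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.special import jv
from scipy.optimize import brentq

# --- simulated interferometric signal (illustrative parameters, not from the paper) ---
fs = 200_000.0                                   # sampling rate [Hz] (assumed)
f0 = 12_500.0                                    # carrier (modulation) frequency [Hz] (assumed)
t = np.arange(0.0, 0.2, 1.0 / fs)
theta_true = np.deg2rad(25.0)                    # "unknown" carrier phase delay
C_true = 2.3                                     # "unknown" modulation depth
phi = 3.0 * np.cos(2 * np.pi * 200.0 * t)        # test phase signal (swing well past pi/2 rad)
A, B = 1.0, 0.8
I = A + B * np.cos(C_true * np.cos(2 * np.pi * f0 * t + theta_true) + phi)

def lowpass(x, cutoff=2_000.0):
    b, a = butter(4, cutoff / (fs / 2))
    return filtfilt(b, a, x)

# Quadrature mixing at the carrier fundamental and second harmonic.
# By the Jacobi-Anger expansion, these low-pass outputs are approximately:
c1 = lowpass(I * np.cos(2 * np.pi * f0 * t))     # -B*J1(C)*cos(theta)  * sin(phi)
s1 = lowpass(I * np.sin(2 * np.pi * f0 * t))     # +B*J1(C)*sin(theta)  * sin(phi)
c2 = lowpass(I * np.cos(4 * np.pi * f0 * t))     # -B*J2(C)*cos(2theta) * cos(phi)
s2 = lowpass(I * np.sin(4 * np.pi * f0 * t))     # +B*J2(C)*sin(2theta) * cos(phi)
sl = slice(2000, -2000)                          # discard filter edge transients

# Stage 1: estimate theta from the orthogonal fundamental pair (s1 ~= -tan(theta) * c1)
# via a least-squares slope fit; sin(phi) cancels, so the estimate is signal-independent.
theta_est = np.arctan2(-np.sum(s1[sl] * c1[sl]), np.sum(c1[sl] ** 2))

# Stage 2: compensate the phase delay, then estimate the modulation depth C.
Q1 = -c1 * np.cos(theta_est) + s1 * np.sin(theta_est)           # ~ B*J1(C)*sin(phi)
Q2 = -c2 * np.cos(2 * theta_est) + s2 * np.sin(2 * theta_est)   # ~ B*J2(C)*cos(phi)
ratio = np.max(np.abs(Q1[sl])) / np.max(np.abs(Q2[sl]))         # ~ J1(C)/J2(C) when phi swings past +/-pi/2
C_est = brentq(lambda c: jv(1, c) / jv(2, c) - ratio, 0.5, 3.8) # invert the monotone Bessel ratio

# Arctangent demodulation with the estimated corrections applied.
phi_est = np.unwrap(np.arctan2(Q1 / jv(1, C_est), Q2 / jv(2, C_est)))

print(f"theta: true {np.degrees(theta_true):5.1f} deg, estimated {np.degrees(theta_est):5.1f} deg")
print(f"C:     true {C_true:5.2f},      estimated {C_est:5.2f}")
```

Under these assumptions, running the sketch recovers θ and C close to their simulated values and returns the demodulated phase `phi_est`; the actual MOSF estimators, noise performance, and real-time behavior are detailed in the full paper.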