Abstract

Recently, a temporal adaptive matched filtering technique has been proposed to reduce the background noise in phase-sensitive optical time-domain reflectometry (ɸ-OTDR) systems deployed for fibre-optic distributed acoustic sensing (DAS) applications. This technique relies on inverting a noise covariance matrix estimated from training samples that are assumed to be signal-free. The degradation in detection performance caused by the presence of the desired signal in the training data, and the ill-conditioning of the matrix due to the limited amount of training data, are conventionally mitigated by adding a certain amount of white noise to the diagonal elements of the covariance matrix. In this paper, the optimum level of this diagonal loading (DL) is investigated for DAS applications and verified with real ɸ-OTDR data. The optimum DL level, which has been analytically derived for beamforming applications, is shown to also apply to temporal-domain noise filtering in DAS. The experimental results confirm that the optimum DL value is negative.
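The procedure the abstract describes can be illustrated with a minimal sketch: estimate a noise covariance matrix from training snapshots assumed to be signal-free, regularize it with diagonal loading, and form adaptive matched-filter weights from its inverse. This is not the paper's implementation; the dimensions, the signal template, and the (positive, illustrative) loading level are all assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: N-sample temporal window, K training snapshots.
N, K = 16, 64

# Training snapshots assumed to be signal-free (each column one noise sample).
X = rng.standard_normal((N, K))

# Sample covariance estimate of the background noise.
R = (X @ X.T) / K

# Diagonal loading: add delta * I before inversion to counter
# ill-conditioning from limited training data.  The paper argues the
# optimum delta is negative; a small positive value is used here purely
# so the sketch stays well-conditioned.
delta = 0.1 * np.trace(R) / N
R_dl = R + delta * np.eye(N)

# Known temporal signal template (an assumed sinusoidal example).
s = np.sin(2 * np.pi * 3 * np.arange(N) / N)

# Adaptive matched-filter weights: w is proportional to R_dl^{-1} s,
# normalized so the filter passes the template with unit gain (w @ s = 1).
w = np.linalg.solve(R_dl, s)
w /= w @ s

# Filter output for a test snapshot containing signal plus noise.
y = s + 0.5 * rng.standard_normal(N)
out = w @ y
```

The normalization `w /= w @ s` fixes the template gain, so the diagonal loading level only changes how aggressively the background noise is suppressed, which is the trade-off the paper optimizes.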
