Abstract

The phase measurement error in an interferometric wavelength-shift measurement scheme, such as that used with in-fiber Bragg grating sensors, has been investigated experimentally with appropriate underpinning analytical theory. It has been shown that when a lock-in amplifier is used to detect the phase shift generated by the Bragg wavelength shift, a pseudoperiodic measurement error can be introduced owing to the difference between the amplitude of the optical-path-difference ramp and the value of the Bragg wavelength. If the initial ramp deviation equals 20 nm, the measurement error may be as large as ±2.5% of the total measurement range. With a double-phase lock-in amplifier approach to measure the ac strain, the measurement error can be reduced to 0.4% of the total measurement range. When the period corresponding to the Bragg wavelength is measured in real time from the distorted carrier signal of the interferometer and used as the reference period of a digital lock-in amplifier, the effect of the initial ramp deviation can be largely avoided, and the measurement error can be kept to an acceptably low level, about 0.1% of the total measurement range.
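
To make the two mitigation strategies concrete, the following is a minimal numerical sketch, not the paper's implementation: the sample rate, carrier frequency, phase value, and clean-cosine signal model are all assumptions chosen for illustration. It shows a double-phase (I/Q) lock-in recovering the carrier phase, first with a fixed, slightly offset reference frequency (standing in for the initial ramp deviation, which biases the result) and then with a reference period estimated in real time from the carrier itself, which removes the bias.

```python
import numpy as np

# Illustrative sketch only: all values and the signal model below are
# assumptions, not the paper's actual setup or data.
fs = 100_000                      # sample rate, Hz (assumed)
f_true = 1_000.0                  # actual carrier frequency, Hz (assumed)
phi_true = 0.7                    # carrier phase encoding the Bragg shift, rad
t = np.arange(0, 0.05, 1.0 / fs)  # 50 full carrier periods
carrier = np.cos(2 * np.pi * f_true * t + phi_true)

def estimate_period(signal, fs):
    """Estimate the carrier period from rising zero crossings, standing in
    for the real-time period measurement on the distorted carrier."""
    idx = np.flatnonzero((signal[:-1] < 0) & (signal[1:] >= 0))
    return np.mean(np.diff(idx)) / fs

def double_phase_lock_in(signal, t, f_ref):
    """Double-phase (I/Q) lock-in: multiply by quadrature references and
    average, so the recovered phase does not depend on carrier amplitude."""
    i_comp = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # in-phase
    q_comp = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # quadrature
    return np.arctan2(-q_comp, i_comp)

# Fixed nominal reference, deliberately offset by 1% (analogous to the
# initial ramp deviation): the recovered phase is biased.
phi_fixed = double_phase_lock_in(carrier, t, 1_010.0)

# Reference period measured from the carrier itself: the bias disappears.
f_tracked = 1.0 / estimate_period(carrier, fs)
phi_tracked = double_phase_lock_in(carrier, t, f_tracked)

print(f"true phase:        {phi_true:.3f} rad")
print(f"fixed reference:   {phi_fixed:.3f} rad")   # biased estimate
print(f"tracked reference: {phi_tracked:.3f} rad") # ~= phi_true
```

Because the phase is taken from the ratio of the quadrature and in-phase outputs, the double-phase estimate is insensitive to carrier amplitude, and tracking the reference period on the measured carrier removes the residual bias that a fixed reference would inherit from the ramp deviation.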
