Abstract

In this paper we show that the phase readout in low-noise laser interferometers can significantly deviate from the underlying optical pathlength difference (OPD). The cross coupling of beam tilt into the interferometric phase readout is compared to the OPD. For such a system it is shown that the amount of tilt-to-phase-readout coupling depends strongly on the involved beams and their parameters, on the detector properties, and on the precise definition of the phase. The unique single-element photodiode phase is therefore compared to three common phase definitions for quadrant diodes. It is shown that no single phase definition globally shows the least cross coupling of angular jitter.

Highlights

  • A frequent use of laser interferometers is to sense distance variations

  • This significant deviation originates, first, from the phase and intensity profiles of laser beams, which cannot be adequately represented by plane waves or rays; second, from incomplete beam detection and beam clipping on finite photodiodes; and finally from the actual definition of the interferometric phase in the case of quadrant diodes

  • If either beam is clipped, for instance by the slits of a quadrant photodiode (QPD) or because the diode is insufficiently large (smaller than about 3 times the spot size on the detector), a significant coupling can occur (see Fig. 2(a))


Introduction

A frequent use of laser interferometers is to sense distance variations. Assume a very simple and perfect Mach-Zehnder interferometer in which a mirror in one beam path is rotated about the beam reflection point. The OPD can be computed geometrically: it is the difference between the distances zMB and zp that the measurement beam and the reference beam propagate between pivot and diode (see Fig. 1), and to leading order it scales quadratically with the tilt angle α.
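The geometric OPD described above can be sketched numerically. In the following minimal Python example, the distance zp from pivot to diode and the tilt angle α are assumed values chosen for illustration; the model assumes that a mirror tilt of α rotates the reflected beam by 2α, so the path to the detector plane lengthens from zp to zp / cos(2α). This is a hypothetical sketch of the geometry, not the paper's full simulation.

```python
import numpy as np

def geometric_opd(z_p, alpha):
    """OPD (in meters) for a mirror tilted by alpha (rad) about the beam
    reflection point, with pivot-to-diode distance z_p (m).

    Assumed geometry: the reflected beam rotates by 2*alpha, so its path
    to the detector plane lengthens to z_p / cos(2*alpha)."""
    return z_p * (1.0 / np.cos(2.0 * alpha) - 1.0)

z_p = 0.1          # 10 cm pivot-to-diode distance (assumed for illustration)
alpha = 100e-6     # 100 microrad tilt

opd = geometric_opd(z_p, alpha)
small_angle = 2.0 * z_p * alpha**2   # leading-order term, quadratic in alpha

print(f"OPD             = {opd:.3e} m")
print(f"2 * z_p * a**2  = {small_angle:.3e} m")
```

The agreement between the exact expression and the 2·zp·α² term illustrates the quadratic tilt-to-length scaling of the purely geometric OPD, against which the interferometric phase readout is compared in the paper.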
