Abstract

Gravitational-wave detectors such as Virgo, LIGO, and KAGRA are modified Michelson interferometers with a system of coupled Fabry–Pérot cavities that increases their sensitivity and bandwidth. In order to control the detector, several radio-frequency sidebands, not resonant in the kilometric arm cavities but resonant in the central cavities of the interferometer, are added to the carrier frequency to extract longitudinal and alignment error signals. Misalignment of the laser in the Fabry–Pérot cavities degrades the sensitivity through different mechanisms and results in a non-superposition of the carrier and sidebands. These relative misalignments between fields at different frequencies contain clues for optimally aligning the interferometer, but the question of the direction of the beam reflected by a Fabry–Pérot cavity, as a function of the resonance state of the incoming electromagnetic field, is neither straightforward nor intuitive. While the numerical optical simulations used in the gravitational-wave detector community can answer this question, they do not provide a qualitative, handy understanding of the observed phenomenon, which is useful for the commissioning and operation of the detectors. In this Letter, we present a model based on a first-order modal expansion of Gaussian beams to calculate analytically how a misalignment of the input beam of a Fabry–Pérot cavity translates into a misalignment of the reflected and circulating beams. We find a strong dependence not only on the beam resonance condition but also on the mirror geometry. Finally, we check the consistency of our model by comparing its predictions with existing numerical simulators.
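The idea behind the modal expansion can be illustrated with a minimal sketch. In a first-order expansion, a slightly misaligned input beam is written as the fundamental Hermite–Gauss mode plus a small complex admixture of the first-order mode, u00 + ε·u10, where ε encodes the lateral shift and tilt. The HG10 mode accumulates an extra round-trip Gouy phase in the cavity, so it sees a different reflection coefficient than the carrier HG00 mode; the reflected beam's misalignment is ε scaled by the ratio of the two coefficients. The code below is a simplified illustration of this principle, not the paper's actual model: the function names, the plane-wave two-mirror reflection formula, the sign conventions, and the way the Gouy phase enters are all assumptions made for this sketch.

```python
import numpy as np

def fp_reflection(phi, r1=0.98, r2=0.999):
    """Amplitude reflection coefficient of a lossless two-mirror
    Fabry-Perot cavity, as a function of the round-trip phase phi
    (standard plane-wave formula; sign convention is an assumption)."""
    return (-r1 + r2 * np.exp(1j * phi)) / (1.0 - r1 * r2 * np.exp(1j * phi))

def reflected_misalignment(eps_in, phi00, gouy_rt, r1=0.98, r2=0.999):
    """First-order modal model (illustrative): the input beam is
    u00 + eps_in*u10.  The HG10 mode picks up an extra round-trip
    Gouy phase gouy_rt, so it reflects with a different coefficient
    than HG00.  The reflected beam's relative misalignment is eps_in
    rescaled (in amplitude and phase) by the ratio r10/r00."""
    r00 = fp_reflection(phi00, r1, r2)            # carrier-mode reflection
    r10 = fp_reflection(phi00 + gouy_rt, r1, r2)  # first-order mode reflection
    return eps_in * r10 / r00

# Example: carrier on resonance (phi00 = 0), round-trip Gouy phase 0.5 rad.
# The HG10 mode is then off resonance, so the reflected misalignment is
# rotated and rescaled relative to the input one.
eps_out = reflected_misalignment(0.1 + 0.0j, phi00=0.0, gouy_rt=0.5)
print(abs(eps_out), np.angle(eps_out))
```

The ratio r10/r00 is where the dependence on the resonance condition and on the mirror geometry enters: the resonance condition sets phi00, while the mirror radii of curvature fix the round-trip Gouy phase gouy_rt and hence how far the first-order mode sits from resonance.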

