Abstract

The sensitivity of second-generation interferometric Gravitational-Wave detectors is limited in the low frequency region by Newtonian Noise from seismic fields. Fluctuations in the local gravitational field due to mass density variations result in Newtonian Noise in the interferometer data. To subtract the effects of this noise, an array of seismometers is placed around the mirrors of the interferometer to monitor the noise source. Optimal positioning of these seismometers yields maximal subtraction of Newtonian Noise. So far, the sensor positions have been optimized for a single seismic wave frequency, but in reality the Newtonian Noise at the detector site is substantial over the (8–20) Hz frequency band. In this paper, the sensor placement optimization problem is formulated as a multi-objective optimization problem, ensuring that the sensor positions are optimized over a broad range of frequencies. The results show a significant improvement over the single-objective optimization currently in use and are limited only by the seismometer self-noise.
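
To illustrate the idea described above, the following is a minimal sketch of optimizing seismometer positions over a frequency band rather than at a single frequency. It is not the paper's method: the `residual()` noise proxy, the number of sensors, the seismic wave speed, and the weighted-sum scalarization (a simple stand-in for a true multi-objective formulation) are all illustrative assumptions.

```python
# Hypothetical sketch: band-averaged optimization of seismometer positions.
# The residual() model is a toy proxy, NOT the paper's Newtonian Noise model.
import numpy as np
from scipy.optimize import differential_evolution

N_SENSORS = 6                       # assumed number of seismometers in the array
FREQS = np.linspace(8.0, 20.0, 7)   # frequency band of interest (Hz)
V_SEISMIC = 250.0                   # assumed seismic wave speed (m/s)

def residual(positions, freq):
    """Toy residual-noise proxy at one frequency: penalize sensors sitting near
    the nodes of a plane wave of wavelength v/f around a mirror at the origin."""
    xy = positions.reshape(N_SENSORS, 2)
    k = 2 * np.pi * freq / V_SEISMIC           # wavenumber
    r = np.linalg.norm(xy, axis=1)             # sensor distances from the mirror
    coverage = np.abs(np.cos(k * r)).mean()    # crude phase-coverage measure
    return 1.0 - coverage                      # smaller is better

def band_objective(positions):
    """Weighted-sum scalarization: average the per-frequency objectives so the
    layout cannot over-fit to a single frequency."""
    return np.mean([residual(positions, f) for f in FREQS])

bounds = [(-5.0, 5.0)] * (2 * N_SENSORS)       # x, y bounds (m) for each sensor
result = differential_evolution(band_objective, bounds, seed=0, maxiter=200)
print("optimized positions (m):\n", result.x.reshape(N_SENSORS, 2))
print("band-averaged objective:", result.fun)
```

Replacing `band_objective` with a vector of per-frequency objectives and a Pareto-based solver would give the multi-objective treatment the abstract refers to; the scalarized version is used here only to keep the sketch short and self-contained.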
