Intensity interferometry, based on the Hanbury Brown–Twiss effect, is a simple and inexpensive method for optical interferometry at microarcsecond angular resolutions; its use in astronomy was abandoned in the 1970s because of low sensitivity. Motivated by recent technical developments, we argue that the sensitivity of large modern intensity interferometers can be improved by factors of up to approximately 25,000, corresponding to 11 photometric magnitudes, compared to the pioneering Narrabri Stellar Interferometer. This is made possible by (i) using avalanche photodiodes (APDs) as light detectors, (ii) distributing the light received from the source over multiple independent spectral channels, and (iii) using arrays composed of multiple large light collectors. Our approach permits the construction of large optical interferometers, with baselines ranging from a few kilometers to intercontinental distances, at costs comparable to those of (very) long-baseline radio interferometers. Realistic intensity interferometer designs can achieve limiting R-band magnitudes as faint as ~14, sufficient for spatially resolved observations of main-sequence O-type stars in the Magellanic Clouds. Multi-channel intensity interferometers can address a wide variety of science cases: (i) linear radii, effective temperatures, and luminosities of stars; (ii) mass-radius relationships of compact stellar remnants; (iii) stellar rotation; (iv) stellar convection and the interaction between stellar photospheres and magnetic fields; (v) the structure and evolution of multiple stars; (vi) direct measurements of interstellar distances; (vii) the physics of gas accretion onto supermassive black holes; and (viii) calibration of amplitude interferometers by providing a sample of calibrator stars.
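The correspondence between the quoted sensitivity gain and the magnitude improvement follows from the standard astronomical magnitude relation; the line below is simply a sketch of that arithmetic, using the factor of ~25,000 stated above, and is not taken from the paper itself:

\Delta m = 2.5 \log_{10} F \approx 2.5 \log_{10}(2.5 \times 10^{4}) \approx 11\ \mathrm{mag}.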