Analysis is performed on ultra-high-resolution, large-scale cosmological radiation-hydrodynamic simulations to quantify, for the first time, the physical environment of long-duration gamma-ray bursts (GRBs) at the epoch of reionization. We find that, on parsec scales, 13% of GRBs remain in high-density ($\ge 10^4$ cm$^{-3}$), low-temperature star-forming regions, whereas 87% of GRBs occur in low-density ($\sim 10^{-2.5}$ cm$^{-3}$), high-temperature regions heated by supernovae. More importantly, the spectral properties of GRB afterglows, such as the neutral hydrogen column density, total hydrogen column density, dust column density, gas temperature, and metallicity of intervening absorbers, vary strongly from sightline to sightline. Although our model is consistent with the limited set of observationally inferred values of circumburst density, metallicity, column density, and dust properties, a substantially larger sample of high-z GRB afterglows would be required for a statistically robust test of the model. Our findings indicate that any attempt to infer the physical properties (such as metallicity) of the interstellar medium of the host galaxy from a very small number of sightlines (usually one) would be precarious. Any use of high-z GRBs to probe the interstellar and intergalactic media should therefore properly take into account the physical diversity of the interstellar medium.