We use simulated galaxy observations from the NIHAO-SKIRT-Catalog to test the accuracy of spectral energy distribution (SED) modeling techniques. SED modeling is an essential tool for inferring star formation histories from nearby galaxy observations but is fraught with difficulty due to our incomplete understanding of stellar populations, chemical enrichment processes, and the nonlinear, geometry-dependent effects of dust. The NIHAO-SKIRT-Catalog uses hydrodynamic simulations and radiative transfer to produce SEDs from the ultraviolet (UV) through the infrared (IR), accounting for dust. We use the widely used Prospector software to perform inference on these SEDs and compare the inferred stellar masses and star formation rates (SFRs) to the known values in the simulation. We match the stellar population models to isolate the effects of differences in the star formation history, the chemical evolution history, and the dust. For the high-mass NIHAO galaxies (>10^9.5 M⊙), we find that model mismatches lead to inferred SFRs that are on average underestimated by a factor of 2 when fit to UV through IR photometry, and a factor of 3 when fit to UV through optical photometry. These biases lead to significant inaccuracies in the resulting specific SFR–mass relations, with UV through optical fits showing particularly strong deviations from the true relation of the simulated galaxies. In the context of massive existing and upcoming photometric surveys, these results highlight that star formation history inference from photometry may remain imprecise and inaccurate, and that there is a pressing need for more realistic testing of existing techniques.
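As a minimal sketch of the comparison described above (not the paper's actual pipeline), the snippet below shows one way the median SFR bias factor for high-mass galaxies might be quantified: select galaxies above the 10^9.5 M⊙ cut and take the median ratio of true to inferred SFR. All array names and the toy data are assumptions for illustration only.

```python
import numpy as np

# Hypothetical illustration, not the paper's actual analysis code.
# Toy data standing in for simulation truths and Prospector-inferred values.
rng = np.random.default_rng(0)

n_gal = 500
log_mstar_true = rng.uniform(8.5, 11.0, n_gal)   # true stellar masses [log10 M_sun]
sfr_true = 10 ** rng.uniform(-1.0, 1.0, n_gal)   # true SFRs [M_sun / yr]

# Mimic an inference that underestimates SFR by ~2x with lognormal scatter,
# standing in for Prospector fits to UV-through-IR photometry.
sfr_inferred = (sfr_true / 2.0) * 10 ** rng.normal(0.0, 0.15, n_gal)

# Select high-mass galaxies (> 10^9.5 M_sun) and compute the median bias factor.
high_mass = log_mstar_true > 9.5
bias_factor = np.median(sfr_true[high_mass] / sfr_inferred[high_mass])
print(f"Median SFR underestimation factor: {bias_factor:.1f}")  # ~2 by construction
```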