Abstract

While the slope of the dust attenuation curve (δ) is found to correlate with effective dust attenuation (AV) as obtained through spectral energy distribution (SED) fitting, it remains unknown how the fitting degeneracies shape this relation. We examine the degeneracy effects by fitting SEDs of a sample of local star-forming galaxies (SFGs) selected from the Galaxy And Mass Assembly survey, in conjunction with mock galaxy SEDs of known attenuation parameters. A well-designed declining starburst star formation history is adopted to generate model SED templates with intrinsic UV slope (β0) spanning a reasonably wide range. The best-fitting β0 for our sample SFGs shows a wide coverage, dramatically differing from the limited range of β0 < −2.2 for a starburst of constant star formation. Our results show that strong degeneracies between β0, δ, and AV in the SED fitting induce systematic biases leading to a false AV–δ correlation. Our simulation tests reveal that this relationship can be well reproduced even when a flat AV–δ relation is taken to build the input model galaxy SEDs. The variations in best-fitting δ are dominated by the fitting errors. We show that assuming a starburst with constant star formation in SED fitting will result in a steeper attenuation curve, smaller degeneracy errors, and a stronger AV–δ relation. Our findings confirm that the AV–δ relation obtained through SED fitting is likely driven by the systematic biases induced by the fitting degeneracies between β0, δ, and AV.
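The attenuation-curve slope δ discussed above is conventionally defined as a power-law deviation from the Calzetti et al. (2000) starburst law, following the Noll et al. (2009) parameterization widely used in SED-fitting codes: A(λ) = (AV / 4.05) k(λ) (λ/λV)^δ, with λV = 0.55 μm, so that δ = 0 recovers the Calzetti curve and δ < 0 gives a steeper (greyer for δ > 0) curve. The sketch below illustrates this parameterization only; it is not necessarily the exact form adopted in the paper, and the sample wavelengths and AV values are arbitrary choices for illustration.

```python
import math

def k_calzetti(lam_um):
    """Calzetti et al. (2000) starburst attenuation curve k(lambda),
    lambda in microns, valid over ~0.12-2.2 um."""
    x = 1.0 / lam_um
    if 0.12 <= lam_um < 0.63:
        return 2.659 * (-2.156 + 1.509 * x - 0.198 * x**2 + 0.011 * x**3) + 4.05
    elif 0.63 <= lam_um <= 2.2:
        return 2.659 * (-1.857 + 1.040 * x) + 4.05
    raise ValueError("wavelength outside the Calzetti validity range")

def attenuation(lam_um, A_V, delta, lam_V=0.55):
    """Noll et al. (2009)-style attenuation: a power-law tilt of slope
    delta applied to the Calzetti curve, normalized so A(lam_V) ~ A_V."""
    return (A_V / 4.05) * k_calzetti(lam_um) * (lam_um / lam_V) ** delta

# delta = 0 recovers Calzetti; a negative delta steepens the curve,
# boosting UV attenuation at fixed A_V (illustrative values only).
A_uv_flat = attenuation(0.15, A_V=1.0, delta=0.0)
A_uv_steep = attenuation(0.15, A_V=1.0, delta=-0.3)
print(A_uv_steep > A_uv_flat)  # steeper curve attenuates the UV more
```

Because a steeper curve (δ < 0) reddens the UV slope β in much the same way as a redder intrinsic β0 or a larger AV, the three quantities are degenerate in broadband SED fitting, which is the degeneracy the abstract identifies as the driver of the apparent AV–δ correlation.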
