Abstract

We examine recent evidence from the luminosity-redshift relation of Type Ia Supernovae (SNe Ia) for the $\sim 3 \sigma$ detection of a ``Hubble bubble'' -- a departure of the local value of the Hubble constant from its globally averaged value \citep{Jha:07}. By comparing the MLCS2k2 fits used in that study to the results from other light-curve fitters applied to the same data, we demonstrate that the detection is tied to the interpretation of SN color excesses (after correction for a light-curve shape-color relation) and the presence of a color gradient across the local sample. If the slope of the linear relation ($\beta$) between SN color excess and luminosity is fit empirically, then the bubble disappears. If, on the other hand, the color excess arises purely from Milky Way-like dust, then the SN data clearly favor a Hubble bubble. We demonstrate that the SN data give $\beta \simeq 2$, instead of the $\beta \simeq 4$ one would expect from purely Milky Way-like dust. This suggests either that SN intrinsic colors are more complicated than can be described with a single light-curve shape parameter, or that dust around SNe is unusual. Disentangling these possibilities is both a challenge and an opportunity for large-survey SN Ia cosmology.
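For concreteness, the color term enters the standardized distance estimate through a linear relation of the schematic form below; the symbols ($m_B$, $s$, $c$) follow the convention of empirical fitters such as SALT and are illustrative here, not quoted from the paper:
$$
\mu = m_B - M + \alpha\, s - \beta\, c ,
$$
where $\mu$ is the distance modulus, $m_B$ the peak apparent $B$-band magnitude, $s$ a light-curve shape parameter, and $c$ the color excess. If $c$ were produced entirely by Milky Way-like dust with $R_V \simeq 3.1$, one would expect $\beta \simeq R_B = R_V + 1 \simeq 4.1$, close to the $\beta \simeq 4$ quoted above, rather than the $\beta \simeq 2$ the data prefer.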
