The meaningful comparison of models of galaxy evolution to observations is critically dependent on the accurate treatment of dust attenuation. To investigate dust absorption and emission in galaxies, we have assembled a sample of ~1000 galaxies with UV through IR photometry from GALEX, SDSS, and Spitzer, and optical spectroscopy from SDSS. The ratio of IR to UV emission (IRX) is used to constrain the dust attenuation in galaxies. We use the 4000 Å break as a robust and useful, although coarse, indicator of star formation history (SFH). We examine the relationship between IRX and the UV spectral slope (a common attenuation indicator at high redshift) and find little dependence of the scatter on D_n(4000). We construct average UV through far-IR spectral energy distributions (SEDs) for different ranges of IRX, D_n(4000), and stellar mass (M_*) to show the variation of the entire SED with these parameters. When binned simultaneously by IRX, D_n(4000), and M_*, these SEDs allow us to determine a low-resolution average attenuation curve for different ranges of M_*. The attenuation curves thus derived are consistent with a λ^(−0.7) attenuation law, and we find no significant variation with M_*. Finally, we show the relationship between IRX and the global stellar mass surface density and gas-phase metallicity. Among star-forming galaxies, we find a strong correlation between IRX and stellar mass surface density, even at constant metallicity, a result that is closely linked to the well-known correlation between IRX and star formation rate.
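
For reference, the two quantities at the center of the analysis can be written compactly. This is a minimal sketch, assuming IRX is the total-IR to UV luminosity ratio and normalizing the power-law attenuation curve at the V band (5500 Å); neither convention is specified in the abstract itself:

% Sketch of the assumed definitions (band choice and V-band normalization are assumptions):
\[
  \mathrm{IRX} \equiv \frac{L_{\mathrm{IR}}}{L_{\mathrm{UV}}},
  \qquad
  \frac{A(\lambda)}{A_V} = \left(\frac{\lambda}{5500~\mathrm{\AA}}\right)^{-0.7}.
\]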