Phase-contrast imaging methods exploit variations in an object's refractive index distribution to permit the visualization of subtle features that may have very similar optical absorption properties. Although phase-contrast is often viewed as desirable in many biomedical applications, its influence on signal detectability when both absorption- and phase-contrast are present remains largely unexplored. In this work, we investigate the ideal Bayesian observer signal-to-noise ratio in phase-contrast imaging for a signal-known-exactly/background-known-exactly detection task involving a weak signal. We demonstrate that this signal detectability measure can be decomposed into three contributions that have distinct interpretations associated with the imaging physics.
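
For context, the following is a minimal sketch of the ideal-observer detectability for a signal-known-exactly/background-known-exactly task, assuming a linear imaging operator $\mathcal{H}$, additive Gaussian noise with known covariance $\mathbf{K}$, and a weak signal $\Delta\mathbf{f}$; these symbols and assumptions are illustrative and do not reproduce the specific three-term decomposition derived in this work.

% Sketch under assumed Gaussian statistics and a linear data model
% g = H f + n, n ~ N(0, K); the prewhitening matched filter is then
% the ideal observer, and its detectability is
\[
  \mathrm{SNR}^{2}_{\mathrm{ideal}}
    = \big(\overline{\mathbf{g}}_{1}-\overline{\mathbf{g}}_{0}\big)^{T}
      \mathbf{K}^{-1}
      \big(\overline{\mathbf{g}}_{1}-\overline{\mathbf{g}}_{0}\big)
    \approx \big(\mathcal{H}\,\Delta\mathbf{f}\big)^{T}\mathbf{K}^{-1}
            \big(\mathcal{H}\,\Delta\mathbf{f}\big),
\]
where $\overline{\mathbf{g}}_{1}$ and $\overline{\mathbf{g}}_{0}$ denote the mean signal-present and signal-absent data, and the approximation holds in the weak-signal limit.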