The optimal procedure for detecting the presence of discrete-time signals in additive noise can be derived from the likelihood ratio test. When the noise has statistically independent, identically distributed components, the dependence of the detector's performance on signal characteristics can be related to the Kullback-Leibler (KL) distance between the distributions governing the two hypotheses. Performance predictions based on the central limit theorem are shown to be poor approximations to the true performance. The optimal detector's performance has long been known to improve exponentially with increasing KL distance: for a fixed false-alarm probability, the miss probability decays exponentially in the number of observations at a rate given by the KL distance (Stein's lemma). Symmetric noise amplitude distributions yield a symmetric dependence on the difference between the signals' amplitudes at each time index. Small-signal (locally optimal) detection performance is shown to depend on signal energy, whereas large-signal performance depends on the signal waveform. When a distance measure can be defined, performance depends on a measure different from the one on which the detector is based, with a single exception: the Gaussian case.
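The exponential dependence of performance on the KL distance can be illustrated with a minimal Monte Carlo sketch (not from the paper itself): detecting a constant signal in i.i.d. Gaussian noise with a likelihood-ratio (sample-mean) detector at a fixed false-alarm rate. The signal amplitude `s`, noise level `sigma`, false-alarm rate `alpha`, and sample sizes are illustrative assumptions; for this Gaussian pair the KL distance is s²/(2σ²), and the miss probability should shrink rapidly as the number of samples grows.

```python
import numpy as np

rng = np.random.default_rng(0)

s = 1.0       # illustrative signal amplitude under H1
sigma = 1.0   # illustrative noise standard deviation

# KL distance between N(s, sigma^2) and N(0, sigma^2)
kl = s**2 / (2 * sigma**2)

def miss_prob(n, alpha=0.1, trials=40000):
    """Estimate the miss probability of the likelihood ratio test on n
    i.i.d. samples, with the threshold set empirically so that the
    false-alarm probability is alpha (Neyman-Pearson setting).
    For Gaussian noise the LRT statistic reduces to the sample mean."""
    t0 = rng.normal(0.0, sigma, size=(trials, n)).mean(axis=1)  # under H0
    t1 = rng.normal(s, sigma, size=(trials, n)).mean(axis=1)    # under H1
    thresh = np.quantile(t0, 1.0 - alpha)  # fixes false-alarm rate ~ alpha
    return float(np.mean(t1 <= thresh))    # fraction of missed detections

# Miss probability for increasing numbers of observations
errors = [miss_prob(n) for n in (4, 9, 16)]
```

By Stein's lemma, -log(P_miss)/n approaches `kl` as n grows; for these small n the decay is already clearly visible even though the asymptotic rate is not yet reached.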