The inability to know the system noise properties with perfect precision, referred to as noise uncertainty, results in noise power calibration errors that have been proven to impose fundamental limitations on the detection performance of any spectrum sensing (signal detection) method in cognitive radio networks. In this work we argue that the inability of cognitive radio users to know beforehand which primary signals might be present in the sensed band, and with which properties, referred to here as signal uncertainty, causes an additional degradation of detection performance. The consequences of noise uncertainty have been widely studied, verified experimentally, and distilled into tractable mathematical models. By contrast, the impact of the particular primary signal properties on the resulting detection probability of generic spectrum sensing algorithms, such as energy detection, has not been accounted for in the analysis and performance evaluation of spectrum sensing in cognitive radio networks. In this context, this work develops a mathematical model for signal uncertainty and, based on this model, analyzes the impact of signal uncertainty on the detection performance of spectrum sensing, with and without noise uncertainty, and compares the practical consequences of both degrading effects.
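The degradation that noise uncertainty causes for the energy detector mentioned above can be illustrated with a small Monte Carlo sketch. This is not the paper's model: the uniform noise-power miscalibration in [1/ρ, ρ], the empirical worst-case threshold, and all parameter values are illustrative assumptions, chosen only to show that below a certain SNR the detection probability stops improving once the noise power is uncertain.

```python
import numpy as np

rng = np.random.default_rng(0)

def detection_prob(snr_db, n, rho_db=0.0, pfa=0.1, trials=3000):
    """Monte Carlo estimate of an energy detector's detection probability.

    Illustrative model (an assumption, not the paper's): the detector is
    calibrated for unit noise power, but the true noise power may lie
    anywhere in [1/rho, rho], with rho_db = 10*log10(rho).  The threshold
    is set empirically for the worst case (noise power = rho) so that the
    target false-alarm probability pfa holds even under miscalibration.
    """
    rho = 10 ** (rho_db / 10)
    snr = 10 ** (snr_db / 10)
    # Worst-case noise-only statistics -> empirical detection threshold.
    noise_stats = np.mean(rho * rng.standard_normal((trials, n)) ** 2, axis=1)
    thr = np.quantile(noise_stats, 1 - pfa)
    # Signal-present trials, each with a randomly miscalibrated noise power.
    true_pw = rng.uniform(1 / rho, rho, size=trials)
    x = (np.sqrt(snr) * rng.standard_normal((trials, n))
         + np.sqrt(true_pw)[:, None] * rng.standard_normal((trials, n)))
    return float(np.mean(np.mean(x ** 2, axis=1) > thr))
```

With perfect calibration (`rho_db=0.0`) the detection probability at 0 dB SNR and a few hundred samples is close to one, while at -15 dB SNR a noise uncertainty of just 1 dB (`rho_db=1.0`) keeps it low regardless of averaging, consistent with the fundamental limitation the abstract refers to.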