ABSTRACT An experiment was conducted to test the common assumption that the amplification produced by the outer and middle ear for air-conducted (AC) signals renders the perceptual contributions of bone conduction (BC) through the skull negligible under typical, free-field listening conditions. To evaluate this possibility, simulated BC signals reflecting varying levels of attenuation were mixed with the original version of the signal (which stood in for the AC signal). To simulate the BC signals, a sawtooth wave (F0 = 43.06 Hz; maximum frequency = 5,000 Hz) was filtered through a bank of third-octave bands whose attenuation values were taken from existing measurements of individual skulls. The resulting signals were mixed with the original sawtooth such that the peak amplitude of the BC signals was −8, −20, −32, or −44 dB relative to the original signal. Listeners completed a 4IAX discrimination task to determine whether they could detect the presence of a simulated BC signal (X) relative to the original sawtooth in isolation (A). Discrimination sensitivity (d′) tended to remain high in the −8 dB and −20 dB conditions and, as hypothesized, decreased with greater attenuation. However, d′ remained reliable (> 1) for more than a quarter of the skulls, even in the −44 dB attenuation condition. This was despite the fact that third-octave bands below 3,150 Hz exceeded the combination of the expected gain observed from AC and the measured attenuation from BC. Furthermore, the remaining bands at or above 3,150 Hz reflected similar intensities for BC signals regardless of whether they were discriminated in the −44 dB condition. Taken together, these findings raise the possibility that at least a subset of human skulls might make audible contributions under typical (free-field) listening conditions, and certainly when the AC signal is reduced (such as when the ears are covered).
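The stimulus construction described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it builds the band-limited sawtooth (F0 = 43.06 Hz, harmonics up to 5,000 Hz), shapes a BC copy by applying a per-band attenuation directly to each harmonic (standing in for third-octave filtering), and mixes the result at a fixed peak level relative to the original. The flat −10 dB attenuation vector is a placeholder; real values would come from the per-skull measurements, one per third-octave band. The base-2 band centers and nearest-center band assignment are also simplifying assumptions.

```python
import numpy as np

F0, FMAX, FS, DUR = 43.06, 5000.0, 22050, 0.5

# Third-octave band centers spanning the sawtooth's bandwidth
# (base-2 spacing; nominal IEC centers differ slightly).
CENTERS = 1000.0 * 2.0 ** (np.arange(-15, 8) / 3.0)  # ~31 Hz to ~5 kHz

def band_index(f):
    """Index of the third-octave band whose center is nearest to f (log scale)."""
    return int(np.argmin(np.abs(np.log2(CENTERS / f))))

def make_signals(band_atten_db, rel_peak_db):
    """Return (original, mixed), where the simulated BC copy is shaped by
    per-band attenuation and scaled so its peak sits rel_peak_db below the
    original sawtooth's peak before the two are summed."""
    t = np.arange(int(FS * DUR)) / FS
    orig = np.zeros_like(t)
    bc = np.zeros_like(t)
    k = 1
    while k * F0 <= FMAX:
        h = np.sin(2 * np.pi * k * F0 * t) / k  # 1/k harmonic series -> sawtooth
        orig += h
        # Attenuate each harmonic by its band's (hypothetical) BC attenuation.
        bc += h * 10.0 ** (band_atten_db[band_index(k * F0)] / 20.0)
        k += 1
    # Normalize the BC copy to the requested relative peak amplitude.
    bc *= 10.0 ** (rel_peak_db / 20.0) * np.max(np.abs(orig)) / np.max(np.abs(bc))
    return orig, orig + bc

# Placeholder: flat -10 dB attenuation across all bands (not a measured skull).
atten = np.full(CENTERS.size, -10.0)
orig, mixed = make_signals(atten, rel_peak_db=-44.0)
```

In a 4IAX trial, `orig` would serve as the standard (A) and `mixed` as the comparison (X), with the listener judging which interval pair contained the change.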