The extraordinary sensitivity of the mammalian inner ear has captivated scientists for decades; it stems largely from the crucial role played by the outer hair cells (OHCs) and their unique electromotile properties. Typically arranged in three rows along the sensory epithelium, the OHCs work in concert via mechanisms collectively referred to as the "cochlear amplifier" to boost the cochlear response to faint sounds. While simplistic views attribute this enhancement solely to the OHC-based increase in cochlear gain, the inevitable presence of internal noise demands a more rigorous analysis. Achieving a genuine boost in sensitivity through amplification requires that signals be amplified more than internal noise, and this requirement presents the cochlea with an intriguing challenge. Here we analyze the effects of spatially distributed, cochlear-like amplification on both signals and internal noise. By combining a straightforward mathematical analysis with a simplified model of cochlear mechanics designed to capture the essential physics, we generalize previous results on the impact of spatially coherent amplification on signal degradation in active gain media. We identify and describe the strategy employed by the cochlea to amplify signals more than internal noise and thereby enhance the sensitivity of hearing. For narrow-band signals, this effective, wave-based strategy consists of spatially amplifying the signal within a localized cochlear region, followed by rapid attenuation. Location-dependent wave amplification and attenuation satisfy the conditions necessary to amplify near-characteristic-frequency (CF) signals more than internal noise components of the same frequency. Our analysis reveals that the sharp wave cutoff past the CF location greatly reduces noise contamination. The distinctive asymmetric shape of the "cochlear filters" thus underlies a crucial but previously unrecognized mechanism of cochlear noise reduction.
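
As a rough illustration of the reasoning summarized above, consider a toy one-dimensional gain medium (a minimal sketch of our own, not the cochlear-mechanics model analyzed here; the grid, the CF place x_cf, and the gain and cutoff rates are hypothetical, chosen only to make the effect visible). A signal injected at the base accumulates gain over the entire amplifying region, whereas each internal noise source accumulates gain only from its point of origin onward, and the sharp cutoff past the CF place strips the amplified noise before it reaches more apical locations.

import numpy as np

# All parameters below are hypothetical, chosen only to make the effect visible.
L = 1.0                     # normalized cochlear length
n = 2000                    # number of distributed internal-noise sources
x = np.linspace(0.0, L, n)  # positions along the cochlea (base at 0)
dx = x[1] - x[0]
x_cf = 0.7                  # CF place of the probe frequency
gain_rate = 8.0             # exponential wave gain basal to the CF place
cutoff_rate = 60.0          # steep attenuation apical to the CF place (the "cutoff")

def log_gain(x_from, x_to, cutoff=True):
    """Log-amplitude change of a forward-travelling wave going from x_from to x_to."""
    amp = gain_rate * (min(x_to, x_cf) - min(x_from, x_cf))   # gain accrued basal to the CF place
    att = -cutoff_rate * max(0.0, x_to - max(x_from, x_cf)) if cutoff else 0.0
    return amp + att

def noise_power(x_read, cutoff=True):
    """Incoherent sum, at x_read, of equal-strength noise sources injected basal to it."""
    return sum(np.exp(2.0 * log_gain(xi, x_read, cutoff)) * dx
               for xi in x if xi <= x_read)

# (i) At the CF place the signal, injected at the base, traverses the whole
#     amplifying region, whereas each noise source traverses only part of it.
sig_power = np.exp(2.0 * log_gain(0.0, x_cf))
passive_snr = 1.0 / x_cf    # no gain: signal power 1, accumulated noise power x_cf
active_snr = sig_power / noise_power(x_cf)
print("SNR improvement at the CF place:", active_snr / passive_snr)

# (ii) Just apical to the CF place, the sharp cutoff strips away the amplified
#      noise that would otherwise contaminate that region.
x_apical = 0.8
print("noise power with sharp cutoff :", noise_power(x_apical, cutoff=True))
print("noise power without cutoff    :", noise_power(x_apical, cutoff=False))

With these illustrative numbers, the signal-to-noise ratio at the CF place improves by roughly an order of magnitude relative to the passive case, while the noise power reaching the region just apical to the CF place falls by several orders of magnitude once the sharp cutoff is included.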