Translating sensory information into perceptual decisions is a core challenge faced by the brain. This ability is thought to rely on weighting sensory evidence to form mental templates of the critical differences between objects. Learning has been shown to optimize these templates for efficient task performance, but the neural mechanisms underlying this improvement remain unknown. Here, we identify the mechanisms by which the brain implements templates for perceptual decisions through experience. We trained observers to discriminate visual forms that were randomly perturbed by noise. To characterize the internal stimulus template that observers learn when performing this task, we adopted a classification image approach (e.g., [5-7]) for the analysis of both behavioral and fMRI data. By reverse correlating behavioral and multivoxel pattern responses with noisy stimulus trials, we identified the critical image parts that determine observers' choices. Observers learned to integrate information across locations and to weight the discriminative image parts more strongly. Training enhanced shape processing in the lateral occipital area, which was found to reflect size-invariant representations of informative image parts. Our findings demonstrate that learning optimizes mental templates for perceptual decisions by tuning the representation of informative image parts in higher ventral cortex.
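The classification image logic described above can be illustrated with a minimal sketch. This is not the authors' analysis pipeline; it is a toy simulation (with hypothetical trial counts, image size, and a simulated observer) showing the core reverse-correlation step: averaging the noise fields separately for each behavioral choice and taking their difference to recover the pixels that drive the decision.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: each trial presents a noisy image, and the
# simulated observer's binary choice depends on how well the noise
# matches an internal template (here, assumed known for the simulation).
n_trials, h, w = 5000, 16, 16
true_template = np.zeros((h, w))
true_template[:, : w // 2] = 1.0      # informative parts on the left half
true_template -= true_template.mean() # zero-mean template

noise = rng.normal(size=(n_trials, h, w))
evidence = (noise * true_template).sum(axis=(1, 2))
choices = (evidence > 0).astype(int)  # simulated observer responses

# Reverse correlation / classification image: mean noise field on
# "choice A" trials minus mean noise field on "choice B" trials.
# Pixels that systematically bias the choice emerge from this difference.
ci = noise[choices == 1].mean(axis=0) - noise[choices == 0].mean(axis=0)

# The recovered classification image should resemble the template
# the observer actually used.
r = np.corrcoef(ci.ravel(), true_template.ravel())[0, 1]
print(f"correlation with true template: {r:.2f}")
```

The same difference-of-means logic applies whether the response being correlated with the noise is a behavioral choice or, as in the fMRI analysis described above, a multivoxel pattern classification of each trial.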