Previous research has suggested that race-specific features are automatically processed during face perception, with out-group faces often treated categorically. Functional imaging has illuminated the hemodynamic correlates of this process, but fewer studies have examined single-neuron responses. In the present experiment, epilepsy patients undergoing microwire recordings in preparation for surgical treatment were shown realistic computer-generated human faces, which they classified according to the emotional expression displayed. The racial categories of the stimulus faces varied independently of the emotion and were irrelevant to the patients’ primary task. Nevertheless, we observed race-driven changes in neural firing rates in the amygdala, anterior cingulate cortex, and hippocampus. These responses were broadly distributed: the firing rates of 28% of recorded neurons in the amygdala and 45% in the anterior cingulate cortex predicted one or more racial categories. Nearly equal proportions of neurons responded to White and Black faces (24% vs. 22% in the amygdala and 26% vs. 28% in the anterior cingulate cortex). In the hippocampus, a smaller fraction of race-responsive neurons (12%) predicted only White faces. Our results imply a distributed representation of race in brain areas involved in affective judgments, decision making, and memory. They also support the hypothesis that race-specific cues are perceptually coded even when those cues are task-irrelevant.
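
The abstract does not specify the statistical criterion by which a neuron's firing rate was judged to "predict" a racial category. Purely as a hypothetical illustration, and not the authors' actual analysis, the sketch below shows one common approach to classifying a neuron as race-responsive: a per-neuron Mann-Whitney U test comparing trial-by-trial spike counts between the two face categories. The function name, the significance threshold, and the simulated spike counts are all assumptions introduced here for demonstration.

```python
# Hypothetical sketch only: the study's actual statistical method is not
# described in the abstract. This illustrates one conventional criterion
# for calling a single neuron "race-responsive" -- a Mann-Whitney U test
# on per-trial spike counts, thresholded at alpha = 0.05.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

def is_race_responsive(spikes_a, spikes_b, alpha=0.05):
    """Return True if trial spike counts differ between face categories.

    spikes_a, spikes_b: 1-D arrays of per-trial spike counts recorded
    while faces of one or the other racial category were on screen.
    """
    _, p = mannwhitneyu(spikes_a, spikes_b, alternative="two-sided")
    return p < alpha

# Simulated data for illustration only (not data from the study):
# 100 neurons, 40 trials per category, a subset given a rate offset.
n_neurons, n_trials = 100, 40
responsive = []
for i in range(n_neurons):
    counts_a = rng.poisson(5.0, n_trials)           # spike counts, category A
    shift = 2.0 if i < 25 else 0.0                  # built-in effect for some cells
    counts_b = rng.poisson(5.0 + shift, n_trials)   # spike counts, category B
    responsive.append(is_race_responsive(counts_a, counts_b))

print(f"{np.mean(responsive):.0%} of simulated neurons classed as race-responsive")
```

Under this assumed criterion, the reported percentages (e.g., 28% in the amygdala) would correspond to the fraction of recorded neurons passing such a per-neuron test; other choices, such as a regression or decoding analysis, would be equally plausible readings of "predicting" a category.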