Previous research has found that the categorization of emotional facial expressions is influenced by a variety of factors, such as processing time, facial mimicry, emotion labels, and perceptual cues. However, past research has frequently confounded these factors, making it impossible to ascertain how adults use this varied information to categorize emotions. The current study is the first to examine the relative impact of each of these factors on emotion categorization within a single paradigm. Participants (N = 102) categorized anger and disgust facial expressions in a novel computerized task modeled on similar tasks used in the developmental literature with preverbal infants. Experimental conditions manipulated (a) whether the task was time-restricted and (b) whether the labels "anger" and "disgust" were used in the instructions. Participants were significantly more accurate when given unlimited response time and emotion labels. Participants given restricted sorting time (2 s) and no emotion labels tended to focus on perceptual features of the faces when categorizing the emotions, which led to low sorting accuracy. In addition, facial mimicry was related to greater sorting accuracy. These results suggest that when high-level (labeling) categorization strategies are unavailable, adults rely on low-level (perceptual) strategies to categorize facial expressions. Methodological implications for the study of emotion are discussed.