The mean emotion of multiple facial expressions can be extracted rapidly and precisely. However, it remains debated whether mean emotion processing is automatic, that is, whether it can occur in the absence of attention. To address this question, we used a passive oddball paradigm and recorded event-related brain potentials while participants discriminated changes at central fixation as a set of four faces was presented in the periphery. Each face set consisted of one happy and three angry expressions (mean negative) or one angry and three happy expressions (mean positive); within a sequence, the mean negative and mean positive face sets appeared with probabilities of 20% (deviant) and 80% (standard), respectively, or vice versa. Cluster-based permutation analyses showed that when the mean emotion was negative, the visual mismatch negativity emerged early, at around 92 ms, and was also observed in later time windows, whereas a mismatch positivity was observed at around 168–266 ms when the mean emotion was positive. These results suggest that different mechanisms may underlie the processing of mean negative and mean positive emotions. More importantly, the brain can detect changes in mean emotion automatically, indicating that ensemble coding of multiple facial expressions can occur without attention.