This paper proposes a method for interpreting the emotions detected in facial expressions in the context of the events that cause them. The method was developed to analyze video recordings of facial expressions captured during a collaborative game played as part of the Mars-500 experiment, in which six crew members were isolated for 520 days in a ground-based spacecraft mock-up to simulate a flight to Mars. Seven time-dependent components of facial expressions were extracted from the video recordings of the experiment. To interpret these dynamic components, we proposed a mathematical model of emotional events. Genetic programming was used to find the locations, types, and intensities of the emotional events, as well as the way the recorded facial expressions reflect reactions to them. By classifying various statistical properties of the data, we found significant relations between the facial expressions of different crew members and a memory effect in their collective emotional states. The model of emotional events was validated on previously unseen video recordings of the crew members. We demonstrated that both the genetic search and the optimization of the model parameters improve the accuracy of the proposed model. This method is a step toward automating the analysis of affective expressions in terms of the cognitive appraisal theory of emotion, which rests on the dependence of an expressed emotion on its eliciting event.
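The abstract does not specify the functional form of the emotional-event model or the genetic-programming representation. The sketch below is a minimal illustration only, using a simple evolutionary search (a genetic-algorithm-style simplification, not the paper's genetic programming) to place hypothetical events, each described by a time, a type, and an intensity, so that an assumed exponentially decaying response best fits a single observed expression component. All names and parameters here (respond, fitness, DECAY, N_EVENT_TYPES) are illustrative assumptions, not elements of the paper's method.

```python
# Illustrative sketch: evolutionary search for emotional events.
# Assumptions (not from the paper): an event is (time, type, intensity),
# and each event adds an exponentially decaying impulse to one
# facial-expression component.
import random
import numpy as np

RNG = random.Random(0)
T = 200               # time steps in the recording (assumed)
DECAY = 0.9           # assumed decay rate of an emotional response
N_EVENT_TYPES = 3     # hypothetical number of event types

def respond(events, t_len=T):
    """Predicted expression signal: each event adds a decaying impulse."""
    signal = np.zeros(t_len)
    for t0, etype, intensity in events:
        steps = np.arange(t_len - t0)
        # The sign of the response depends on the (hypothetical) event type.
        sign = 1.0 if etype % 2 == 0 else -1.0
        signal[t0:] += sign * intensity * DECAY ** steps
    return signal

def fitness(events, observed):
    """Negative squared error between the model response and the data."""
    return -float(np.sum((respond(events) - observed) ** 2))

def random_event():
    return (RNG.randrange(T), RNG.randrange(N_EVENT_TYPES), RNG.uniform(0.1, 1.0))

def mutate(events):
    """Randomly add, drop, or perturb one event."""
    events = list(events)
    op = RNG.random()
    if op < 0.3 or not events:
        events.append(random_event())
    elif op < 0.6:
        events.pop(RNG.randrange(len(events)))
    else:
        i = RNG.randrange(len(events))
        t0, etype, a = events[i]
        events[i] = (min(T - 1, max(0, t0 + RNG.randint(-5, 5))), etype,
                     min(1.0, max(0.0, a + RNG.uniform(-0.1, 0.1))))
    return events

def evolve(observed, pop_size=30, generations=200):
    """(mu + lambda)-style search: keep the fitter half, mutate it."""
    pop = [[random_event()] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ev: fitness(ev, observed), reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(RNG.choice(survivors)) for _ in survivors]
    return max(pop, key=lambda ev: fitness(ev, observed))

# Toy usage: recover two planted events from a noisy synthetic signal.
true_events = [(40, 0, 0.8), (120, 1, 0.6)]
observed = respond(true_events) + np.random.default_rng(0).normal(0, 0.02, T)
print(evolve(observed))
```

The mutation operator mirrors the three degrees of freedom the abstract names (location, type, and intensity of an event); the paper's actual approach additionally evolves how the expressions represent reactions to the events, which this toy fixes as a single decaying impulse.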