Training simulators are an important but under-utilized resource for understanding human behavior in man-machine systems. They provide a realistic model of the work environment and trained subjects of varying experience levels, and they permit data collection on abnormal situations that arise infrequently in the real world. However, the methods available to tap this resource are limited by requirements imposed by training program goals. The result is a test situation that more closely approximates naturalistic observation than controlled experimental design. One type of solution has been to develop automated data collection systems. While these systems record the details of plant behavior and specific operator actions (what happened), they fail to capture the context behind these results (how or why it happened). For example, to make interface design improvements it is not enough to note operator errors; the mechanisms that produced the errors must also be understood. An alternative approach is the decision analysis framework for the observation and analysis of human performance in man-machine systems. The decision analysis methodology, as applied to training simulator studies, has been developed and refined through several recent investigations of operator performance in nuclear power plant control rooms. These studies include an analysis of operator decision making during multiple failures, analyses of the sources of operator malfunctions, and an evaluation of a new concept in operator aids. The decision analysis process consists of two major steps. First, a description of actual performance (i.e., a timeline or protocol) is produced by using the knowledge of subject matter experts to create situation-specific flowsheets.
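A minimal sketch of how such a flowsheet-driven protocol might be represented in software; all names, steps, and records here are hypothetical illustrations, not drawn from the studies described:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: each flowsheet node is an expected operator action
# for a specific scenario; the training instructor logs matches and
# deviations as the session unfolds, yielding a timestamped protocol.

@dataclass
class FlowsheetNode:
    step_id: str
    expected_action: str

@dataclass
class ProtocolEntry:
    time_s: float           # elapsed session time in seconds
    step_id: str            # matching flowsheet step, or "DEVIATION"
    observed_action: str

@dataclass
class Protocol:
    entries: list = field(default_factory=list)

    def log(self, time_s: float, step_id: str, observed_action: str) -> None:
        self.entries.append(ProtocolEntry(time_s, step_id, observed_action))

# Usage: the same structure records both expected and unexpected behavior,
# which is what makes on-line tracking of novel evolutions possible.
flowsheet = [
    FlowsheetNode("S1", "verify reactor trip"),
    FlowsheetNode("S2", "check auxiliary feedwater flow"),
]
protocol = Protocol()
protocol.log(12.0, "S1", "verified reactor trip")
protocol.log(45.0, "DEVIATION", "isolated wrong feedwater train")
```

Because deviations are first-class entries rather than gaps in a checklist, the resulting timeline can follow operator behavior that departs from the anticipated scenario.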
These flowsheets must be detailed enough that the protocol can be produced during training sessions without extra or specialized observers, yet flexible enough to track unexpected evolutions that result from novel or incorrect operator behavior. The flowsheet technique thus provides for on-line protocol generation. The second step in the decision analysis framework is to use the knowledge of human behavior (operator models, diagnostic strategies, human error taxonomies) provided by human factors specialists to extract a description of prototypical performance. At this level, the analysis is no longer tied to a specific situation and specific operators but emphasizes what is characteristic of a number of related performances. For example, where the actual performance description notes independent errors by different operators in different events, the prototypical performance description may identify a group of examples of a single human error category (e.g., failure to obtain feedback on goal achievement following an action). Efficient production of prototypical performance descriptions depends on translating the conceptual framework provided by the human factors specialist into a form usable by training instructors. One benefit of the decision analysis method is that data are collected not only on operator actions but also on their context. This provides design-basis data by identifying where and why operator failures occur, along with a mechanism for evaluating modifications to the operational system (e.g., new equipment or procedures). A second benefit is that this type of data can be collected on student performance as a standard part of simulator exercises; the technique can thus help reduce the high resource overhead of human performance studies. In addition, the conceptual framework employed in decision analysis permits results to be generalized across a variety of context-specific factors.
Finally, the quality of training may be improved through better feedback on student performance.
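As an illustration of the prototypical-performance step described above, the following sketch groups independent, situation-specific error observations under a shared error category; the category labels and records are invented for illustration and do not come from the studies cited:

```python
from collections import defaultdict

# Hypothetical error observations from different operators in different
# simulated events, each already coded with an error-taxonomy category.
observations = [
    {"operator": "A", "event": "loss of feedwater",
     "category": "no feedback check after action"},
    {"operator": "B", "event": "steam generator tube rupture",
     "category": "no feedback check after action"},
    {"operator": "C", "event": "loss of feedwater",
     "category": "premature diagnosis"},
]

# Aggregate by category: independent errors by different operators in
# different events collapse into one prototypical error category with
# multiple supporting examples.
by_category = defaultdict(list)
for obs in observations:
    by_category[obs["category"]].append(obs)

for category, examples in by_category.items():
    print(f"{category}: {len(examples)} example(s)")
```

The grouping step is where the analysis moves from a specific situation and specific operators to what is characteristic across related performances, which is also what allows results to generalize across context-specific factors.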