Process resilience represents a core competence for organizations in light of an increasing number of process disruptions, such as sudden increases in case arrivals or absences in the workforce. It reflects an organization’s ability to restore a process to its acceptable performance level after a disruption. In this regard, the first key step for organizations towards achieving resilience is to understand how resilient their processes actually are. Although recognized as important, few works focus on such resilience assessment in a data-driven manner, thus preventing organizations from gaining the necessary insights into how much their processes are affected by disruptions and how long it takes them to recover. To address this problem, we propose an approach for automated resilience assessment based on recorded event data. Our approach interprets relevant process characteristics, such as the average lead time or arrival rate, as time series, which capture the development of the process execution over time. Based on these time series, it uses statistical modeling, specifically a vector autoregressive model, to determine the inter-relations between those characteristics and assess how the process performance responds to a disruption, i.e., a significant and temporary change in one of the process characteristics. We validate our approach by comparing its accuracy with a what-if analysis using a simulation model and demonstrate its effectiveness by assessing the resilience of the same process to diverse disruptions across different organizations.
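To make the core idea concrete, the following is a minimal, self-contained sketch of the technique the abstract names: treating two process characteristics (arrival rate and average lead time) as time series, fitting a vector autoregressive model of order one by least squares, and tracing how a shock to one characteristic propagates to the other. All data here is synthetic and the variable names are illustrative assumptions; this is not the authors' implementation, which would operate on characteristics extracted from an event log.

```python
import numpy as np

# Synthetic example: two process characteristics observed per period,
# y[t] = (arrival rate, average lead time). The true dynamics couple
# them: the lead time reacts to the past arrival rate.
rng = np.random.default_rng(0)
T = 300
A_true = np.array([[0.7, 0.0],   # arrival rate depends on its own past
                   [0.4, 0.5]])  # lead time reacts to past arrival rate
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Fit a VAR(1) by ordinary least squares: y_t ≈ A_hat @ y_{t-1}.
# lstsq solves Y ≈ X @ B with B = A_hat^T, hence the transpose below.
X, Y = y[:-1], y[1:]
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat = B.T

# Impulse response: effect of a unit shock to the arrival rate
# (index 0) on the lead time (index 1) over the next 10 periods.
# For a VAR(1), the response at horizon h is A_hat^h applied to the shock.
shock = np.array([1.0, 0.0])
lead_time_response = [
    (np.linalg.matrix_power(A_hat, h) @ shock)[1] for h in range(11)
]
```

The estimated coefficients recover the coupling between the characteristics, and the impulse-response curve shows the lead time rising after an arrival-rate shock and then decaying back, which is exactly the "how much is the process affected, and how long does recovery take" question the assessment targets.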