Abstract

Objective: What metaphors, models and theories were developed in the safety science domain, and which research was based upon ‘big data’?

Method: The study was confined to original articles and documents, written in English or Dutch, from the period under consideration.

Results and conclusions: From the start of the 20th century, human error was a dominant explanation for the causes of occupational accidents. Although external factors were seen as the main contributors, it was not until after World War II that scenario analysis was conducted in detail. The main drivers were the upscaling of the process industry in this period, as well as the introduction of highly hazardous industries, such as the aerospace and nuclear sectors, and consequently the disasters occurring in these sectors. From the very beginning, big data research was no exception in the safety science domain. ‘Big’ in this context is defined by numbers.

Highlights

  • Big data is a fashionable term among scientists, marketers, forecasters and safety experts

  • What metaphors, models and theories were developed in the safety science domain, and which research was based upon ‘big data’?

  • The study was confined to original articles and documents, written in English or Dutch, from the period under consideration

  • From the very beginning, big data research was no exception in the safety science domain


Summary

Introduction

Big data is a fashionable term among scientists, marketers, forecasters and safety experts. In the safety science domain, however, large data sets have a long history. The huge amount of accident data available to Winsemius allowed him to develop a theory postulating that human behaviour and unsafe acts are response reactions on the part of workers during process disturbances; such behaviour and acts were a consequence of context and not a cause of accidents. He was the father of ‘task dynamics theory’ (Swuste et al., 2014; Winsemius, 1951). The Reactor Safety Study WASH-1400 and, later, Kaplan and Garrick developed a method to estimate risks based on failure data gathered on a huge scale in some industries. This information could be used in the new risk formula, the risk triplet, which combined major accident scenarios with the deterministic and the probabilistic approach (Rasmussen, 1975; Kaplan & Garrick, 1981). Risk management should be focussed on understanding the dynamics of process safety and on the need for stakeholders to determine the boundaries and, through feedback control, gain insight into when a state of ‘drift to danger’ occurs (Svedung & Rasmussen, 2002).
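As a point of reference, the risk triplet of Kaplan and Garrick (1981) is conventionally written as a set of scenario–likelihood–consequence triplets; the notation below is the standard textbook form and is assumed here rather than quoted from the article:

R = \{ \langle s_i, \ell_i, x_i \rangle \}, \qquad i = 1, \ldots, N

where s_i is the i-th accident scenario (what can go wrong?), \ell_i the likelihood of that scenario (how likely is it?), and x_i its consequence (what is the damage?).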

