Abstract

As today's engineering systems have become increasingly sophisticated, assessing the efficacy of their safety-critical systems has become much more challenging. Classical "failure" analysis methods that decompose a system into components related by logic trees, such as fault and event trees, root cause analysis, and failure mode and effects analysis, lead to models that do not necessarily behave like the real systems they are meant to represent. To be faithful, these models would need to display the same emergent and unpredictable behaviors as the sociotechnical systems in the real world. The question then arises whether a return to a simpler whole-system model is necessary to better understand the behavior of real systems and to build confidence in the results. This question is all the more pertinent when one considers that the causal chain in many serious accidents is not as deep-rooted as is sometimes claimed. If these more obvious causes are not removed, why would the more intricate scenarios that emerge from more sophisticated models be acted upon? In this paper we explore this question and highlight the advantages of modeling and analyzing these "normal" deviations from ideality, so-called weak signals, rather than only system failures, near misses, and catastrophes.