Abstract

During the years that I have led process hazard analyses (PHAs), I have repeatedly encountered team members who adamantly proclaimed that a scenario I proposed was invalid or too far-fetched. Their reasoning was based on the idea that "It can't happen here" or "We have operated for twenty years and that's never happened." This paper focuses on three examples of such reasoning: the first and second examine the "it can't happen here" attitude, and the third explores the "it hasn't happened, so it won't happen" mindset. Through these scenarios, the paper delves into the development of the human biases that lead to these detrimental attitudes. Each example, drawn from an actual PHA team, illustrates the prevalence of these types of human bias, and recommendations are provided for addressing them in a process hazard analysis. Failure is an inherent product of these biases, and this paper seeks to explain why people fail to learn from prior incidents, both at their own company and across industry. Using examples from real-life PHAs to illustrate how human bias can affect a PHA team's ability to adequately assess risk provides insight into how this occurs and how it can be addressed. The overall objective of this paper is to develop a better understanding of how human bias can lead to a failure to learn from prior incidents and how it reflects an organization's process safety culture. Overcoming and addressing human bias supports learning from incidents and improves the assessment of risk.
