Accepted Manuscript

Cognitive biases in process hazard analysis

Paul Baybutt

PII: S0950-4230(16)30167-X
DOI: 10.1016/j.jlp.2016.06.014
Reference: JLPP 3248
To appear in: Journal of Loss Prevention in the Process Industries
Received Date: 4 August 2015
Revised Date: 17 May 2016
Accepted Date: 22 June 2016
Please cite this article as: Baybutt, P., Cognitive biases in process hazard analysis, Journal of Loss Prevention in the Process Industries (2016), doi: 10.1016/j.jlp.2016.06.014.
COGNITIVE BIASES IN PROCESS HAZARD ANALYSIS

Paul Baybutt
Primatech Inc., Columbus, Ohio, USA
[email protected]

Abstract

Many decisions are made by process hazard analysis (PHA) teams in identifying hazard scenarios and determining if there is a need to reduce the risk of catastrophic accidents. Observations of PHA teams conducting studies indicated that such decisions are not always made rationally.
Cognitive psychologists have studied how people make decisions and the conditions under which those decisions may be unreliable. It has been shown that various cognitive biases influence decisions by people and can hinder rationality. This body of knowledge was researched and correlated with observations of decision making by PHA teams during the performance of studies to explain and understand why PHA teams may make erroneous decisions. The application of cognitive psychology to decision making in PHA has not been addressed previously.

PHA facilitators must understand the impact of cognitive biases on PHA studies because they can seriously impact the quality of study results. Hazard scenarios may be missed, risks estimated incorrectly, and important recommendations for risk reduction omitted. This paper discusses cognitive biases that have been observed during the performance of PHA studies, gives examples of their effects, and provides guidelines to minimize their adverse impacts.
Key words: Process hazard analysis, process safety, heuristic, cognitive bias.

1.0 Introduction
Process hazard analysis (PHA) studies are performed by teams of people to address failures in processes that can result in hazard scenarios with adverse impacts on receptors such as people, property, and the environment [CCPS, 2008; Baybutt, 2013]. During the performance of studies, PHA teams must make many decisions. Observations of PHA teams indicated that decisions are not always made logically; it was concluded that psychological factors were at play, influencing decisions and the results of studies.
D
Several theories of decision making have been described in the literature [Plous,
TE
1993; Baron, 2008, Saaty and Peniwati, 2008; Hardman, 2009; Hastie and Dawes, 2010] and the dynamics of group decision making has been addressed [Levi, 2014].
EP
However, their application to PHA has received little attention. One study looked at some of the psychological processes involved in hazard and operability (HAZOP)
AC C
studies, specifically, interactions between team members and how team members perceive, remember, judge and reason [Leathey and Nicholls, 1998]. More recently, human factors that influence the performance of PHA studies have been addressed [Baybutt, 2013]. However, no analysis has been performed of how psychological factors may influence the decisions made during PHA studies. Research into cognitive biases
that have been identified by psychologists allowed explanations to be posited for the observations made of PHA team decision making.

Decisions are made by people before, during, and after the performance of a PHA study. This paper focuses on decisions that are made by the PHA team during study sessions and examines the role of heuristics and other cognitive biases. Possible cognitive biases that can affect PHA studies are identified and described. Their impact on PHA is discussed and recommendations are provided on how to minimize their adverse impacts.
M AN U
Section 2 identifies key decisions that are made in PHA that may be adversely impacted by cognitive biases. Section 3 contains a discussion of decision making and the role played by heuristics and cognitive biases. Section 4 describes the extent to which cognitive biases in decision making and PHA can be mitigated. Section 5
D
discusses how heuristics and cognitive biases that have been recognized by
TE
psychologists correlate with observations of decision making in PHA studies, provides examples of their impacts, and proposes ways to minimize their adverse impacts.
EP
Overall recommendations to address cognitive biases in PHA are presented in Section
2.0
AC C
6 and conclusions are drawn in Section 7.
Key PHA Decisions
In order to examine the effect of psychological factors on decisions made during PHA studies, the types of decisions must be identified. PHA studies identify hazard scenarios for processes and may develop recommendations for risk reduction measures. In doing so, key decisions are made by answering these questions:
• Which aspects of design intent should be studied?
• What design representations should be consulted?
• Which deviations from design intent should be addressed?
• Which initiating events are credible as causes of hazard scenarios?
• What multiple failures are credible?
• What consequences occur for hazard scenarios?
• What safeguards are present, which of them can be credited for each hazard scenario, and how much credit should be taken?
• What enablers apply to hazard scenarios?
• What is the severity and likelihood of hazard scenario consequences?
• What human factors issues affect hazard scenario risks?
• What siting issues affect hazard scenario risks?
• What recommendations are possible to reduce risk?
Decisions made in answering these questions influence the completeness of scenario identification and the risk that is tolerated for a process. All these decisions can be affected by psychological factors.
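As an illustration of how answers to these questions might be recorded, a hypothetical worksheet entry can be sketched in Python. The field names, the 4-point severity and likelihood scales, and the severity-times-likelihood ranking are illustrative assumptions only, not part of any PHA standard or of this paper's method:

```python
# Hypothetical sketch of a PHA worksheet entry capturing the key decisions
# listed above. Scales and field names are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class HazardScenario:
    deviation: str                 # deviation from design intent
    initiating_event: str          # credible cause of the scenario
    consequence: str               # outcome if unmitigated
    safeguards: list = field(default_factory=list)
    severity: int = 1              # assumed scale: 1 (minor) .. 4 (catastrophic)
    likelihood: int = 1            # assumed scale: 1 (remote) .. 4 (frequent)
    recommendations: list = field(default_factory=list)

    def risk_rank(self) -> int:
        """Illustrative risk ranking: severity multiplied by likelihood."""
        return self.severity * self.likelihood

s = HazardScenario(
    deviation="More Pressure",
    initiating_event="Control valve fails closed",
    consequence="Vessel overpressure and loss of containment",
    safeguards=["High-pressure alarm", "Relief valve"],
    severity=4,
    likelihood=2,
)
print(s.risk_rank())  # 8
```

Each decision listed above maps to a field or value in such a record, which is where the cognitive biases discussed later can enter.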
3.0 Decision Making, Heuristics, and Cognitive Biases
People may be expected to make important decisions rationally. However, cognitive processes can operate to impede rationality. Psychologists have established that people tend to make decisions based on cognitive factors rather than factual evidence, and cognitive biases occur [Pohl, 2004; Kahneman, 2011]. Cognitive biases are unconscious, automatic influences on human judgment and decision making that can cause reasoning errors; distort perceptions, interpretations, and judgments; and produce irrational decisions [Haselton, Nettle and Andrews, 2005]. They arise from various mental processes that can be difficult to distinguish, including information-processing shortcuts and motivational and social factors. Many cognitive biases have been documented in the literature and their nature and causes described. Cognitive biases occur commonly.
Psychologists have debated the meaning of human rationality, and it has been theorized that cognitive biases actually have evolved to produce optimum decisions within specific decision domains [Haselton, Nettle and Andrews, 2005]. From this perspective, cognitive biases are not the result of cognitive constraints or irrationalities and, therefore, they can be viewed not as design flaws of the brain but rather as design features that have evolved through adaptation. Regardless of the theoretical perspective, cognitive biases can produce sub-optimal results for some types of decisions. Their adverse impact on decisions made during the performance of PHA studies is the focus of this paper.
Both the emotional and rational parts of the brain are involved in decision making. The emotional brain acts instinctually, effortlessly, and quickly while the rational brain acts consciously, deliberately, and slowly. People make tradeoffs, usually subconsciously, between the effort involved and the quality needed for a decision in using the emotional and rational parts of the brain. Thus, the human mind has evolved mental shortcuts to deliver fast, reasonable decisions to real-world problems, often based on limited information, in what is known as bounded rationality [Simon, 1957]. The shortcuts are called heuristics [Kahneman, Slovic and Tversky, 1982; Gigerenzer and Todd, 1999; Kahneman and Tversky, 2000; Gilovich, Griffin and Kahneman, 2002; Gigerenzer, 2007; Lehrer, 2009]. The term derives from a Greek word meaning "find" or "discover". Heuristics are simple rules governing judgment or decision making and provide experience-based ways of solving problems. They are learned or encoded in people through evolutionary processes. Examples include using a rule of thumb, an educated guess, an intuitive judgment, stereotyping, profiling, and common sense. Often, they are used by people when facing complex problems or incomplete information. They speed decision making, simplify the process, reduce the cognitive effort involved, and avoid the need for more comprehensive thinking. Invariably, people are not aware they are using them.
Reasonable quality often can be achieved with heuristics and they are used frequently. Typically, they focus on one aspect of a problem and ignore others. While they can work well, they also produce cognitive biases that lead to systematic errors and erroneous decisions.

4.0 Mitigation of Cognitive Bias

Unfortunately, there is no comprehensive theory or practice of cognitive bias mitigation. Cognitive biases are difficult to detect and override because they are used unconsciously and automatically. Even those aware of their existence are unable to detect bias in their decisions when it occurs. Consequently, mitigation of cognitive biases poses challenges. However, it is important that people understand the role of cognitive biases in decision making so they are aware of the potential for poor decisions.

Some overall recommendations are proposed for mitigating cognitive bias in PHA studies. In a team environment, although team members may not be able to control their own use of cognitive biases, awareness may result in recognition of their influence on other team members, thus providing an opportunity to address them. Certainly, PHA facilitators should be aware of their importance and know how to minimize their impact on studies conducted by teams.
Some people may be more pre-disposed than others to display certain cognitive biases. Therefore, selection of team members on this basis may be possible. However, screening participants may be difficult. Furthermore, a limited choice of team members also may be a factor. Focused training within a particular domain may go some way to alleviating the impact of cognitive biases because specific knowledge may overcome cognitive bias. PHA facilitators should encourage teams to look not just for evidence to confirm expressed views but also for evidence to the contrary. The impact of cognitive biases on PHA studies can be minimized by using as much tangible data and information as possible to avoid the need to rely on opinions, which may be tainted by cognitive biases.
The use of a devil's advocate as a PHA team member can address many cognitive biases. A devil's advocate challenges and debates views offered by others in order to help determine their validity. Devil's advocates actually may agree with the views offered but their role is to challenge them, possibly even by taking an opposing position. Devil's advocates can become domesticated such that their objections become routine and token in nature. Also, devil's advocates can become unpopular. Both issues can be addressed by switching the role periodically among team members. Of course, individuals selected must be willing and able to play the role effectively. They must be perceived by the team as credible in the role, be comfortable with controversy, avoid being perceived as argumentative, and be able to recognize when further debate would not be worthwhile. Also, they should not allow the role to interfere unduly with their other contributions to a study. PHA facilitators may need to act as a devil's advocate for suggestions made by team members. They should ensure that all team members understand the role of the devil's advocate in order to decrease the likelihood that their interventions will be perceived negatively.
5.0 Discussion of Heuristics and Cognitive Biases and Their Impacts on Process Hazard Analysis

This section discusses how heuristics and cognitive biases that have been identified by psychologists can explain observations of decision making by PHA teams. Observations are correlated with recognized cognitive biases and used to understand why PHA teams may make erroneous decisions. The insights obtained have been used to devise ways to try to avoid erroneous decisions that result from heuristics and cognitive biases.

The analysis is organized by cognitive bias. Each bias is described, its possible effects on PHA are discussed, including examples of situations that have been observed during PHA sessions, and proposals are made for minimizing its impacts.
5.1 Anchoring Heuristic

Anchoring is the human tendency to rely too heavily on the first piece of information offered (the "anchor") when making decisions. Individuals use the anchor to make subsequent judgments by adjusting away from the anchor but there is a bias towards the anchor. Anchoring is difficult to avoid, even if people are aware of the phenomenon. Even experts are susceptible.
Anchoring can affect a PHA study if the facilitator makes suggestions to the team, for example, relating to deviations to consider, the credibility of scenario causes, or scenario risk estimates. The suggestions may serve as anchor points for the team and bias their views. Clearly, such views should not be expressed by the PHA facilitator until the team has expressed their own views. Of course, team members may also offer anchors. PHA facilitators should be alert to this possibility and, whenever possible, use objective data and information to validate suggestions made by the team.

5.2 Availability Heuristic
Availability refers to the ease with which a particular idea can be brought to mind. People may make a judgment based on how easily they can think of something similar.

This heuristic can impact PHA in several significant ways. Arguably, the most important decision that must be made by PHA teams is which deviations from design intent will be addressed in a study. For example, in the HAZOP study method, this decision is made when teams choose aspects of design intent, typically called parameters, which are combined with guide words to generate deviations [CCPS, 2008; Baybutt, 2013]. If the team does not address all important aspects of design intent, hazard scenarios will be missed [Baybutt, 2016]. The decision is complex as the
team effectively must look into a crystal ball and try to predict which deviations will result in significant scenarios without actually performing the analysis. Novice team members may have no examples to call upon and likely will rely on the opinions of more experienced team members. However, experienced team members will be limited by their experience as most team members only occasionally participate in PHA studies and only for the processes with which they work. Thus, teams frequently focus on process parameters with which they are familiar, such as flow, temperature, and pressure, but do not explore other important aspects of design intent that can help identify, for example, important human failures by operators and mechanics, unless prompted to do so. The problem is compounded by the need to address the issue for every node. The team easily can fall into the trap of using a small set of similar parameters for each node to avoid the need for intellectual exertion. Thus, the availability heuristic likely will play a significant role in decisions on parameters to address.
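The parameter-and-guide-word combination described above can be sketched in a few lines of Python. The guide word list follows common HAZOP practice, while the parameter list is an illustrative assumption; deliberately including parameters beyond the familiar flow/temperature/pressure set, such as operator action, is one way to counter the availability heuristic's pull toward the familiar:

```python
# Sketch of generating candidate HAZOP deviations as parameter x guide word
# combinations. Lists are illustrative; real studies tailor them to each
# node's design intent.

GUIDE_WORDS = ["No", "More", "Less", "Reverse", "Part of", "As well as", "Other than"]

# Including human-action parameters helps counter the availability
# heuristic's bias toward only familiar process parameters.
PARAMETERS = ["Flow", "Temperature", "Pressure", "Level", "Composition", "Operator action"]

def deviations(parameters, guide_words):
    """Return the full checklist of candidate deviations for a node."""
    return [f"{gw} {p}" for p in parameters for gw in guide_words]

checklist = deviations(PARAMETERS, GUIDE_WORDS)
print(len(checklist))   # 42 candidate deviations (6 x 7)
print(checklist[0])     # "No Flow"
```

Not every combination is physically meaningful (e.g. "Reverse Temperature"); the point of enumerating the full cross product is that the team screens each candidate deliberately rather than relying on whichever deviations come to mind most easily.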
One way of addressing the issue is to brief team members on the full spectrum of deviations that have proven important in previous studies for all processes operated by the company before embarking on a study. The briefing should include a description of incidents that have occurred in company processes but cast in terms of PHA scenarios rather than root cause analysis, including the identification of the deviations from design intent that caused them. Such a briefing will help to provide relevant examples for teams to reference in their brainstorming of parameters during the study.

Teams also must identify initiating events for scenarios. They may be single or multiple failures. The identification of initiating events depends largely on the knowledge
and experience of team members and people's memories to be able to recall information. However, often team members are not as knowledgeable of or experienced with multiple failure events because usually they occur less frequently than single failure events. Consequently, the availability heuristic can operate to cause team members to dismiss the possibility of multiple failure events altogether because they cannot think of examples. The issue can be addressed in a similar way to that for parameters by briefing the team on actual multiple failure events that have occurred.

The availability heuristic plays a particularly important role in the estimation of the likelihoods of scenarios. When an infrequent event can be brought easily and vividly
to mind, this heuristic leads to overestimates of its likelihood. For example, people overestimate their likelihood of dying in a dramatic event such as an earthquake because such events usually are highly publicized and therefore have a higher availability. In contrast, for more routine events, such as fatalities from automobile accidents, it is harder to bring specific cases to mind, so their likelihoods tend to be underestimated. Personal experience with events that cause fatalities also markedly increases availability and causes overestimates of the likelihood of similar future events. Similarly, lack of personal experience with events causes underestimates of their likelihood. This heuristic can be addressed by referring to failure data on the events that contribute to hazard scenarios. In particular, the use of applicable failure data with layers of protection analysis can improve the objectivity of scenario likelihood estimates.

Lack of awareness by team members of human and siting factors can lead to their neglect in studies. If team members do not have instances or examples to recall
where they were important, the availability heuristic will cause them to believe there are none. Training for PHA team members in human factors and siting issues can help to address this issue. Also, teams should be briefed on the contributions of such factors to actual incidents that have occurred.
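The layers of protection analysis (LOPA) calculation mentioned above, which replaces availability-driven likelihood judgments with recorded failure data, can be sketched as follows. All numeric values here are illustrative assumptions, not data from this paper:

```python
# Minimal LOPA-style estimate of a mitigated scenario frequency:
#   frequency = initiating event frequency (per year)
#             x enabler probability
#             x product of safeguard PFDs (probability of failure on demand
#               of each independent protection layer).
# All numbers below are illustrative assumptions.

def mitigated_frequency(initiating_freq, enabler_prob, pfds):
    """Return the mitigated scenario frequency per year."""
    freq = initiating_freq * enabler_prob
    for pfd in pfds:
        freq *= pfd
    return freq

# Hypothetical scenario: pump seal failure leading to a flammable release.
f = mitigated_frequency(
    initiating_freq=0.1,   # assumed: one failure per 10 years
    enabler_prob=0.5,      # assumed: process in the hazardous state half the time
    pfds=[0.1, 0.01],      # assumed: alarm + operator response (0.1), relief valve (0.01)
)
print(f"Mitigated frequency: {f:.1e} per year")  # 5.0e-05 per year
```

Because each factor is taken from generic or plant-specific failure data rather than from whichever incidents team members happen to recall, the resulting estimate is less exposed to the availability heuristic.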
The availability heuristic also can influence recommendations that are made for risk reduction. Most PHA team members are very familiar with risk reduction measures involving the addition of engineered safeguards but are less familiar with other approaches, such as inherently safer designs. Consequently, recommendations from PHA studies often do not include the use of inherently safer technologies. Thus, training and education of team members in all means of risk reduction is important.

5.3 Confirmation and Other Forms of Bias
Bias is holding a particular view of a matter at the expense of other, possibly equally valid or better, alternatives. Biased individuals have an inclination or prejudice for or against a particular view. They lack a neutral viewpoint and do not have an open mind. Bias may be accompanied by a disinclination to consider the possible merits of alternative points of view and comes in many forms.

Confirmation bias is the tendency for people to make judgments that confirm their preconceptions. Information may be remembered selectively or interpreted in a biased way. For example, some PHA team members, such as the process designer or process engineer, may be predisposed to the view that the process is adequately safe. This view may cause them to forget safety issues that have occurred or interpret them prejudicially. Similarly, team members may dismiss the validity of a recommendation for
risk reduction based on prior but prejudiced views of its efficacy.
People may be biased towards optimism or pessimism, causing them to
underestimate or overestimate, respectively, the likelihood of a negative event, and to
do the opposite for positive events. These forms of bias can cause PHA team members to underestimate or overestimate failure data, such as initiating event frequencies, the
amount of credit that should be taken for scenario safeguards, and the severity and likelihood of hazard scenarios.
People may interpret and judge situations according to their own culture. This form of bias can create problems when a PHA study is performed by people from one
culture for a process that will be operated by people from a different culture. For
example, a PHA team may believe that written operating procedures always will be followed by operators because that is the expectation of the team’s culture. However,
the people who will operate the process may not subscribe to this standard of practice and, indeed, they may routinely ignore written procedures in favor of a preferred form of
operation. In such a case, the PHA study team may perform a good quality PHA but on a process that does not actually exist. Important hazard scenarios for the actual process operation will be missed.

The use of a devil's advocate can help address bias. Bias can also be addressed
by avoiding dependence on opinions of team members whenever possible by taking time to check available data and information as “sanity” checks on the opinions.
5.4 Conformity and Peer Pressure

People can feel inclined to conform to the view of a group to which they belong, even if it is wrong. They choose not to speak up with a dissenting view. There are similarities with peer pressure, in which the views of a group influence an individual member of the group to change their opinions. The collective voice of a group can mask and oppress the view of an individual. Pressure to adopt the group view can be overpowering. Some individuals may relish going against the group view but such individuals are not common.
Conformity and peer pressure can affect any decision during a PHA study. It is valuable if PHA facilitators are able to recognize when the views of a team member are being suppressed so they can intervene and encourage the team member to share their views. Furthermore, team members likely will be more willing to share their dissenting views if they believe they will be supported in doing so. Such support can be provided by the PHA facilitator or by a devil's advocate.

5.5 Framing Effect
Framing effects occur when equivalent descriptions of a situation lead to systematically different decisions. Several types have been identified, including attribute framing, risky choice framing, and goal framing.

In attribute framing, a single attribute of a situation is described in terms of either intrinsic attractiveness of proportion or an equivalent aversiveness of proportion. Situations described in terms of an intrinsic attractiveness of proportion are generally evaluated more favorably than those described in terms of the corresponding aversiveness of proportion. For example, decisions framed in terms of successes rather than failures are more likely to be accepted. Thus, a proposed risk reduction measure suggested by a PHA team member that is said to be 99% reliable rather than 1% unreliable is more likely to be accepted by the team.
In risky choice framing, people base decisions on whether choices are presented as a loss or a gain. People tend to avoid risk when a choice is framed negatively as a loss but accept risk when it is framed positively as a gain. For example, a person may choose no surgery if told it has a 10% failure rate, but opt for surgery if told it has a 90% success rate. Of course, the risk is the same in both cases but it is framed differently. Thus, posing a question to a PHA team such as, “Will 90% of the relief valves operate successfully during the process lifetime?”, may elicit a different answer than, “Will 10% of the relief valves fail during the process lifetime?”.

In goal framing, people are encouraged to adopt a particular viewpoint by
describing either the advantages of adopting the viewpoint or the disadvantages of not adopting the viewpoint. The viewpoint will most likely be accepted when the disadvantages of not adopting the viewpoint rather than the advantages of adopting the viewpoint are described. Many people prefer to emphasize advantages rather than disadvantages so this frame may not be used frequently by PHA team members. However, it may be used by a PHA facilitator in managing the team. For example, the facilitator may believe that a particular hazard scenario merits recording in the PHA worksheet but team members are not convinced of its significance. The facilitator could emphasize the advantage that the PHA study will be more complete if it is included or the disadvantage that lives may be lost if it is not included. The latter argument is more likely to prevail.
PHA facilitators should be careful to avoid inappropriate use of frames, for example, when posing questions to team members and summarizing discussions. PHA facilitators also should be alert to other team members framing issues and intervene to re-phrase the issue in neutral terms.

5.6 Group Polarization
Group polarization refers to the tendency for a group to make decisions that are more extreme than the initial inclinations of its members. Decisions can be more risky or more cautious. The phenomenon is of most concern when a group tends to be more accepting of risk than an individual and in such cases it is called a risky shift. It occurs because individuals in a group enjoy a measure of anonymity which can relieve them of feelings of blame or responsibility for a risky decision.

This phenomenon can affect key decisions in a PHA, such as the credibility of initiating events, how much credit to take for safeguards, and what scenario severities and likelihoods should be assigned. PHA facilitators should be alert to group polarization so they can query the team's view and help the team develop a more objective perspective. PHA facilitators must stay aloof from the brainstorming that occurs during a PHA study and not become part of the team; otherwise their ability to recognize the occurrence of the phenomenon may be impaired. Also, a devil's advocate can help address group polarization.
5.7 Groupthink

Groupthink is a phenomenon in which a group of people share common but possibly false beliefs and think and make decisions in the same way, thus discouraging creativity. It can be thought of as a collective mindset. Usually, it occurs when people in the group have worked together for a period of time and evolved consensus views based on shared experiences, which become unrecognized assumptions. Groupthink has also been described as a phenomenon in which group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative viewpoints, by actively suppressing dissenting viewpoints and by isolating themselves from outside influences.
Groupthink can affect many decisions within a PHA study. For example, preventive maintenance of equipment is often credited as a safeguard in PHA studies. Such credit should not be taken without input from personnel in the maintenance department, who might disabuse team members of their assumption of a perfectly functioning PM program.

Groupthink within a PHA team can be addressed by the participation of an independent senior engineer as a PHA team member who does not have any prior experience with the particular process being studied. Such a person can challenge assumptions made by other team members and contribute knowledge that may not be possessed by the team. This role can be played by an independent experienced PHA facilitator if care is exercised to avoid direct confrontation with team members. The role has similarities to a devil's advocate.
5.8 Mindsets

Mindsets are assumptions held by an individual which are so established that the individual does not recognize they exist and continues to accept prior choices as valid. Usually, they arise after an individual has worked with a process for some time. Current methods of working and existing levels of safety become viewed as acceptable and are no longer questioned. Therefore, it can be difficult to see the process in a fresh light.

Mindsets can affect all decisions made in PHA studies. For example, a process designer participating in a PHA study may believe that there is no need for isolation valves in the process based on previous practices. However, when the safety engineer describes the consequences of the operators being unable to isolate lines and vessels in the process, the design engineer may experience a revelation.
PHA team members can help each other overcome mindsets. The study facilitator should be on the lookout for them. Also, a devil's advocate can be employed to address mindsets. The key is to be able to provide an epiphany for the individual subject to a mindset so they can see the issue in a new light.

5.9 Representative Heuristic
People may make a judgment based on how much a new situation resembles a situation with which they are familiar. The representative heuristic may lead people astray when the situations are not the same but the difference is not recognized.

PHA teams may assume aspects of a study for a process are the same as for a previously-studied process when important differences are not recognized or understood. For example, a PHA study team may be performing a PHA on a process that is a more automated version of a process for which a PHA has already been performed. It would be incorrect to assume the same design representations should be used for both studies or should receive the same attention. The operating procedures likely would have received particular attention in the first study where the process has higher human involvement, while cause and effect diagrams would be particularly important for the more automated process.
In another example, the likelihood of some initiating events for scenarios can vary substantially from one process to another, for example, the likelihood of external events at different facilities. The representative heuristic may cause this difference not to be recognized. Similarly, a hazard scenario that was judged to be of low likelihood for one process may be of high likelihood for another process due to, for example, plant aging factors. A further example is provided by a PHA team performing a study on a process at a facility with high operator involvement when they have performed a study on a similar process previously for a different facility. The representative heuristic can cause the team to assume that the human factors issues identified in the previous PHA also apply to the current PHA. However, differences in procedures, training, and plant or team culture can cause marked differences.

Within a PHA study, it is common to copy entries from one node to another or
from one scenario to another, when they are deemed to be similar, with the intention of identifying any differences and editing the entries accordingly. For example, safeguards may be copied from one scenario to another scenario judged to be similar. However, some of the safeguards may not apply owing to scenario differences that are not fully appreciated. If credit is taken for them, scenario risk will be underestimated. Similarly, the amount of credit that should be taken for copied safeguards may differ and, if this is not recognized, scenario risk will be estimated incorrectly.
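The effect of copying safeguards without rechecking their applicability can be illustrated with a simplified, LOPA-style frequency calculation. The following sketch is purely illustrative and not from the paper; the initiating event frequency and probability-of-failure-on-demand (PFD) values are hypothetical.

```python
# Illustrative sketch (not from the paper): a simplified LOPA-style
# calculation showing how copying safeguards between scenarios without
# rechecking their applicability understates risk. All frequencies and
# PFD values are hypothetical.

def scenario_frequency(initiating_event_freq, safeguard_pfds):
    """Mitigated frequency = initiating event frequency x product of PFDs."""
    freq = initiating_event_freq
    for pfd in safeguard_pfds:
        freq *= pfd
    return freq

# Scenario A: a relief valve (PFD 0.01) and an alarm with operator
# response (PFD 0.1) both genuinely apply.
freq_a = scenario_frequency(1e-1, [0.01, 0.1])

# Scenario B: the safeguards are copied from Scenario A, but the alarm
# does not actually apply (e.g., no time for operator response).
copied = scenario_frequency(1e-1, [0.01, 0.1])  # credit wrongly taken
actual = scenario_frequency(1e-1, [0.01])       # only the valid safeguard

# Taking credit for the copied safeguard understates the scenario
# frequency by a factor of ten.
print(round(actual / copied))  # 10
```

The point of the sketch is that each copied safeguard multiplies the estimated frequency downward, so an inapplicable copied entry directly understates risk.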
The representative heuristic also can cause repetition of risk reduction recommendations for hazard scenarios deemed sufficiently similar when there may be
important but subtle differences in the scenarios that would merit different and more effective recommendations.
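The trap of judging a new scenario by its resemblance to a familiar one can be made concrete with a small sketch. The following Python fragment is purely illustrative and not from the paper; the attributes and values are invented.

```python
# Illustrative sketch (not from the paper): similarity over a few
# surface attributes is perfect, yet a single unconsidered attribute
# (plant age) differs sharply. All attributes and values are
# hypothetical.

def similarity(a, b, attrs):
    """Fraction of the listed attributes on which two scenarios agree."""
    return sum(a[k] == b[k] for k in attrs) / len(attrs)

previous = {"process": "distillation", "fluid": "flammable",
            "control": "DCS", "plant_age_years": 5,
            "likelihood": "low"}
current  = {"process": "distillation", "fluid": "flammable",
            "control": "DCS", "plant_age_years": 35}

# Judged only on surface attributes, the scenarios look identical ...
surface = similarity(previous, current, ["process", "fluid", "control"])
print(surface)  # 1.0

# ... so the representative heuristic invites reusing the "low"
# likelihood judgment. But an attribute the comparison ignored differs:
print(previous["plant_age_years"], current["plant_age_years"])  # 5 35
```

The design point is that any similarity judgment is only as good as the attributes it considers; the heuristic fails precisely when the omitted attribute is the one that matters.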
PHA facilitators can address this heuristic by focusing the attention of the team on the differences rather than the similarities between situations. Also, encouraging the team to look at situations from a different viewpoint may help them to see the
differences. A devil's advocate also can help.

5.10 Satisficing
The term is a combination of the words “satisfy” and “suffice” and describes a
situation where people make judgments that are good enough for their purposes but could be improved [Simon, 1957]. The concept was used in a theory of bounded rationality of decision making wherein a decision maker cannot determine an optimal solution owing to limits on rationality imposed by the information available to decision makers, the cognitive limitations of their minds, and the finite time available to make a decision. People may search for one good reason for making a decision and stop searching for further information when one has been found. Similarly, people may search for cues in memory until one can be found on which to
base a decision.

Virtually all decisions in PHA studies may be impacted adversely by satisficing. The principal reasons are time pressures on the team and limitations in the cognitive resources of the team members, primarily their ability to maintain concentration for extended time periods performing an activity that often is viewed as boring and repetitive. Management should try to provide as much time as possible for PHA teams to perform a study and not put teams under time pressures. PHA facilitators should avoid scheduling lengthy sessions, avoid spending undue time on minor issues, and manage repetition in studies effectively. A devil's advocate also can help address satisficing.
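The contrast between satisficing and exhaustive evaluation can be sketched in a few lines of code. The following Python fragment is purely illustrative and not from the paper; the options, their risk reduction values, and the aspiration level are hypothetical.

```python
# Illustrative sketch (not from the paper): a satisficing search stops
# at the first option that is "good enough", while exhaustive search
# evaluates every option and returns the best. All option names and
# risk reduction factors are hypothetical.

def satisfice(options, is_good_enough):
    """Return the first option meeting the aspiration level, else None."""
    for option in options:
        if is_good_enough(option):
            return option  # stop searching as soon as one is acceptable
    return None

def optimize(options, score):
    """Return the highest-scoring option (exhaustive search)."""
    return max(options, key=score)

# Hypothetical risk reduction options: (name, risk reduction factor)
options = [("alarm", 10), ("relief valve", 100), ("redesign", 1000)]

# A satisficing team accepts the first option with a factor >= 10 ...
first_ok = satisfice(options, lambda o: o[1] >= 10)
# ... while exhaustive evaluation would identify the strongest option.
best = optimize(options, lambda o: o[1])

print(first_ok[0])  # alarm
print(best[0])      # redesign
```

The sketch shows why satisficing is rational under time pressure (it examines fewer options) and also why it can leave markedly better risk reduction measures undiscovered.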
PHA facilitators must ensure that PHA team members understand that there is not an acceptable minimum level of performance for a study but rather that the team
must strive to perform a comprehensive and complete study. Thus, perfection is the performance goal, and, although it will never be achieved, all efforts must be exerted to try to reach it. The consequences of not achieving the goal must be emphasized, that is, missed scenarios resulting in fatalities, possibly of team members themselves, and other adverse impacts on equipment, property, the environment, and the company.
Ideally, team members must believe their own best interests are invested in the outcome of a study.
A catalog of cognitive biases that can adversely impact PHA studies is available
[tech.primatech.com/1/jlp1].
6.0 Overall Recommendations for Addressing Cognitive Biases in PHA
Many cognitive biases can be addressed by:

· Ensuring the PHA facilitator understands the importance of cognitive biases, knows how they can affect PHA studies, and is capable of addressing them for team members.
· Ensuring team members are aware of the phenomenon so they may be able to recognize cognitive biases in the positions taken by other team members and intervene appropriately.
· Training team members within particular domains so that knowledge overcomes cognitive bias.
· Encouraging team members to adopt an attitude of healthy skepticism towards views expressed by other team members.
· Encouraging teams to look not just for evidence to confirm expressed views but also evidence to the contrary.
· Focusing team attention on differences rather than similarities between situations.
· Encouraging teams to look at situations from a different viewpoint.
· Creating an environment in which dissenting views are sought and respected.
· Using as much tangible data and information as possible to avoid the need to rely on opinions, which may be tainted by cognitive biases.
· Employing a devil's advocate to challenge positions of team members.

7.0 Conclusions

Observations of PHA teams conducting studies indicated that decisions made
during the studies are not always rational. It has been established by cognitive psychologists that humans have evolved heuristics, or mental shortcuts, that govern judgment and decision making. People employ heuristics subconsciously and are
unaware of their use. Unfortunately, cognitive biases and erroneous decisions can result, including in PHA. This body of knowledge was researched to correlate the observations of decisions made by PHA teams with recognized cognitive biases and was used to understand why PHA teams may make erroneous decisions. Many cognitive biases can have adverse impacts on the results of PHA studies. Hazard scenarios may be missed, risks estimated incorrectly, and important recommendations for reducing the risk of catastrophic accidents may be omitted.

There is no comprehensive approach for mitigating cognitive biases. Individual self-awareness of the influence of cognitive biases on judgments and decisions essentially is not possible. Specific ways were devised and described for addressing each of the cognitive biases identified in the paper in PHA studies, together with some overall recommendations. The results of the work described in this paper and the guidelines provided can be used to improve decision making in PHA studies and reduce the chance of erroneous decisions with their attendant potentially calamitous
consequences. Use of the results of this work will help PHA facilitators and team members understand the impact of cognitive biases on PHA studies and minimize their adverse impacts.
References

1. CCPS (2008) Guidelines for Hazard Evaluation Procedures, 3rd Edition, Center for Chemical Process Safety / American Institute of Chemical Engineers.
2. Baybutt, P. (2013) Analytical Methods in Process Safety Management and System Safety Engineering – Process Hazards Analysis, in Handbook of Loss Prevention Engineering, J. M. Haight (ed), Wiley-VCH, Weinheim, Germany.
3. Plous, S. (1993) The Psychology of Judgment and Decision Making, McGraw-Hill, New York, NY.
4. Baron, J. (2008) Thinking and Deciding, 4th Edition, Cambridge University Press, New York, NY.
5. Saaty, T. L. and Peniwati, K. (2008) Group Decision Making: Drawing Out and Reconciling Differences, RWS Publications, Pittsburgh, PA.
6. Hardman, D. (2009) Judgment and Decision Making: Psychological Perspectives, Wiley Blackwell, Chichester, UK.
7. Hastie, R. and Dawes, R. M. (2010) Rational Choice in an Uncertain World, 2nd Edition, Sage Publications, Thousand Oaks, CA.
8. Levi, D. (2014) Group Dynamics for Teams, 4th Edition, Sage Publications, Thousand Oaks, CA.
9. Leathey, B. and Nicholls, D. (1998) Improving the effectiveness of HAZOP: A psychological approach, Loss Prevention Bulletin, 139, pages 8-11.
10. Baybutt, P. (2013) The role of people and human factors in performing process hazard analysis and layers of protection analysis, J. of Loss Prevention in the Process Industries, Vol. 26, pages 1352-1365.
11. Pohl, R. F. (2004) Cognitive Illusions: A Handbook on Fallacies and Biases in Thinking, Judgement and Memory, Psychology Press, Hove, UK.
12. Kahneman, D. (2011) Thinking, Fast and Slow, Farrar, Straus and Giroux, New York, NY.
13. Haselton, M. G., Nettle, D. and Andrews, P. W. (2005) The evolution of cognitive bias, in The Handbook of Evolutionary Psychology, D. M. Buss (ed), John Wiley, Hoboken, NJ.
14. Simon, H. (1957) A behavioral model of rational choice, in Models of Man, Social and Rational: Mathematical Essays on Rational Human Behavior in a Social Setting, Wiley, New York, NY.
15. Kahneman, D., Slovic, P. and Tversky, A. (eds) (1982) Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press, Cambridge, UK.
16. Gigerenzer, G. and Todd, P. M. (1999) Simple Heuristics That Make Us Smart, Oxford University Press, Oxford, UK.
17. Kahneman, D. and Tversky, A. (eds) (2000) Choices, Values, and Frames, Cambridge University Press, Cambridge, UK.
18. Gilovich, T., Griffin, D. W. and Kahneman, D. (eds) (2002) Heuristics and Biases: The Psychology of Intuitive Judgment, Cambridge University Press, Cambridge, UK.
19. Gigerenzer, G. (2007) Gut Feelings: The Intelligence of the Unconscious, Viking, New York, NY.
20. Lehrer, J. (2009) How We Decide, Houghton Mifflin Harcourt, New York, NY.
21. Baybutt, P. (2016) Design intent for hazard and operability (HAZOP) studies, Process Safety Progress, Volume 35, Issue 1, pages 36-40.
PHA teams were observed to make irrational decisions that can increase the risks of catastrophic accidents. These observations were correlated with cognitive biases recognized by psychologists.
Guidance was developed to reduce the adverse impacts of cognitive biases on PHA studies.