IFAC Conference on Manufacturing Modelling, Management and Control
June 28-30, 2016. Troyes, France
Available online at www.sciencedirect.com
ScienceDirect
IFAC-PapersOnLine 49-12 (2016) 1673–1678

A Hybrid Model for Human Error Probability Analysis

F. De Felice*, A. Petrillo**, F. Zomparelli***

*University of Cassino and Southern Lazio, Cassino, Italy (e-mail: [email protected])
**University of Naples “Parthenope”, Napoli, Italy (e-mail: [email protected])
***University of Cassino and Southern Lazio, Cassino, Italy (e-mail: [email protected])
Abstract: The aim of the present research is to propose a hybrid model for human error probability analysis based on the Cognitive Reliability and Error Analysis Method (CREAM) and the Systematic Human Error Reduction and Prediction Approach (SHERPA). The model aims to provide a theoretical framework to understand human behavior and to predict error probability. The model is based on a fundamental distinction between competence and control that offers a way of describing how performance depends on context. The most important result of this work is to provide a model that can be used in the preventive phase of an accident in order to mitigate the damages.

© 2016, IFAC (International Federation of Automatic Control) Hosting by Elsevier Ltd. All rights reserved. Peer review under responsibility of International Federation of Automatic Control. 10.1016/j.ifacol.2016.07.821
Keywords: Human error, safety, disaster management, risk analysis

1. INTRODUCTION

The safety of all countries depends on the continuous and coordinated operation of a group of infrastructures which are defined as “Critical Infrastructures”. A critical infrastructure is a system, a resource, a process, or a combination of these elements whose destruction, interruption or unavailability weakens the normal operation, safety and many other factors of a country (Bruzzone et al., 2014). In the last decades, many studies have been made to improve the safety conditions in critical infrastructures. Generally, when emergencies in critical infrastructures are analyzed, only the system reliability is considered (De Felice and Petrillo, 2012). But system reliability needs to be considered in conjunction with the human factor, through the so-called Human Reliability Analysis (HRA). In fact, today’s increasingly complex industrial systems require highly skilled operators, who need to control several parameters. Human operators often have to perform complex cognitive tasks, in various situations, that automatic devices are not able to carry out (Fujita and Hollnagel, 2004). This implies that human reliability should be ensured. One way of avoiding such errors is to develop specific systems (Cacciabue, 2004). Over the years, various methods have been developed for Human Reliability Analysis (Yang et al., 2013). These methods are divided into two categories: the first and the second generation methods. The main first generation methods are: Technique for Human Error Rate Prediction (THERP) (Swain and Guttman, 1983), Accident Sequence Evaluation Program (ASEP) (Swain, 1987), Systematic Human Error Reduction and Prediction Approach (SHERPA) (Embrey, 1986) and Human Cognition Reliability (HCR) (Hannaman et al., 1984). They are built around the pivotal concept of human error: because of the inherent deficiencies of humans, they naturally fail to perform tasks, just like mechanical, electrical and structural components do. The most important methods of the second generation are: the CREAM method (Hollnagel, 1998), A Technique for Human Error Analysis (ATHEANA) (Cooper et al., 1994; USNRC, 2000), and SPAR-H (Gertman et al., 2004). It is important to note that human error may occur in any stage of human information processing, but errors are generally found more frequently in the stages of decision making and executing various actions. In this context, Human Error Identification (HEI) methods are used to identify latent human or operational errors that may arise as a result of human-machine interactions in complex systems, and to identify the causal factors, consequences, and recovery strategies associated with the errors (Stanton et al., 2006). HEI methods can be categorized into two types, i.e., qualitative and quantitative approaches. Various such qualitative approaches exist, including SHERPA and CREAM. Some relevant research based on the above methods is the following: Di Pasquale et al. (2015) proposed a SHERPA model to estimate human reliability; Su et al. (2014) presented a computational model to handle dependence in human reliability analysis. The aim of the study is to develop a hybrid model for human reliability analysis that overcomes the limitations of the SHERPA and CREAM models.

2. BRIEF OVERVIEW OF CREAM AND SHERPA METHODS

2.1 CREAM Method

The CREAM methodology is a technique used in HRA for the purpose of evaluating the probability of a human error occurring throughout the completion of a specific task (De Felice et al., 2013). From such analysis, measures can then be taken to reduce the likelihood of errors occurring within a system and therefore lead to an improvement in the overall levels of safety. The cognitive model identified for the CREAM methodology is called “CoCoM” (Contextual
Control Model). The CoCoM model is based on four cognitive function definitions: observation, interpretation, planning and execution. CREAM divides error events into observational errors (phenotypes) and non-observational ones. Phenotypes, which are known as error modes, are the errors that have external manifestations. Non-observational errors are the errors that do not have an external appearance and occur during the human thinking process. CREAM considers the phenotypes to be the consequence of non-observational errors through a certain transformation of cause to effect, while the latter are considered the ultimate causes which lead to the human errors.

The CREAM method defines nine Common Performance Conditions (CPCs): 1) adequacy of organization; 2) working conditions; 3) adequacy of MMI and operational support; 4) availability of procedures/plans; 5) number of simultaneous goals; 6) available time; 7) time of day; 8) adequacy of training and preparation; and 9) crew collaboration quality. Each factor has several levels to reflect its effect on human performance. In order to reflect the scenario effects on human cognitive behaviors, the CREAM method defines four cognitive control modes, which are scrambled, opportunistic, tactical and strategic (Figure 1). The procedure to assess the error probability is to count, over the nine CPCs, the levels that contribute positively (∑ improved) and those that contribute negatively (∑ reduced), getting a pair of values that is located in Figure 1 to identify one of the four control modes: 1) Scrambled: unpredictable situation, operator does not have control; 2) Opportunistic: limited actions, lack of knowledge and staff competence; 3) Tactical: planned actions, operator knows the rules and procedures of the system; and 4) Strategic: operator has a long time to plan his work.
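The basic CREAM scoring step described above can be sketched in a few lines of Python. The CPC effect labels below are a hypothetical assessment, not the values of the paper's case study, and the final mapping of the (∑ improved, ∑ reduced) pair to a control mode is left to the analyst, since it requires locating the pair on the Figure 1 diagram:

```python
# Sketch of the basic CREAM scoring step: tally how many CPC
# assessments are expected to improve vs. reduce performance
# reliability. The effect assigned to each CPC below is a
# hypothetical example assessment; the resulting pair
# (sum_improved, sum_reduced) is then located on the Figure 1
# diagram to select one of the four control modes.

assessment = {  # CPC name -> expected effect of its assessed level
    "Adequacy of organisation": "reduced",
    "Working conditions": "not significant",
    "Adequacy of MMI and operational support": "not significant",
    "Feasibility of procedures and plans": "reduced",
    "Number of simultaneous goals": "reduced",
    "Available time": "reduced",
    "Time of day": "reduced",
    "Adequacy of training and preparation": "not significant",
    "Crew collaboration quality": "improved",
}

sum_improved = sum(1 for e in assessment.values() if e == "improved")
sum_reduced = sum(1 for e in assessment.values() if e == "reduced")
print(sum_improved, sum_reduced)  # -> 1 5
```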
Table 1 shows the error probability range associated with each control mode.

Table 1. Error Probability Range

Control Modes     Error Probability Range
Strategic         0.5E-5 < p < 1E-2
Tactical          1E-3 < p < 1E-1
Opportunistic     1E-2 < p < 0.5E0
Scrambled         1E-1 < p < 1E0

Fig. 1. Correlation between CPCs scores and control modes

2.2 SHERPA Method

The SHERPA technique is considered the most accurate technique available today for estimating human errors (Harris et al., 2005). The method includes a failure mode taxonomy (action errors, control errors, recovery errors, communication errors, choice errors) related to a behavioral taxonomy, and it is applied following the hierarchical task analysis of the considered scenario. The SHERPA model indicates, starting from the lowest hierarchical level of tasks, which errors are most credible in the analysed scenario. Subsequently, starting from the lowest level, each task is classified according to the behavioral taxonomy. The analyst uses field experience and the error taxonomy in order to determine the most credible failure mode for each specific task. For each failure mode identified, the analyst describes the problem and establishes the consequences. The analyst then estimates the error occurrence probability, typically using a qualitative scale: Low – Medium – High. The same rating scale is used to assess the error severity. The final step is to propose and plan strategies for error reduction, which usually consist of changing the process and the system.

Each task is divided into activities and sub-activities; for each task an error taxonomy is developed, and for each error the probability, the consequences in the process and the remedies are determined. The analysis is developed in three modules: 1) Interaction man–machine and man–man; 2) Failure mode identification (action, control, recovery, communication, choice); and 3) Human Error Probability (HEP) quantification. HEP is determined by the nominal error probability (P0) and the performance factors (PF), as shown in Equation 1. The performance factors and the nominal error probability are defined by the generic tasks proposed by an HRA study (Kirwan, 1994).

P = P0 * ∏i PFi    (1)

The performance factor is expressed by Equation 2, where CFEi is the error condition and Api is the factor weight.

PFi = (CFEi − 1) * Api + 1    (2)

Then the safety factor (Fs), expressed by Equation 3, is determined taking into account the error probability (Pi) and the number of activities (n) entrusted to man.

Fs = 1 / √(∑i=1..n Pi²)    (3)
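Equations 1 to 3 can be sketched directly in Python. The nominal probability P0 and the (CFE, Ap) pairs below are illustrative placeholders, not values taken from the paper:

```python
import math

def performance_factor(cfe, ap):
    """Equation 2: PF_i = (CFE_i - 1) * Ap_i + 1."""
    return (cfe - 1) * ap + 1

def human_error_probability(p0, factors):
    """Equation 1: HEP = P0 times the product of the PF_i."""
    hep = p0
    for cfe, ap in factors:
        hep *= performance_factor(cfe, ap)
    return hep

def safety_factor(probabilities):
    """Equation 3: Fs = 1 / sqrt(sum of Pi^2) over the n activities."""
    return 1.0 / math.sqrt(sum(p * p for p in probabilities))

# Illustrative values only (not from the case study).
p0 = 3.0e-3
factors = [(2.0, 0.5), (1.5, 1.0)]   # (CFE_i, Ap_i) pairs
hep = human_error_probability(p0, factors)
fs = safety_factor([hep, 1.0e-2, 5.0e-3])
```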
Finally, the benefit/impact ratio is expressed in Equation 4. A multicriteria analysis is developed to address design choices towards solutions that maximize the value ratio.

πn = Fsn / In    (4)
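Equation 4 weighs the safety factor of each design alternative against its impact, and the multicriteria step simply keeps the alternative with the largest value ratio. The candidate (Fs, I) pairs below are hypothetical:

```python
# Hypothetical design alternatives: (name, safety factor Fs_n, impact I_n).
candidates = [
    ("alternative A", 76.0, 10.0),
    ("alternative B", 120.0, 25.0),
    ("alternative C", 60.0, 5.0),
]

# Equation 4: value ratio pi_n = Fs_n / I_n; keep the maximizing design.
best = max(candidates, key=lambda c: c[1] / c[2])
print(best[0])  # -> alternative C
```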
The SHERPA advantages are: 1) Structured and comprehensive approach to human error estimation; 2) Taxonomy that suggests potential errors; 3) Deployment speed; 4) Easy to learn; 5) Exhaustive; and 6) Adaptable to all contexts. The main drawback is the difficulty of implementation for complex tasks, so the SHERPA method is integrated with other instruments to improve it.

3. CASE STUDY: A HYBRID MODEL BASED ON CREAM/SHERPA METHOD

In the present section a case study within a company that is a leading regenerator of used motor oil is analysed. Figure 2 shows the plant layout. The analysed scenario is the explosion of a tank (Figure 3).

Fig. 2. Plant layout

Fig. 3. The system under study

The methodological approach is shown in Figure 4.

Fig. 4. Methodological approach

3.1 Phase#1 - Definition of Emergency Procedure

First, it is important to determine the different areas inside and outside the site, classifying them according to their explosion hazard. It is essential to assess the sources of ignition. Furthermore, it is important to define the procedures for prevention and explosion safety (as shown in Figure 5).

Fig. 5. Emergency measures in case of explosion

3.2 Phase#2 - CREAM and SHERPA Method

CREAM approaches the quantification in two steps by providing a basic and an extended method, as shown in Figure 6.
The expected effect of each CPC on the reliability of performance is shown in Table 2.

Table 2. CPCs Table Representation and Evaluation

Adequacy of organisation: Very efficient → Improved; Efficient → Not significant; Inefficient → Reduced; Deficient → Reduced
Working conditions: Advantageous → Improved; Compatible → Not significant; Incompatible → Reduced
Adequacy of man-machine interaction and operational support: Supportive → Improved; Adequate → Not significant; Tolerable → Not significant; Inappropriate → Reduced
Feasibility of procedures and plans: Appropriate → Improved; Acceptable → Not significant; Inappropriate → Reduced
Number of simultaneous goals: Fewer than capacity → Not significant; Matching current capacity → Not significant; More than capacity → Reduced
Available time: Adequate → Improved; Temporarily inadequate → Not significant; Continuously inadequate → Reduced
Time of day: Day time → Not significant; Night time → Reduced
Adequacy of training and preparation: Adequate (high experience) → Improved; Adequate (low experience) → Not significant; Inadequate → Reduced
Crew collaboration quality: Very efficient → Improved; Efficient → Not significant; Inefficient → Not significant; Deficient → Reduced
From Table 2 is: ∑ Improved = 2 and ∑ Reduced = 5. Cognitive functions involved in the explosion emergency are observation, interpretation and execution. For they have been assigned the following failures described in order of the sequence, which have occurred over the years in the reference: O3: Comments not executed due to oversights; I2: Bugs decision because analysis is not performed or incomplete; I3: Interpretation not timely; E2: Execution of actions not timely; E3: Perform actions on incorrect items; E5: Do not perform actions. Starting from the table of CFPs corrective factors and nominal values of CFPs, are determined “weighting factors” to adjust nominal values of CFPs and obtain the final values of Cognitive Error Probability (Table 3). From Table 3 is possible determine the value of Cognitive Failure Probability. The probability value is included in the “opportunistic” control mode range (1.0E-2 < p < 0.5E0).
Fig. 6. CREAM Method
CPCs
experience) Inadequate Very efficient
Table 3. Nominal values of CFPs and adjusted CFPs

Error Mode (nominal CFP) | Weighting factor | Adjusted CFP
O3 (3.0E-3) | 9.6 | 2.88E-2
I2 (1.0E-2) | 0.96 | 0.96E-2
I3 (1.0E-2) | 0.96 | 0.96E-2
E2 (3.0E-3) | 28.8 | 8.6E-2
E3 (5.0E-4) | 28.8 | 1.44E-2
E5 (3.0E-2) | 28.8 | 8.64E-1
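The adjustment step can be sketched in a few lines. This is a minimal sketch, assuming (as the Table 3 values suggest) that each adjusted CFP is simply the nominal CFP multiplied by its weighting factor; the dictionaries restate the paper's data and are not a library API:

```python
# Nominal CFPs and weighting factors restated from Table 3 (assumption:
# adjusted CFP = nominal CFP * weighting factor, which the table values follow).
nominal_cfp = {"O3": 3.0e-3, "I2": 1.0e-2, "I3": 1.0e-2,
               "E2": 3.0e-3, "E3": 5.0e-4, "E5": 3.0e-2}
weighting_factor = {"O3": 9.6, "I2": 0.96, "I3": 0.96,
                    "E2": 28.8, "E3": 28.8, "E5": 28.8}

# Scale each nominal value by its weighting factor.
adjusted_cfp = {mode: nominal_cfp[mode] * weighting_factor[mode]
                for mode in nominal_cfp}

for mode, value in sorted(adjusted_cfp.items()):
    print(f"{mode}: {value:.3e}")
```

Running this reproduces the last column of Table 3 (e.g. O3 → 2.88E-2, E5 → 8.64E-1).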
The scenario analysis using the CREAM methodology revealed some weaknesses. The first is that CREAM ignores the variety of tasks assigned to the operators. Moreover, CREAM assumes that the error mode associated with one cognitive function is independent of those relating to the other functions. For the above reasons the SHERPA method was applied, according to the procedure established in Figure 7.
Fig. 7. SHERPA Method

The following phases of the process were evaluated: 1) Shut-down of the plants; 2) Implementation of safety procedures; 3) Ordered execution of the operations envisaged in the filling process; 4) Compression of gas to moderate pressures; and 5) Continuous monitoring of the hydrogen pressure level. Table 4 shows the probability vector of human error.

Table 4. Probability vector of human error

Step | Error Mode | CFE | Weight | PF | P0 | P
1 | Simple task | 8 | 0.4 | 3.8 | 0.16 | 0.608
2 | Routine task | 3 | 0.8 | 2.6 | 0.2 | 0.52
3 | Simple task | 5 | 0.2 | 1.8 | 0.16 | 0.288
4 | Simple task | 11 | 0.4 | 5 | 0.16 | 0.8
5 | Respond correctly to a command of the system | 3 | 1 | 3 | 0.2E-5 | 0.6E-5

It is possible to improve the management of the emergency process by maximizing the ratio between benefits and impacts. However, the SHERPA method ignores the operator's control mode and does not include the positive effects of the internal and external environment.

3.3 Phase#3 - A Hybrid Model

Results show that both the CREAM and SHERPA methods have some weaknesses. For this reason, a hybrid model is presented in this section. The methodological approach is shown in Figure 8.
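The final column of Table 4 is consistent with the nominal probability P0 being scaled by the performance factor PF, i.e. P = P0 · PF. A minimal sketch under that assumption (the rows restate the table; the relation itself is inferred from the values, not stated as a formula in the paper):

```python
# Rows restated from Table 4: (step, error mode, CFE, weight, PF, P0).
# Assumption: the reported P equals P0 * PF for every row.
steps = [
    (1, "Simple task", 8, 0.4, 3.8, 0.16),
    (2, "Routine task", 3, 0.8, 2.6, 0.2),
    (3, "Simple task", 5, 0.2, 1.8, 0.16),
    (4, "Simple task", 11, 0.4, 5.0, 0.16),
    (5, "Respond correctly to a command of the system", 3, 1.0, 3.0, 0.2e-5),
]

# Compute P for each step by scaling P0 with PF.
probabilities = [p0 * pf for (_, _, _, _, pf, p0) in steps]
for (step, *_), p in zip(steps, probabilities):
    print(f"step {step}: P = {p:.3g}")
```

The computed values match the P column (0.608, 0.52, 0.288, 0.8, 0.6E-5).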
At this point it is possible to define the safety factor (Fs) in order to take into account the extent of both probability estimates. From Equation 3, Fs = 0.85. It is also possible to carry out a multicriteria analysis (Table 5).
Fig. 8. Flow chart of Hybrid Model

The general framework is the same as that of the SHERPA method, but the first striking feature of this model is that it inherits only the basic process of the CREAM methodology. In addition, the summation of the ameliorative effects must be greater than or equal to 5. Finally, it is possible to evaluate a new probability vector (as shown in Table 6).
Table 5. Multicriteria Analysis (π)

  | Benefits | Impacts
1 | Fs1 | I1
2 | Fs2 | I2
According to the company manager, the following values were established: I1 = 0.50; Fs2 = 0.70 and I2 = 0.40. Comparing the ratio between benefits and impacts gives:

π1 = Fs1 / I1 = 1.76
π2 = Fs2 / I2 = 1.75

Table 6. New probability vector of human error

Step | Error Mode | P | P0
1 | Simple task | 0.608 | 0.608
2 | Routine task | 0.52 | 0.52
3 | Simple task / Routine task | 0.288 | 0.52 × 0.288 = 0.15
4 | Simple task | 0.8 | 0.8
5 | Respond correctly to a command of the system | 0.6E-5 | 0.6E-5
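In the hybrid vector, step 3 merges two dependent tasks, and its new P0 is the product of the step-2 and step-3 probabilities from Table 4. A one-line arithmetic check (values restated from the tables, nothing paper-specific assumed beyond the product rule stated in the text):

```python
# Step 3 of the hybrid vector combines the dependent steps 2 and 3:
# its probability is the product of their individual probabilities.
p2, p3 = 0.52, 0.288
p_combined = p2 * p3
print(round(p_combined, 2))  # -> 0.15, as reported in Table 6
```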
It is important to note that for step 3 the probability of error is equal to the product of P2 and P3. In this case, from Equation 3, Fs = 0.88. According to the company manager, the following values were established: I1 = 0.50; Fs2 = 0.70 and I2 = 0.40. Comparing benefits and impacts gives:

π1 = Fs1 / I1 = 1.76
π2 = Fs2 / I2 = 1.75
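The benefit/impact comparison can be sketched as follows. The values are restated from the text; pairing Fs1 with the hybrid safety factor 0.88 is an assumption consistent with the reported π1 = 1.76:

```python
# Benefit/impact ratios for the two alternatives; the larger ratio wins.
fs1, i1 = 0.88, 0.50   # hybrid safety factor and its impact (assumed pairing)
fs2, i2 = 0.70, 0.40

pi1 = fs1 / i1
pi2 = fs2 / i2
best = 1 if pi1 > pi2 else 2
print(f"pi1 = {pi1:.2f}, pi2 = {pi2:.2f} -> alternative {best}")
```

Since 1.76 > 1.75, alternative 1 is selected, matching the text's conclusion.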
Since also in this case π1 > π2 (1.76 > 1.75), the choice made is confirmed.

6. CONCLUSIONS

Human errors are among the major causes of industrial accidents. The complexity of the system and the lack of reliable data limit the ability of human operators to provide precise and significant statements about system behaviour. Thus, in the present research, a hybrid method for estimating the probability of a human erroneous action in specific industrial and working contexts has been developed. The hybrid method is based on the CREAM and SHERPA methodologies. The results obtained are very satisfactory and within the range of expectations compared to the probability intervals provided by CREAM and SHERPA. The implementation of the hybrid model encourages operators to increase their control over the way they apply their skills. The major advantage of the new approach is that it makes it possible to quantify the probability of error in a simple and intuitive way. Furthermore, beyond the industrial field, the model appears promising for application to other fields. Future research aims to integrate fuzzy logic into the human error analysis.

REFERENCES

Bruzzone, A.G., Frascio, M., Longo, F., Chiurco, A., Zanoni, S., Zavanella, L., Fadda, P., Fancello, G., Falcone, D., De Felice, F., Petrillo, A., Carotenuto, P. (2014). Disaster and emergency management simulation in industrial plants. Proceedings of the International Conference on Modeling and Applied Simulation (EMSS 2014), September 10-12, 2014, Bordeaux, France, pp. 649-656.
Cacciabue, P. (2004). Human error risk management for engineering systems: a methodology for design, safety assessment, accident investigation and training. Reliability Engineering and System Safety, Vol. 83, pp. 229-240.
Cooper, S.E., Ramey-Smith, A.M., Wreathall, J., Parry, G.W., Bley, D.C., Luckas, W.J., Taylor, J.H., Barriere, M.T. (1994). A Technique for Human Error Analysis (ATHEANA), NUREG/CR-6093. US Nuclear Regulatory Commission, Washington, DC.
De Felice, F., Petrillo, A. (2012). Methodological approach to reduce train accidents through a probabilistic assessment. International Journal of Engineering and Technology, Vol. 4, pp. 500-509.
De Felice, F., Petrillo, A., Carlomusto, A., Romano, U. (2013). Modelling application for cognitive reliability and error analysis method. International Journal of Engineering and Technology, Vol. 5, Issue 5, pp. 4450-4464.
Di Pasquale, V., Miranda, S., Iannone, R., Riemma, S. (2015). A Simulator for Human Error Probability Analysis (SHERPA). Reliability Engineering and System Safety, Vol. 139, pp. 17-32.
Embrey, D.E. (1986). SHERPA: A systematic human error reduction and prediction approach. Proceedings of the International Topical Meeting on Advances in Human Factors in Nuclear Power Systems, Knoxville, Tennessee. American Nuclear Society, La Grange Park, Illinois.
Fujita, Y., Hollnagel, E. (2004). Failures without errors: quantification of context in HRA. Reliability Engineering and System Safety, Vol. 83, pp. 145-151.
Gertman, D.I., Blackman, H.S., et al. (2004). The SPAR-H human reliability analysis method (NUREG/CR-6883). US Nuclear Regulatory Commission.
Hanaman, G.W., Spurgin, A.J., et al. (1984). Human cognitive reliability model for PRA analysis. Draft Report NUS-453, Electric Power Research Institute.
Harris, D., Neville, A., et al. (2005). Using SHERPA to Predict Design-Induced Error on the Flight Deck. Aerospace Science and Technology, Vol. 9, pp. 525-532.
Hollnagel, E. (1998). Cognitive Reliability and Error Analysis Method (CREAM). Elsevier Science, London.
Kirwan, B. (1994). A Guide to Practical Human Reliability Assessment. Taylor & Francis, London.
Stanton, N.A., Harris, D., Salmon, P.M., Demagalski, J.M., Marshall, A., Young, M.S., Dekker, S.W.A., Waldmann, T. (2006). Predicting design induced pilot error using HET - a new formal human error identification method for flight decks. Aeronautical Journal, Vol. 110, pp. 107-115.
Su, X., Mahadevan, S., Xu, P., Deng, Y. (2014). Inclusion of task dependence in human reliability analysis. Reliability Engineering and System Safety, pp. 41-55.
Swain, A.D., Guttmann, H.E. (1983). Handbook of human reliability analysis with emphasis on nuclear power plant applications (NUREG/CR-1278). US Nuclear Regulatory Commission, Washington, DC.
Swain, A.D. (1987). Accident sequence evaluation program human reliability analysis procedure, NUREG/CR-4772.
USNRC (2000). Technical basis and implementation guidelines for A Technique for Human Event Analysis (ATHEANA). US Nuclear Regulatory Commission.
Yang, Z.L., Bonsall, S., Wall, A., Wang, J., Usman, M. (2013). A modified CREAM to human reliability quantification in marine engineering. Ocean Engineering, Vol. 58, pp. 293-303.