
9th IFAC Conference on Control Applications in Marine Systems The International Federation of Automatic Control September 17-20, 2013. Osaka, Japan

Risk Assessment of Human Machine Interaction for Control and eNavigation Systems of Marine Vessels

Axel Hahn*, Andreas Lüdtke**

*University of Oldenburg, 26121 Oldenburg, Germany (Tel: +49 441 798 4480; e-mail: [email protected]). **OFFIS, 26102 Oldenburg, Germany (e-mail: [email protected])

Abstract: The design of human machine interaction for modern ship electronic systems has a significant impact on the safety of seafaring: it directly affects the ability of the nautical personnel to recognize the situation efficiently and to decide correctly. Considering efficiency and safety in the early design phases of eNavigation systems and other bridge equipment improves the engineering process and reduces design errors. Model driven design is an efficient option for developing technical systems; it has become popular because the models can be reused for verification, code generation and other tasks, which leads to efficient processes. Building on earlier research on system engineering and on risk assessment of human machine interaction using cognitive simulation, this paper presents an integrated model based approach for analysing the safety of human machine interactions. It combines technical, process and cognitive models for simulation based efficiency and risk assessment in bridge design.

Keywords: Navigation, Safety Analysis, Risk Assessment, Marine systems, Simulators, Human Machine Interface.

1. INTRODUCTION

Seafaring is, and always was, a joint undertaking between humans and their technology. Besides the impact of nature, such as wind and waves, the reliability of the technical equipment and its correct usage ensure safe voyaging. This is still true with the introduction of eNavigation technology. The eNavigation implementation process is accompanied by IMO's NAV and COMSAR sub-committees, as well as the International Hydrographic Organization (IHO) and the International Association of Lighthouse Authorities (IALA). These institutions performed a comprehensive gap analysis as part of their development of a joint implementation plan for eNavigation. In a ten year survey, [GP07] investigated the causes of collisions and groundings; human error was the primary cause in 60% of the cases. The gap analysis of the IMO therefore addresses numerous aspects of human machine interaction [IMO12], e.g. the absence of a structured communication link to notify incorrect operation of shipboard and/or shore-based systems, together with a lack of intuitive human-machine interfaces for communication and navigation means. Furthermore, the analysis revealed that existing performance standards/guidelines are not applied or are missing, such as guidelines for usability evaluation [IMO12]. Analysing these gaps, the IMO working groups identified user interface related requirements. They request:

• Ergonomically improved and harmonized bridge and workstation layout
• Single-entry of reportable information in a single-window solution
• Integration and presentation of available information in graphical displays
• Provide an administrative human machine interface task concept for identifying updates and setting of presentation rules
• Implement a harmonized presentation concept for the information exchanged, including standard symbols and text support, taking into account human factors and ergonomics design principles to ensure useful presentation and prevent overload
• Develop a holistic presentation library as required to support accurate presentation across displays
• Harmonization of conventions and regulations for equipment
• Improved display of the status of available data and indication of available updates
• Task-based information management

These requirements indicate why equipment providers should perform a comprehensive usability and risk assessment of their products. IMO MSC Circular 878 states: 'A single person error must not lead to an accident. The situation must be such that errors can be corrected or their effect minimised. Corrections can be carried out by equipment, individuals or others. This involves ensuring that the proposed solution does not rely solely on the performance of a single individual.' Adaptive interfaces may provide the right information at the right time without inducing information overload. With a research background in system engineering and model driven design, we propose the usage of models for usability and risk assessment in model driven engineering processes. During early design stages, adequate models of the product under development can be used to simulate the usage of the system in order to ensure safety and usability requirements. We bring together system engineering with the simulation of cognitive models to build a risk assessment system that includes humans in control or execution functions.


The paper introduces a simulation environment to evaluate the product design, based on several simulators that simulate traffic, vessel and system response as well as human behaviour. We then make a short digression on model based design, because we want to demonstrate how models from the first technical system design can be reused for usability analysis in a co-simulation system. We propose a simulation environment and present the idea of using cognitive models to simulate seafarers' (possibly erratic) behaviour. Engineers can thus make a first usability assessment in early engineering phases and avoid expensive redesign in later design phases.

2. MODEL DRIVEN ENGINEERING AND VERIFICATION OF ENAVIGATION EQUIPMENT

Engineering new systems requires a broad understanding of the technologies to be selected and applied in the design and of methodologies to handle the complexity of the undertaking. Engineering therefore applies methodologies (to define tasks and their order) as well as methods and tools (to support the tasks and how they are done) in addition to technological knowledge [PWL07]. Engineering itself is an iterative process of synthesis and analysis tasks. During synthesis, concepts and technologies are selected and applied and the design is elaborated: the system is under design. Engineers then validate (does the system fulfil the right requirements?) and verify (are the requirements implemented correctly?) their design. By validating and verifying as early and as iteratively as possible, engineers can reduce cost and save time. In electrical engineering, Bell Laboratories introduced the concept of systems engineering in the 1940s [Sch56]. Understanding the product as a system with dedicated sub-elements, a system border and defined relationships helps to manage complexity. With the advent of technologies to describe elements and relationships in a reusable way by means of computer models, this approach also became popular in other engineering domains.

Reusable computer models of the system under design (the system model) allow a continuous flow of information between the different tasks and a simple implementation of the synthesis/analysis loop mentioned above. Paying attention to the early phases of system design (identifying and validating/verifying the concepts of the product) reduces the risk of later design changes, which are mostly cost intensive. Using reusable models during the early phases that are later usable for design, validation and verification is called frontloading and aims at improving design efficiency. The usage of models and of implementation specifications automatically generated from them is called model driven design. The IMO Formal Safety Assessment Methodology (see Fig. 1) requires hazard identification and a subsequent risk assessment [IMO02]. It improves the engineering process to apply this assessment as early as possible and iteratively during the design process, in order to identify problems and follow the next steps of the FSA methodology as a concurrent process. This has to address all kinds of hazards. All systems including humans and machinery are sensitive to errors induced by human machine interaction problems (e.g. derived from usability problems). The starting point has to be an understanding of the HMI induced hazards and a check of the risks during system design. The risks and the vulnerability of the system have to be checked systematically at every design stage. Risks overlooked in the first design concepts may never be detected, or may be detected too late, inducing high costs due to required design changes.

Fig. 1. IMO Formal Safety Assessment Methodology

Based on this requirement the authors propose a test bed for the required risk assessment, to be implemented in a model driven design process by means of simulation systems that evaluate the designs. The system models are used by the simulator to systematically check hazards. A requirement with regard to the models is that they are executable in a simulation environment.

3. SIMULATION SYSTEM

For the risk assessment we propose the approach depicted in Figure 2. The figure shows that the system model designed during the engineering process is analysed by a co-simulation implemented in a joint simulation environment. Embedded in the engineering process, the objective is to analyse the actual system model under design. For this, the potential hazards are identified as requested by the FSA methodology (upper left of Figure 2). The new system is tested in a simulation of the environment of the system: the vessel in its actual traffic/manoeuvre situation (right simulator in the simulation environment, which uses vessel and traffic models). The system may be used cooperatively by several users. They are implemented as agents in the simulation (left element in the environment in Figure 2). Cognitive models define the behaviour of the agents with respect to the normative processes defined by regulations. The embedding of the system model in the simulation depends on the phase of the engineering process. During the early design phases, first functional models of the system can be analysed with the simulator. User agents directly apply a functional model of the system that is fed by a simulation of the environment of the vessel. In later design phases the system is elaborated and the user interface may be fully designed. Its usage then has to be analysed in the context of a specific interface layout. The user interface specification is used to generate a virtual representation in a virtual bridge environment to be used by moving and interacting agents. The functional implementation of the new control or eNavigation system can run as a software sub-system which is embedded in the simulation (software in the loop). Input from sensor systems and from communication with other systems is generated by the simulator. The functional implementation has to be connected to the virtual representation of the user interface in the virtual bridge to communicate with the user agents (see Figure 2). The identified hazards are used by the simulation environment to systematically seek risky/hazardous situations. The behaviour of the system is recorded for further consideration in the engineering process.
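As an illustration only (the paper does not prescribe an implementation, and all names below are invented for this sketch), the core of such a simulation-based check could be organised as a loop in which observer components evaluate the identified hazards against the simulated state and record every match for the engineering protocol:

    # Illustrative sketch: hazard observers watch attributes of a simulated
    # scenario and record hazardous situations for later analysis.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class Hazard:
        name: str                                       # e.g. "CPA below safety margin"
        condition: Callable[[Dict[str, float]], bool]   # evaluated on the scenario state

    @dataclass
    class RiskAssessmentRun:
        hazards: List[Hazard]
        protocol: List[dict] = field(default_factory=list)  # recorded hazardous situations

        def observe(self, t: float, state: Dict[str, float]) -> None:
            for hazard in self.hazards:
                if hazard.condition(state):
                    self.protocol.append({"time": t, "hazard": hazard.name, "state": dict(state)})

    def run_scenario(step_environment, step_user_agents, hazards, t_end, dt=1.0):
        """Co-simulate environment and user agents; log hazardous situations."""
        run = RiskAssessmentRun(hazards)
        state, t = {}, 0.0
        while t < t_end:
            state = step_environment(state, dt)   # traffic / vessel response
            state = step_user_agents(state, dt)   # normative or cognitive agent actions
            run.observe(t, state)
            t += dt
        return run.protocol

A run of this kind yields a protocol of hazardous situations that can be traced back to the design decisions under test.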


Fig. 2. Simulation based Risk Assessment

The approach requires a co-simulation of different simulators, and the embedded software requires software-in-the-loop simulation. For the co-simulation we use a co-simulation run time environment as depicted in Figure 3. In our case it is a High Level Architecture (HLA) compliant implementation [DFW97], whose components and simulators are described in the following. The basis is a run time system with an HLA-conform run time infrastructure interface (RTI). For easier access, the integration of the different simulation systems is done by wrappers that map the native simulator interface to the RTI. A wrapper in this case is a software component that provides the functionality needed to combine different simulation systems. HLA offers synchronisation and data exchange during the simulation and offers shared objects which can be accessed by the different simulators to align their internal models. Signals and shared objects are defined by HLA Object Model Templates (OMT).
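To make the wrapper idea concrete, the sketch below shows one possible shape of such an adapter. The RTI class here is a deliberately simplified stand-in for an HLA run time infrastructure, not the IEEE 1516 API, and the native simulator calls are assumptions; only the pattern of mapping a native interface onto shared federation objects reflects the text above.

    # Simplified sketch of a simulator wrapper; MiniRTI is a stand-in for an
    # HLA run time infrastructure, not the real IEEE 1516 API.
    class MiniRTI:
        """Toy federation bus: shared attribute values keyed by (object, attribute)."""
        def __init__(self):
            self.shared = {}                         # e.g. ("ownShip", "heading") -> value
        def publish(self, obj, attr, value):
            self.shared[(obj, attr)] = value
        def read(self, obj, attr, default=None):
            return self.shared.get((obj, attr), default)

    class VesselSimulatorWrapper:
        """Maps an assumed native vessel simulator interface onto the shared object model."""
        def __init__(self, native_sim, rti):
            self.sim = native_sim                    # hypothetical native simulator object
            self.rti = rti
        def advance(self, dt):
            # pull commands from the federation into the native simulator ...
            rudder = self.rti.read("ownShip", "commandedRudder", 0.0)
            self.sim.set_rudder(rudder)              # assumed native call
            self.sim.step(dt)                        # assumed native call
            # ... and publish the resulting state back as shared objects
            self.rti.publish("ownShip", "heading", self.sim.heading)
            self.rti.publish("ownShip", "speed", self.sim.speed)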

The environment of the analysed system is simulated by physical simulation engines; for example, a marine vessel simulation system for the assessment of vessel control applications is used and extended by an n-body simulator to test assistance systems, e.g. for crane operation. The normative behaviour is described by process specifications. They may be directly executed by agents that control persons or equipment. Agents representing persons can also be implemented in CASCaS (Cognitive Architecture for Safety Critical Task Simulation). CASCaS (cf. Chapter 5) is a cognitive architecture able to simulate characteristic human behaviour including eye movements, attention allocation, decision making and human actions. The overall simulation is controlled by a simulation controller. It takes the hazard specification into account and uses special observer software components to track specific attributes during simulation. Using rare event simulation technology, the simulation controller systematically configures the simulators and adapts parameters over different simulator runs to provoke erratic system behaviour, and analyses that behaviour to identify its causes. To achieve interoperability of the different simulator systems, the simulation models are harmonized by means of a semantic model expressed in UML [PP05]. In addition, the OMT specifications are derived from this semantic model.
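The paper does not detail the rare event strategy; one conceivable (assumed) realisation is for the simulation controller to resample scenario parameters around the runs that came closest to a hazard, as in this rough sketch:

    # Rough illustration of rare-event-oriented run configuration (an assumed
    # strategy, not the authors' algorithm): keep the scenario parameters of
    # the runs that came closest to a hazard and resample around them.
    import random

    def search_hazardous_runs(simulate, initial_params, n_rounds=10, n_runs=20, keep=5):
        """simulate(params) -> distance to hazard (0.0 means the hazard occurred)."""
        population = [initial_params]
        for _ in range(n_rounds):
            scored = []
            for _ in range(n_runs):
                base = random.choice(population)
                params = {k: v + random.gauss(0.0, 0.1 * abs(v) + 0.01)
                          for k, v in base.items()}        # perturb each parameter
                scored.append((simulate(params), params))
            scored.sort(key=lambda item: item[0])           # closest to hazard first
            population = [p for _, p in scored[:keep]]
        return population                                   # most critical configurations found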

Fig. 3. Co-Simulation Environment

4. ENVIRONMENT AND OPERATION MODELS

The test cases are defined by a specification of the normative behaviour of the nautical personnel and of the physical environment of the vessel, which together generate the situation in which the system under development is tested. The modelling approach for the normative behaviour of the modelled entities is based on a process model of nautical operations and a specification of the interaction between the personnel and the technical systems. To edit the behaviour the authors propose a swim lane based process editor. All activities of an agent (person or technical system) are represented in its column, and all activities are ordered temporally from top to bottom. Movements of vessels can be modelled in a scene editor displaying either the traffic situation or the bridge layout.
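A minimal sketch of such a swim lane process model could look as follows; the class names, durations and interaction partners are invented for illustration and make no claim to match the authors' editor format.

    # Minimal sketch of a swim-lane style process model (names are invented;
    # the authors' editor format is not specified in the paper).
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Activity:
        name: str                                  # e.g. "reduce speed to < 3 kn"
        duration_s: float                          # nominal duration
        interacts_with: Optional[str] = None       # e.g. "engine telegraph", "tug master"

    @dataclass
    class SwimLane:
        agent: str                                 # person or technical system, one column each
        activities: List[Activity]                 # ordered top to bottom = temporal order

    berthing = [
        SwimLane("officer of the watch", [
            Activity("obtain clearance to enter port", 120, "VHF radio"),
            Activity("reduce speed to < 3 kn", 60, "engine telegraph"),
        ]),
        SwimLane("helmsman", [
            Activity("steer to parallel position to berth", 300, "steering stand"),
        ]),
    ]

An agent simulator can then step through each lane in order, which is all the structure the co-simulation needs to drive normative behaviour.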

To allow seamless interoperability of the different simulation systems, all simulators in this co-simulation approach require the same understanding of the simulated scenarios. Therefore all simulators use a common semantic model that is applied to model the test cases for the bridge system under development. For modelling the normative behaviour (the specification of operation according to the regulations) a semantic model is designed (Figure 4). The concept is based on the model of BPMN [SIL11]. For modelling the environment we follow a system-of-systems engineering approach. Similar approaches can be found in DoDAF for modelling military operations [DoD09], in INCOSE [INC07], SysML [OMG06] or STEP AP233. We adapt these to design our semantic model, a common data model for the environment. It is used to model the bridge and the vessel in its traffic situation. The models are then executed with different simulators (n-body simulators, traffic simulators, etc.) within the co-simulation platform. The editor shown can also be used to model simple environments in parallel to modelling the normative behaviour.


Fig. 4. Semantic Model of the Operation Planner

5. COGNITIVE MODELS

Cognitive models are used to model characteristic human behaviour within the maritime environment described above. Cognitive models of human operators usually consist of two parts: a cognitive architecture, which integrates task independent cognitive processes (like perception, memory, decision making, learning, and motor actions), and a task model. Our task models describe a task as a temporally ordered hierarchical tree of goals (e.g. berthing), sub-goals (e.g. obtain clearance to enter port, manoeuvre to a position parallel to the berth, manoeuvre the ship to its final position) and actions (e.g. reduce speed to less than 3 kn, communicate with the tug master). The task tree shows for each sub-goal the involved mental and behavioural steps. Annotations can be added to capture responsibilities, conditions, duration, frequency and priorities. Task modelling is supported by the task modelling and analysis tool PED (Procedure Editor). To simulate human behaviour the task model has to be 'uploaded' to a cognitive architecture. A cognitive architecture can be understood as a generic interpreter that executes such formalised task models of activities in a psychologically plausible way.
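For illustration, a goal/sub-goal/action tree of this kind might be captured as a simple nested structure; PED's actual format is not described here, and the annotation values below are invented.

    # Hedged sketch of a hierarchical task model; PED's real format is not
    # given in the paper, and the annotation values below are invented.
    task_model = {
        "goal": "berthing",
        "subgoals": [
            {"name": "obtain clearance to enter port",
             "actions": [{"do": "communicate with port control", "priority": "high"}]},
            {"name": "manoeuvre to parallel position to berth",
             "actions": [{"do": "reduce speed to less than 3 kn", "condition": "distance < 1 nm"},
                         {"do": "communicate with tug master", "duration_s": 30}]},
            {"name": "manoeuvre ship to final position",
             "actions": [{"do": "order mooring lines", "frequency": "once"}]},
        ],
    }

    def leaf_actions(node):
        """Walk the tree and yield the executable actions in temporal order."""
        for sub in node["subgoals"]:
            for action in sub["actions"]:
                yield sub["name"], action["do"]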

Cognitive modelling of human agents is concerned with formalizing characteristics of the cognitive processes involved in task performance. There is a growing consensus that design support systems can be created by using accurate models of cognitive behaviour. The FP6 project ISAAC [LCC06] and the FP7 project HUMAN have shown that cognitive models (of aircraft pilots in this case) can be successfully applied to predict human-machine interaction ([LO10], [FOL10], [LOM10], [LJH09], [LOM09]). The objective was to develop a methodology with techniques and prototypical tools for the prediction of human errors in ways that are usable and practical for human centred design of systems. The methodology was developed and tested based on a flight management system (FMS) case study. One goal was to improve the accuracy of pilot error prediction using flight deck simulation including models of aircraft pilot behaviour. HUMAN made significant progress with regard to the capabilities to simulate the human visual field and attention, to react dynamically to external visual and auditory events, to simulate crew coordination (e.g. call-outs, check list reading), to perform multiple tasks concurrently (multitasking), to interpret perceived data (translation of signs to symbols), and to handle unfamiliar situations. The FP7 project ISi-PADAS developed models of car driver behaviour and applied them to support risk based design [CV09], [LWO09] for the assessment of Partially Autonomous Driver Assistance Systems (PADAS). ISi-PADAS developed a Joint Driver Vehicle Environment (JDVE) Simulation Platform. The JDVE serves as the core technical infrastructure of the simulation process and as a flexible framework for the integration of different models and tools. Based on the knowledge gained from field studies, a set of heterogeneous driver models has been developed. One of these models has been developed using the same core cognitive architecture which was used in HUMAN for building the aircraft pilot models. A simulation model of the driver has been developed and connected to the JDVE as a replacement for human drivers. It proved able to generate relevant aspects of human driver behaviour. Critical situations are most relevant for the risk assessment process but do not occur very often, even for a comparably large number of simulations. To provide a more sophisticated estimation of the probabilities of critical but rare situations, an approach based on Extreme Value theory has been used. To handle all these processes a support tool for the risk analyst has been created. The user can assess the risk by decomposing all possible scenarios starting from some selected initial situations. The tool supports the process by providing Expanded Human Performance Event Trees (EHPET) as structural components for the decomposition of scenarios. As mentioned above, the cognitive architecture CASCaS (Cognitive Architecture for Safety Critical Task Simulation) has been used to model the behaviour of aircraft pilots as well as car drivers. CASCaS models generic, domain independent cognitive processes in a modular way, taking into account human perception, memory and knowledge processing, and motor performance. A key concept underlying CASCaS is the theory of behaviour levels, which distinguishes tasks with regard to their demands on attentional control depending on prior experience: autonomous behaviour (acting without thinking in daily operations), associative behaviour (selecting stored plans in familiar situations) and cognitive behaviour (coming up with new plans in unfamiliar situations). Figure 5 shows the modular structure of CASCaS. The knowledge processing component encompasses one layer for each behaviour level. Car driver behaviour has been modelled mainly on the autonomous layer based on Bayesian networks. Aircraft pilot behaviour has been modelled mainly on the associative layer based on rule-based knowledge processing. The percept and motor components of the architecture interact with a simulated environment by reading and manipulating external variables. The Simulation Environment Wrapper provides data for the percept and motor components by connecting CASCaS to different simulation backends. Few researchers have attempted to model the cognitive processing activities that take place when decisions and actions are carried out on a ship bridge. Itoh et al. [IYH01] modelled the influence of human


errors (e.g. erroneous identification of the current ship position) on steering a ship. Von Kalus et al. [KAC99] developed a highly simplified model of collision avoidance manoeuvres in an electronic warfare environment using the cognitive architecture SOAR. In the German project DGON Bridge, Brüggemann et al. [BMS08] developed an initial model of the decision making of nautical officers (Nautik PSI) based on the psychological theory PSI. The authors are currently extending CASCaS towards Cognitive Seafarer Models. First results indicate that the human behaviour and cognitive models can be reused, but the cooperation between the humans and the distribution of decision competencies are remarkably different; in addition, the dynamics of the system are significantly slower. Therefore the cognitive architecture is being extended with nautical decision making processes involving interaction and communication between agents. These extensions are based on the psychological theories of Recognition-Primed Decision Making (RPD) [Kle89] and Distributed Cognition [HHK00] and are currently under evaluation.
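The probability estimation for critical but rare situations mentioned above (via Extreme Value theory) is not detailed in this paper; a rough, assumed illustration of the idea is a peaks-over-threshold fit of a generalized Pareto distribution to near-critical simulation runs, which is a standard EVT recipe and not necessarily the one used in ISi-PADAS.

    # Assumed peaks-over-threshold sketch: estimate the probability that a
    # criticality metric exceeds a rarely observed level from simulation runs.
    import numpy as np
    from scipy.stats import genpareto

    def rare_event_probability(metric_samples, threshold, critical_level):
        """Estimate P(metric > critical_level) from runs that exceeded a lower threshold."""
        samples = np.asarray(metric_samples, dtype=float)
        exceedances = samples[samples > threshold] - threshold
        if exceedances.size == 0:
            return 0.0                                        # no tail data observed
        p_exceed = exceedances.size / samples.size            # P(metric > threshold)
        c, loc, scale = genpareto.fit(exceedances, floc=0)    # fit GPD to the tail
        tail = genpareto.sf(critical_level - threshold, c, loc=loc, scale=scale)
        return p_exceed * tail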

Fig. 5. Modular Cognitive Architecture CASCaS

6. EVALUATION OF SYSTEM DESIGN

[GP07] identified that cross-checking of the decisions and actions of one operator by another operator can reduce the risk by a factor of ten. Technical systems or Vessel Traffic Services (VTS) can also take the role of a cross-checking agent to avoid single person errors. Further, the mentioned survey identified that most incidents occur outside of VTS areas; this might suggest that VTS works quite well. We used the integration of the Electronic Chart Display and Information System (ECDIS) and VTS as a use case for safety assessment. In ECDIS and VTS, a common operational picture of the vessel's environment is set up independently and displayed to the user. ECDIS uses the vessel based sensors (e.g. radar, Automatic Identification System (AIS)), whereas the VTS uses similar sensors positioned around the observed area. Radar tracking, AIS, etc. are applied to analyse and track the traffic situation, obstacles, etc. As a use case for integrated eNavigation systems we will analyse the development of new ECDIS systems that are able to access VTS data to build, use and display a joint operational picture by integrating the traffic situation recognized by the sensors on board and from shore. The advantage is a (more) complete and shared operational picture. Furthermore, this integrated approach can support trajectory management, remote pilotage, etc.
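As a purely illustrative sketch of this integration (the paper does not specify a fusion algorithm), on-board and shore-side track lists could be merged into one operational picture by associating tracks via MMSI or proximity:

    # Purely illustrative merge of on-board (ECDIS) and shore-side (VTS) tracks
    # into one operational picture; the actual fusion method is not specified.
    from math import hypot

    def merge_tracks(ecdis_tracks, vts_tracks, max_dist_m=200.0):
        """Each track is a dict like {'mmsi': int or None, 'x': m, 'y': m, 'source': str}."""
        picture = list(ecdis_tracks)
        for vts in vts_tracks:
            duplicate = False
            for own in picture:
                same_id = vts.get("mmsi") and vts.get("mmsi") == own.get("mmsi")
                close = hypot(vts["x"] - own["x"], vts["y"] - own["y"]) < max_dist_m
                if same_id or close:
                    duplicate = True              # contact already seen by the on-board sensors
                    break
            if not duplicate:
                picture.append(vts)               # shore-side contact not visible on board
        return picture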

In this approach adequate models of the product under development are used to simulate the usage of the system. The simulation environment allows the product design to be verified, based on several simulators that simulate traffic, vessel response and human behaviour. Hereby, a new eNavigation system can be tested in a simulation of the environment of the system, e.g. a vessel in its actual traffic/manoeuvre situation, and with cognitive models that define the behaviour of agents with respect to the normative processes defined by regulations. This especially applies to situation awareness and to visual/acoustic workload during alert situations and other critical tasks: with this approach system designers can show that the perceivable information is suitable for obtaining adequate situation awareness, so that the personnel can make the right decisions in time and the system is tolerant to user faults. The perceivable situation models are compared with the required situation understanding, and the decisions made by the simulated personnel are supervised to identify design flaws. In later design phases of the elaborated system, user interfaces may be fully designed and can further be used to analyse a specific bridge layout, for example.

7. CONCLUSION

The authors presented a simulation approach for the assessment of human machine interaction during system design. Using a model driven design process, the specifications of the control or eNavigation system can be used in a simulation environment to detect critical aspects of the design. The simulation integrates several simulators in a co-simulation approach. The normative behaviour of agents (humans, machinery, vessels) is executed in an agent simulator, and the behaviour of human agents is executed using cognitive models in order to integrate cognitive capabilities into the simulation. The authors proposed to use a semantic model as a basis for describing the scenarios in order to ensure interoperability of the simulators. The next steps of our work will be the extension of the normative behaviour specification with the capability to model the information required for decision support. In addition, a simulation configuration system is under development to control the simulation and to systematically identify and analyse critical situations. In order to improve design efficiency, virtual and physical demonstrators are used in early design phases to evaluate the usability of elaborated control and eNavigation systems.

8. ACKNOWLEDGEMENTS

This research has been performed with support from the EU FP7 project CASCADe, GA No.: 314352. Any contents herein are from the authors and do not necessarily reflect the views of the European Commission.

9. REFERENCES

[BMS08] Brüggemann, U., Meck, U. and Strohschneider, S., Virtuelle Nautiker als Probefahrer. In KI 22 (3), 2008
[CV09] Cacciabue, P. C. and Vollrath, M., The ISi-PADAS Project - Human Modelling and Simulation to support Human Error Risk Analysis of Partially Autonomous Driver Assistance Systems. In Cacciabue, C., Hjälmdahl, M., Lüdtke, A., Riccioli, C. (eds), Human Modelling in Assisted Transportation - Models, Tools and Risk Methods. Milan: Springer, 2009
[DFW97] Dahmann, J.S., Fujimoto, R.M., Weatherly, R.M., The Department of Defense High Level Architecture. In Andradóttir, S., Healy, K., Withers, D., Nelson, B. (eds), Proceedings of the 29th Conference on Winter Simulation (WSC '97), IEEE Computer Society, Washington, 1997
[DoD09] Department of Defense, DoD Architecture Framework, Version 2.0, http://dodcio.defense.gov/ Accessed 2013-01-20
[FOL10] Frische, F., Osterloh, J.-P., Lüdtke, A., Simulating Visual Attention Allocation of Pilots in an Advanced Cockpit Environment. In T. E. Pinelli (ed), Selected Papers and Presentations Presented at MODSIM World 2010 Conference & Expo, pp. 713-729, NASA/CP-2011-217069/Part 2, Hanover, MD: NASA Center for AeroSpace Information, 2010
[GC01] Gore, B., Corker, K., Human Performance Modeling: A Cooperative and Necessary Methodology for Studying Occupational Safety. In Bittner, Champney, Morrissey (eds), Advances in Occupational Ergonomics and Safety, IOS Press, Amsterdam, 2001
[GP07] Gale, H., Patraiko, D., Improving Navigational Safety, Seaways, July 2007
[HHK00] Hollan, J., Hutchins, E. and Kirsh, D., Distributed Cognition: Toward a New Foundation for Human-Computer Interaction Research. In ACM Transactions on Computer-Human Interaction, 7(2), pp. 174-196, 2000
[IMO02] IMO, Guidelines for Formal Safety Assessment (FSA) for Use in the IMO Rule-Making Process, MSC/Circ.1023, 2002
[IMO10] IMO, Adoption of Performance Standards for Bridge Alert Management, Annex 21, Resolution MSC.302(87), Document MSC 87/26/Add.1, IMO, 2010
[IMO12] IMO, e-Navigation, Sub-committee on Safety of Navigation, NAV 58/WP.6/Rev.1, 2012
[INC07] INCOSE, Model Based System Engineering, INCOSE-TP-2004-004-02, Version 2.03, 2007
[IYH01] Itoh, K., Yamaguchi, T., Hansen, J.P. and Nielsen, F.R., Risk analysis of ship navigation by use of cognitive simulation. In Cognition, Technology & Work, 3, 2001
[Kle89] Klein, G., Recognition-primed decisions. In W.B. Rouse (ed), Advances in Man-Machine Systems Research, pp. 47-92, JAI Press, Greenwich, CT, 1989
[KAC99] Kalus, A., Ashford, A.J., Cook, C.A., Sheppard, C., Cognitive Modeling for Decision Support in Naval Command and Control. In Proceedings of MAICS, AAAI, 1999
[LCC06] Lüdtke, A., Cavallo, A., Christophe, L., Cifaldi, M., Fabbri, M. and Javaux, D., Human Error Analysis based on a Cognitive Architecture. In F. Reuzeau, K. Corker and G. Boy (eds), Proceedings of the International Conference on Human-Computer Interaction in Aeronautics (HCI-Aero 06), 20.-22.09.2006, Seattle, USA. Toulouse, France: Cépaduès-Editions, 2006
[LJH09] Lüdtke, A., Javaux, D. and the HUMAN Consortium, The HUMAN Project - Model-based analysis of human errors during aircraft cockpit system design. In Cacciabue, C., Hjälmdahl, M., Lüdtke, A., Riccioli, C. (eds), Human Modelling in Assisted Transportation - Models, Tools and Risk Methods. Milan: Springer, 2009
[LM02] Lüdtke, A., Möbus, C. and Thole, H.J., Cognitive Modelling Approach to Diagnose Over-Simplification in Simulation-Based Training. In St. A. Cerri, G. Gouarderes and F. Paraguacu (eds), Intelligent Tutoring Systems, Proceedings of the 6th International Conference, pp. 496-506, Springer, Berlin, 2002
[LO10] Lüdtke, A., Osterloh, J.-P., Modeling Memory Effects in the Operation of Advanced Flight Management Systems. In G. Boy, M. Feary, P. Palanque (eds), Proceedings of the International Conference on Human-Computer Interaction in Aeronautics (HCI-Aero 2010), ACM Digital Library, 2010
[LOM10] Lüdtke, A., Osterloh, J.-P., Mioch, T., Capability Test for a Digital Cognitive Flight Crew Model. In W. Karwowski and G. Salvendy (eds), Proceedings of the 3rd International Conference on Applied Human Factors and Ergonomics (AHFE 2010), USA Publishing, 2010
[LOM09] Lüdtke, A., Osterloh, J.-P., Mioch, T., Rister, F., Looije, R., Cognitive Modelling of Pilot Errors and Error Recovery in Flight Management Tasks. In J. Vanderdonckt, P. Palanque and M. Winkler (eds), Proceedings of the 7th Working Conference on Human Error, Safety and Systems Development (HESSD 2009), LNCS 5962, Springer Verlag, 2010
[LWO09] Lüdtke, A., Weber, L., Osterloh, J.-P., Wortelen, B., Modeling Pilot and Driver Behaviour for Human Error Simulation. In Vincent G. Duffy (ed), Proceedings of the Second International Conference on Digital Human Modeling (ICDHM), Held as Part of HCI International 2009, Lecture Notes in Computer Science (LNCS 5620), pp. 403-412, Springer, 2009
[OMG06] OMG Systems Modeling Language (OMG SysML) Specification - Final Adopted Specification, ptc/06-05-04, 2006
[PP05] Pilone, D., Pitman, N., UML 2.0 in a Nutshell, O'Reilly, Sebastopol, 2005
[PWL07] Pahl, G., Wallace, K., Blessing, L., Engineering Design: A Systematic Approach, Springer, London, 2007
[Sch56] Schlager, J., Systems Engineering: Key to Modern Development, IRE Transactions EM-3 (3): 64-66, 1956
[SIL11] Silver, B., BPMN Method and Style, 2nd Edition, with BPMN Implementer's Guide: A Structured Approach for Business Process Modeling and Implementation Using BPMN 2, Cody-Cassidy Press, 2011
[WSV83] Wickens, C., Sandry, D. and Vidulich, M., Compatibility and resource competition between modalities of input, central processing, and output. Human Factors, 25(2): 227-248, 1983