Special Communication
Journal of Biomedical Informatics (2017), doi: 10.1016/j.jbi.2017.01.010
Received 13 July 2016; revised 13 January 2017; accepted 17 January 2017
Cognitive Engineering and Health Informatics: Applications and Intersections
A. Zachary Hettinger 1,2, Emilie Roth 3, and Ann M. Bisantz 4

1 Department of Emergency Medicine, Georgetown University School of Medicine, Washington, DC
2 National Center for Human Factors in Healthcare, MedStar Health, Washington, DC
3 Roth Cognitive Engineering, Stanford, CA
4 Department of Industrial and Systems Engineering, University at Buffalo, State University of New York, Buffalo, NY
ABSTRACT

Cognitive engineering is an applied field with roots in both cognitive science and engineering that has been used to support design of information displays, decision support, human-automation interaction, and training in numerous high-risk domains ranging from nuclear power plant control to transportation and defense systems. Cognitive engineering provides a set of structured, analytic methods for data collection and analysis that intersect with and complement methods of Cognitive Informatics. These methods support discovery of aspects of the work that make performance challenging, as well as the knowledge, skills, and strategies that experts use to meet those challenges. Importantly, cognitive engineering methods provide novel representations that highlight the inherent complexities of the work domain and traceable links between the results of cognitive analyses and actionable design requirements. This article provides an overview of relevant cognitive engineering methods and illustrates how they have been applied to the design of health information technology (HIT) systems. Additionally, although cognitive engineering methods have been applied in the design of user-centered informatics systems, methods drawn from informatics are not typically incorporated into a cognitive engineering analysis. This article therefore concludes with a discussion of ways in which data-rich informatics methods can inform cognitive engineering.
1. Introduction

The design and implementation of health information technologies (HIT) is too often driven by optimistic assessments of the benefits to be gleaned from new technology, with less attention paid to potential negative side effects (e.g., Patterson et al., 2002; Dierks et al., 2004; Saleem et al., 2014). Experience across a wide range of domains, including healthcare, has demonstrated that unless careful attention is paid to the work as practiced, there can be serious mismatches between the workflow assumptions of the new tools and how work is actually done (Pew & Mavor, 2007; Smith & Koppel, 2014; Woods & Dekker, 2000). This can result in changes in skills and strategies, the emergence of 'work-arounds' to meet work needs that are not well supported by the new tools, new types of errors, and new pathways for compromised safety (Declerck & Aimé, 2014; Flanagan, Saleem, Militello, Russ, & Doebbeling, 2013; Vicente, Roth & Mumaw, 2001; Patterson, Cook & Render, 2002).

Cognitive Informatics (CI) is an interdisciplinary field comprising multiple specialties in the cognitive, behavioral, and information sciences that focuses on human cognitive and collaborative processes within the context of computing and computer applications (Wang, 2008; Patel & Kannampallil, 2015). CI approaches offer the opportunity to inform the design of new HIT through a careful analysis of the cognitive and collaborative requirements of the actual work being conducted by the end users, reducing the risk of unforeseen negative side effects.

Patel & Kannampallil (2015) published a recent review of articles in the Journal of Biomedical Informatics (JBI) that addressed CI topics. They identified 57 papers covering a range of topics, with decision-making, usability, distributed cognition, comprehension, and errors being the most prominent. While the presence of CI papers in JBI signals a growing recognition of the value of cognitive analysis approaches to HIT design, the authors identified a need for a broader understanding and acceptance of cognitive research within the clinical research community.

Toward this aim, the present paper reviews methods and findings from the related field of cognitive engineering to illustrate the kinds of insights that cognitive analysis methods can generate to inform HIT design. Cognitive engineering and CI are both interdisciplinary fields that draw on a common pool of behavioral science methods (e.g., field observations, interview techniques, artifact identification and analysis). The purpose of the present paper is both to showcase some of the common methods that have
proved to be useful in HIT design and deserve wider application within medical informatics, and to highlight additional cognitive analysis and design methods, unique to cognitive engineering, that can provide valuable additions to the CI toolkit for the design of more effective HIT systems.

We begin with an introduction to the cognitive engineering approach to system design. We then review cognitive engineering methods for the analysis of cognitive work. Examples drawn from the healthcare domain are used to illustrate these methods and how they result in HIT systems that more effectively support the work of individuals and teams. We particularly focus on cognitive engineering methods that are less familiar to the medical informatics community. This is followed by a review of cognitive engineering design methods that are used to translate the results of cognitive analyses into design concepts that satisfy the support requirements those analyses uncover. A case study of an Emergency Medicine HIT system is then presented as a concrete illustration of cognitive engineering analysis and design methods and the practitioner performance benefits that can result from applying a cognitive engineering approach. Finally, we suggest ways in which informatics-related methods of data capture and analysis can be integrated into cognitive engineering methodologies to further enhance HIT system design.

2. The Cognitive Engineering Approach to System Design

Cognitive engineering is an applied field with roots in both cognitive science and engineering that has been used to support design of information displays, decision support, human-automation interaction, and training in numerous high-risk domains ranging from nuclear power plant control to transportation and defense systems (Norman & Draper, 1986; Woods & Roth, 1988; Roth, Patterson & Mumaw, 2002; Endsley, Hoffman, Kaber & Roth, 2007). Its aim is to develop systems that seamlessly support the work of domain practitioners. A variety of specific cognitive engineering design methods have been developed to analyze the cognitive and collaborative demands of work (Burns & Hajdukiewicz, 2004; Elm et al., 2003; Eggleston, 2003; Endsley, Bolte, & Jones, 2003; Evenson, Muller & Roth, 2008; Militello & Klein, 2013; Vicente, 1999). These methods share a commitment to the development of systems that are grounded in a deep analysis of the context of work. In medicine, cognitive engineering methods have been applied in the design of systems as varied as computerized physician ordering
systems (Beuscart-Zephir, Pelayo, & Bernonville, 2009); ICU displays supporting nurse situation awareness (Koch, Weir, et al., 2012); representations of family genetic history (Johnson, Johnson, & Zhang, 2005); the architectural design of a neonatal ICU (Papautsky, Crandall, Grome, & Greenberg, 2015); and electronic health records for dental care (Thyvalikakath et al., 2014).

A unique characteristic of cognitive engineering is its dual focus on uncovering and representing both the complexities inherent in the domain that can challenge cognitive performance (e.g., complex process dynamics, goal conflicts, high risk and uncertainty) and the knowledge, skills, and strategies that enable practitioners to cope with domain demands. By explicitly distinguishing domain demands from practitioner strategies, it becomes possible to disentangle practitioner strategies that compensate for poor design (e.g., strategies that compensate for lack of critical information) and can be eliminated through improved displays (e.g., visualizations that provide direct indication of critical information) from practitioner strategies that effectively cope with inherent domain complexities (e.g., risk and uncertainty) and should be reinforced through training and decision aids (Roth & Bisantz, 2013).

Figure 1 depicts the core elements of cognitive engineering approaches to new system design. The process starts with knowledge capture methods, such as field observations and structured interview techniques, to uncover the characteristics of the work domain, the work requirements, the sources of complexity, and the cognitive and collaborative demands entailed, as well as the skills and strategies used by domain practitioners. Formal methods are then employed to analyze and represent the results (cognitive analysis & representation). These include work domain analysis methods that model the intrinsic characteristics of the work to be achieved that can serve as sources of cognitive complexity challenging performance, as well as methods that model workflow and the practitioner knowledge, skills, and strategies, within and across individuals and groups, required to achieve work goals (Vicente, 1999; Elm et al., 2003). Cognitive engineering analyses use a number of data collection strategies to identify work domain demands, barriers to accomplishing work, and aspects of the environment that facilitate work. They also define practitioner skills and strategies as well as cognitive support needs. Finally, they provide the basis for identifying 'leverage points' – opportunities to significantly improve performance through more effective training, visualizations, new decision support, and/or environmental redesign.

Critically, cognitive engineering analyses yield cognitive support requirements that are used to guide conceptual designs. A distinguishing characteristic of cognitive engineering is an emphasis on providing explicit links between the support needs uncovered during cognitive analysis and the design concept features intended to meet those cognitive support requirements. Early aiding concepts may take the form of static drawings and storyboards that are used to elicit feedback from domain practitioners. User feedback is then used to propel further design cycles yielding prototypes of increasing levels of fidelity.

The importance of incorporating user evaluations as part of the cognitive engineering design process is highlighted by the last box in Figure 1. Proposed aiding concepts are regarded as hypotheses about what constitutes effective support that need to be tested empirically (Woods & Dekker, 2000). Cycles of user evaluation begin early in the concept development process and continue through and beyond the fielding of systems to make sure that the resulting system facilitates work as actually practiced (Roth & Eggleston, 2010). As shown in Figure 1, the cognitive engineering process is not linear but rather involves continuous discovery and feedback loops across the various stages of knowledge capture, analysis, design, and evaluation. New insights into the demands of work and the requirements for cognitive support arise throughout the cognitive engineering process, propelling the design forward.
INSERT FIGURE 1 HERE
3. Cognitive Engineering Approaches to Knowledge Capture
Cognitive engineering includes a variety of methods for the upfront analysis of cognitive and collaborative work. These include cognitive task analysis (CTA) methods that primarily focus on uncovering the knowledge and skills of domain practitioners (Crandall, Klein & Hoffman, 2006) and cognitive work analysis (CWA) methods that primarily focus on uncovering and representing the characteristics of the work domain that shape practitioner performance (Vicente, 1999). Cognitive engineering depends heavily on inputs from domain experts, be it through participation in interviews and focus groups, observation of practitioners as they work, or solicitation of practitioner feedback on prototypes of various levels of fidelity (Bisantz, Roth & Watts-Englert, 2015; Lee & Kirlik, 2013). In general, cognitive analysts tend to combine multiple methods to gain a more complete picture of cognitive demands, practitioner knowledge and skills, and the impact of changes in technology on practitioner work.

Because cognitive engineering and CI share common roots in behavioral science, there is significant overlap in the kinds of knowledge acquisition methods typically used. For example, field observation methods, interviews, and focus groups are common to both cognitive engineering and CI approaches (Patel & Kannampallil, 2015). Here we describe some of the knowledge capture methods that are more unique to cognitive engineering. We focus particularly on examples where these methods have been used to design innovative displays and decision-support systems for healthcare applications, in the hope that they will be more widely adopted in HIT development efforts. Brief descriptions of the most common cognitive engineering knowledge-gathering techniques are presented in Table 1.
INSERT TABLE 1 HERE
3.1 Critical Decision Method

One of the most widely used interview techniques is the critical decision method (CDM) developed by Klein and his colleagues (Klein, Calderwood, & MacGregor, 1989). CDM provides a structured approach for eliciting and analyzing actual challenging cases that domain practitioners have personally experienced. Analysis of past critical incidents provides one of the most important tools available to cognitive analysts for uncovering domain challenges and the factors that enabled individuals to perform well (e.g., their knowledge and strategies) or contributed to poor performance (e.g., faulty mental models). Applications of CDM have included analysis of expert performance in a variety of domains including firefighting, military command and control, and neonatal hospital care (Klein, 1998). Patterson, Militello, Bunger et al. (2016) provide a detailed example of how CDM can be used to identify critical cues and expert-novice differences in the detection of sepsis in pediatric patients, with subsequent recommendations for training. Others have combined the critical incident focus of CDM interviews with more structured questions regarding standard scenarios to understand decision-making among care workers in assisted living (Bowman & Rogers, 2016).

3.2 Applied Cognitive Task Analysis

The Applied Cognitive Task Analysis (ACTA) method is a structured method originally developed for less experienced cognitive analysts (Militello & Hutton, 1998). ACTA consists of a task diagram interview that generates a broad overview of practitioner tasks and associated challenges; a knowledge audit that uses a set of standard probes to identify the types of domain knowledge and skills required for task performance, together with concrete examples (e.g., perceptual skills, recognizing anomalies, applying 'tricks of the trade'); and a simulation interview that probes cognitive processes in the context of a specific scenario. ACTA has been used across domains, including healthcare (Shachak, Hada-Dayagi, Ziv, & Reis, 2009).

3.3 Concept Mapping

Concept mapping is a structured technique that can be used in both individual and group interview sessions to uncover domain demands, expert knowledge and strategies, and the factors that contribute to suboptimal performance (Crandall, Klein & Hoffman, 2006; Hoffman & Lintern, 2006). Concept maps (such as the graphical abstract provided with this paper) are directed graphs made up of concept nodes connected by labeled links that can flexibly represent a variety of knowledge. Typically, one or more domain practitioners work to build concept maps of domain knowledge, with two cognitive analysts serving as facilitators (one guiding discussion and the other creating and projecting the concept map on a shared screen). Concept maps have been used to represent the content and structure of knowledge underlying expert problem-solving and decision-making performance, such as in weather forecasting (Hoffman, Coffey & Ford, 2000). We are currently using concept mapping in our own lab to capture and organize the results of focus groups of nurses and of physicians discussing communication strategies and the role of new information technologies in facilitating or creating barriers to work.
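Because concept maps are, at bottom, directed graphs of labeled links, the propositions elicited in a session can be recorded in a few lines of code for later querying, merging, or read-back to participants. The sketch below is purely illustrative; the concepts and link labels are hypothetical and are not drawn from our focus groups:

```python
# Illustrative sketch: recording concept-map propositions elicited during a
# focus group as labeled, directed edges. Concept names are hypothetical.
from collections import defaultdict

# Each proposition is a (concept, link label, concept) triple.
propositions = [
    ("ED whiteboard", "supports", "shared awareness"),
    ("shared awareness", "requires", "up-to-date patient status"),
    ("EDIS comment field", "records", "up-to-date patient status"),
    ("free-text comments", "create barriers to", "automated analysis"),
]

# Index outgoing links so a facilitator can review a concept's connections.
outgoing = defaultdict(list)
for source, label, target in propositions:
    outgoing[source].append((label, target))

def read_back(concept):
    """Print the propositions anchored at a concept, as read back to participants."""
    for label, target in outgoing[concept]:
        print(f"{concept} --[{label}]--> {target}")

read_back("ED whiteboard")
# ED whiteboard --[supports]--> shared awareness
```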
3.4 Observational Methods

A major source of knowledge to support cognitive engineering analyses is observing the performance of domain practitioners, often in the actual work environment or in a close analogue such as a dynamic, high-fidelity simulator (Roth & Patterson, 2005). While observational studies are commonly used in CI as well, we cover them here because they provide an invaluable resource for uncovering domain complexities and poorly supported cognitive task requirements. Naturalistic observations, taken under realistic conditions, allow analysts to understand the range of complexities that arise in the work environment and the strategies employed by domain practitioners to respond to work demands. One example is an observational study that examined surgical teams' communication and coordination during lengthy, complex surgical procedures (Christian et al., 2006). The study identified factors that complicated the cognitive and collaborative performance of operating room teams, impacting safety, and the strategies that surgeons and nurses employed to coordinate work and minimize the potential for adverse events.

Observational studies are particularly well suited for identifying mismatches between how work is presumed to be performed, based on formal processes and procedures, and what actually takes place (Dierks et al., 2004). In many cases, observational studies uncover 'home-grown' tools and work-arounds that domain practitioners generate to cope with aspects of the work that are not otherwise well supported (Roth, Scott et al., 2006; Flanagan, Saleem, Militello & Doebbeling, 2013). Divergence between presumed and actual work practice can point to opportunities to improve performance through more effective support. There are a number of studies in which researchers have examined implemented HIT systems in order to understand factors related to the success or failure of these systems, including studying the integration of electronic health records into patient visits (Saleem, Flanagan, et al., 2014); understanding factors associated with work-arounds in the use of a bar-coded medication administration system (Patterson, Cook, & Render, 2002); and understanding factors impeding the use of computerized clinical reminders (Saleem, Patterson, et al., 2005). Miller & Militello (2015) also used a cognitive engineering-informed analysis to identify barriers to the use of clinical decision support (CDS) systems, including inability to accommodate variations in workflow; large numbers of prioritized prompts and reminders that added to workload; and mismatches in assessment and decision strategies between clinicians and the clinical decision support system.

A variant of observational studies that is more unique to cognitive engineering is the process tracing methodology. In process tracing, objective records of evolving events (e.g., control process parameters; patient vital signs) are combined with observations of practitioner activity to generate a rich "process trace" of the unfolding events. Techniques such as Exploratory Sequential Data Analysis (ESDA; Sanderson & Fisher, 1994) can be applied to translate the raw sequences of events and logged data into information meaningful for design, for instance allowing practitioner behavior to be more fully understood in the context of the objectively evolving situation (Woods, 1993). For example, Cook & Woods (1996) combined data on patient states (e.g., physiological parameters) with observations of practitioners using medical devices, and concurrent interviews, to generate a rich process trace. Similarly, Seagull & Sanderson (2001) produced process tracing logs that combined surgical events and equipment states, as well as observed activities. Video and audio recording are sometimes used to capture observational data for later review and analysis; this allows for more complete data capture than is possible with real-time observation (Hu, Arriaga, Peyre, Corso, Roth & Greenberg, 2012; Seagull & Xiao, 2001).
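At its core, assembling a process trace is a matter of interleaving independently timestamped streams – objective system or patient data and observer-coded practitioner actions – into a single chronological record that can then be segmented and annotated. A minimal sketch, with invented timestamps and events:

```python
# Illustrative sketch of building a process trace: interleaving an objective
# event stream (e.g., logged vital signs) with observer-coded practitioner
# actions into one chronological record. Timestamps and events are invented.
from heapq import merge

vitals = [          # (seconds from start, source, event)
    (10, "monitor", "HR 118"),
    (95, "monitor", "SpO2 88%"),
    (140, "monitor", "SpO2 93%"),
]
observed_actions = [
    (60, "observer", "nurse silences alarm"),
    (100, "observer", "physician adjusts oxygen flow"),
]

# The process trace is simply the time-ordered union of the streams; analysts
# then scan it for demand-action patterns (e.g., response latency to alarms).
trace = list(merge(vitals, observed_actions, key=lambda e: e[0]))
for t, source, event in trace:
    print(f"t={t:4d}s  [{source:8s}] {event}")
```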
3.5 Artifact Analysis

Important cognitive engineering insights can be generated by examining the role that existing tools (or 'artifacts') play in supporting practitioner work (Xiao, 2005). Artifacts can range from low-tech objects such as paper 'cheat sheets' and post-it notes; to specially designed, task-relevant objects (e.g., a pre-filled and labeled drug syringe; an OR checklist); to high-tech digital technologies such as information displays and health record systems. Practitioner-generated artifacts can highlight the need for more effective cognitive support (Mumaw, Roth, Vicente, & Burns, 2000; Gurses, Xiao & Hu, 2009). They can also point to artifact properties that are important for effective performance and thus should be preserved or reproduced as new technology is introduced (Bisantz et al., 2010; Roth, Multer, & Raslear, 2006; Xiao, 2005).

Bauer, Guerlain, & Brown (2006) used artifact analysis to inform the design of an HIT system for intensive care. They examined a paper-based patient flow sheet which supported both structured (grids for sequential vital sign information) and unstructured (freeform notes) data capture. Their observations revealed ways in which the paper form effectively supported work: it was portable; it grouped information in ways that allowed easy comparisons; it permitted flexible annotation for unique circumstances; and it represented data in familiar notation. By observing the form in use, they were able to identify characteristics of the paper form that would need to be included in an electronic system, such as the need to support flexible, rather than sequential, information entry; to allow unstructured annotations; and to allow information to be omitted. Further, their observations and analysis revealed ways that the electronic system could provide additional functionality beyond that provided by paper, such as automated data analysis and calculations (which had to be done manually with the paper form) and the ability for multiple caregivers to access the information at once.

One of the more studied cognitive artifacts in healthcare is the dry erase board (or "whiteboard") that until recently was found across clinical environments to track patients in rooms (Bisantz et al., 2010; Patterson, Rogers, Tomolo, Wears & Tsevat, 2010; Wears, Perry, Wilson, Galliers, & Fone, 2007; Xiao, Schenkel, Faraj, Mackenzie, & Moss, 2007). These whiteboards typically evolved independently on various units over time to reflect the needs of front-line users. They are being replaced by electronic systems; however, research has shown that while electronic versions may support basic information exchange functions, they may not support other, more subtle practitioner strategies. In particular, the systems may not support strategies that exploit affordances provided by traditional whiteboards to direct attention, maintain awareness of task status, and coordinate work (Bisantz et al., 2010; Xiao et al., 2007; Wears et al., 2007). (See below for a detailed case example of how cognitive engineering methods were used to design improved display concepts for an emergency department whiteboard.)

3.6 Knowledge Capture Approaches in Team-Based Settings

Health care is team-based. It generally requires collaboration across multiple roles including physicians, nurses, technicians, social workers, patients, and their families. As a consequence, in many cases HIT systems need to be designed to support the information needs of different roles as well as the need for shared situation awareness across
those roles. The knowledge capture methods described above have been successfully used to identify the cognitive and collaborative needs of multi-person teams and to build support systems across a variety of domains. Naturalistic observation methods are among the most widely used approaches to uncover the cognitive and collaborative activities of teams. For example, Hu and colleagues (2016) analyzed videotaped recordings of actual surgeries to examine the impact of surgeons' leadership styles on operating room team performance. Interview and focus group methods can also be used to understand the different perspectives across multiple team roles. Klein and colleagues (Klein, Armstrong, Woods, Gokulachandra, & Klein, 2000) employed the critical decision method to examine the role of common ground in supporting coordination and replanning in distributed military teams. Roth, Multer & Raslear (2006) employed a combination of field observation and semi-structured interviews to examine informal cooperative strategies developed by railroad workers (including train crews, roadway workers, and dispatchers) that contribute to overall safety. Miller & Xiao (2006) employed semi-structured interviews to examine resource allocation strategies at different organizational levels in a trauma hospital that enabled adaptability to variable-tempo resource demands.

There are also methods specifically tailored to capturing and representing the knowledge and strategies that underlie multi-person performance. These include methods for modeling social organization and communication patterns (Pfautz & Pfautz, 2009) as well as methods to analyze and represent distributed decision-making strategies and information requirements (Klinger & Hahn, 2005). In some cases, researchers have augmented the formal representations used in cognitive engineering methods to explicitly represent multi-disciplinary teams (Ashoori & Burns, 2013).

3.7 Leveraging Multiple Complementary Methods

While we have described individual methods for knowledge capture, cognitive analysts generally combine multiple methods to gain a more complete picture of the demands of the work and practitioner knowledge, skills, and strategies so as to inform HIT design. For example, Parush (2015) describes a design-oriented study that combined multiple knowledge-gathering methods to derive requirements for a display to support shared team awareness and communication during cardiac surgery. Cognitive analysis methods including interviews with surgical team members and observations of sixteen
open-heart surgeries were used to understand requirements for information that needed to be shared among team members, and the context or activities during which the information sharing could take place. The analyses led to an information display that combined a timeline representation of cardiac events with both static (e.g., age and medical history) and dynamic (e.g., vital signs) information about the patient, as well as graphical representations of the interactions between the patient's organ systems (heart-lungs) and the heart-lung machine.

In another study, Baxter et al. (2005) used targeted observations to log alarm events and caregiver interactions with equipment in a neonatal intensive care unit. Data from the observations, along with interviews and document analysis, were used to make recommendations regarding the design of a decision aid to support selection of ventilator settings. Nemeth et al. (2015, 2016) similarly applied observations, interviews, and artifact analysis to identify information requirements to support individual and clinical team decisions and communications in burn intensive care.

Peute et al. (2009) performed a study of the implementation and use of a computerized physician order entry (CPOE) system over a time period that included requirements analysis, prototype development, pilot implementation, and eventual pullback of the system. Interviews, observations, and document analysis were conducted. Findings included: (1) the CPOE as implemented was not based on an integrated set of requirements and workflows across the various hospital departments; (2) individuals continued to make improvements and simplifications to existing ordering processes that limited the usefulness and reduced the eventual value of the CPOE; (3) there were insufficient opportunities for feedback from hospital staff to the development team; and (4) the overly early implementation of a system that lacked necessary functionality impeded workflow and led to system abandonment.

4. Linking Cognitive Engineering Analyses to Design

Fundamentally, cognitive engineering is an applied discipline that aims to improve performance via more effective displays, automation, and/or training approaches. As a consequence, an important aspect of cognitive engineering methods is to provide a principled link between the results of the cognitive analyses and the particular features of the proposed new displays/automation/training programs they have informed. Typically,
this is achieved by producing artifacts that serve as traceable links between the deficiencies observed in the current work environment that lead to suboptimal performance, the implications for cognitive support requirements, and the features of the resulting system that are intended to satisfy those cognitive support requirements. One of the differentiators of cognitive engineering from other approaches, including CI, is that it provides several unique analytic frameworks that support such traceable designs. We provide an overview of these methods here in the hope that they will be more broadly adopted by the CI and HIT development communities. A summary of the most common cognitive engineering analytic frameworks and their key features is provided in Table 2.

INSERT TABLE 2 HERE

4.1 Cognitive Work Analysis (CWA)

CWA is one of the most comprehensive analytic frameworks for deriving display design guidance. It provides a structured cognitive engineering approach that guides identification of what information to collect and how to represent that information so as to inform design, providing traceable links from the results of analyses to the information content and form of the resulting displays and support systems. CWA encompasses a set of interlinked analyses and associated output representations (Rasmussen, 1986; Vicente, 1999; Bisantz & Burns, 2008; Roth & Bisantz, 2013). CWA analyses and representations include documenting the characteristics and constraints of the domain using a structured work domain model; the work objectives and methods using models of decision stages; the cognitive skills and strategy requirements; and the social and organizational influences. CWA methods provide the foundation for the design of novel displays and decision aids, as well as other aspects of human-system integration, including function allocation, team structure design, and training requirements specification (Burns & Hajdukiewicz, 2004; Naikar, 2009). They have been used in a variety of domains including medicine, military command and control, process control, and space operations. For instance, Nystrom, Williams, Paul, Graber, & Hemphill (2016) applied CWA work domain modeling methods to better understand the process of diagnostic decision making.
4.2 Ecological Interface Design (EID)

EID is an interface design method that builds on CWA analyses, particularly work domain analyses (Burns & Hajdukiewicz, 2004; Vicente, 2002; Vicente & Rasmussen, 1992). EID displays and controls are designed to enable users to perform efficiently in routine situations while also being able to perform effectively in unanticipated conditions that require more effortful cognitive activities such as diagnosis, problem-solving, and planning. Often EIDs make use of object displays or configural graphics, where patterns or shapes "emerge" from the combination of object dimensions. The objects are designed such that object dimensions represent process parameters and the emergent features (such as size, alignment, or symmetry) enable 'direct perception' of critical indicators (see Burns & Hajdukiewicz, 2004, for a detailed methodological description). EID has been used to design innovative visualizations in a variety of domains including health monitoring, military command and control, and process control.

For example, McEwen, Flach, & Elder (2012) applied ecological interface design methods to create a user interface to support clinician decision making regarding a patient's risk of cardiovascular disease (CVD). Their methods included in-depth investigations of relationships in the work domain (i.e., the patient), including understanding the mathematical forms of previously developed predictive models of CVD risk. The interface graphically illustrated the relationships among various physiological indicators (e.g., LDL and HDL levels; blood pressure) and CVD risk (the Reynolds Risk Score) so that the contributions of the different variables affecting cardiac risk were presented visually. Physicians could manipulate variable values to visually demonstrate how different interventions (e.g., lowering blood pressure) would affect risk. The interface also provided visualizations of decision criteria for different treatment protocols.

Sanderson and colleagues (Deschamps, Sanderson, et al., 2016; Sanderson et al., 2008; Watson & Sanderson, 2004, 2007) also applied EID and other cognitive engineering methods to create auditory displays of health-related variables in domains such as the neonatal ICU and anesthesiology. For anesthesiology, for instance, the researchers developed an extensive work domain model connecting human physiologic functions and variables. This model informed the design of auditory displays which integrated pitch, duration, and amplitude to support anesthetists' monitoring of critical respiratory variables (i.e., respiratory volume, exhaled carbon dioxide, and respiration rate). These examples illustrate how EID provides a principled approach for deriving requirements for the information to be presented and how it should be integrated and displayed for most effective support.
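To make the notion of emergent features concrete, the following sketch renders several vital signs as a single polygon: each spoke is one parameter normalized against its normal range, so a mid-normal patient traces a symmetric shape and any distortion signals deviation 'at a glance'. The parameters, ranges, and values are illustrative assumptions, not taken from the studies cited above:

```python
# A minimal sketch of a configural (object) display in the spirit of EID:
# physiological parameters are normalized against their normal ranges and
# drawn as one polygon whose symmetry is the emergent feature. Illustrative
# parameter names, normal ranges, and patient values only.
import numpy as np
import matplotlib.pyplot as plt

params = ["HR", "SBP", "RR", "SpO2", "Temp"]
normal = {"HR": (60, 100), "SBP": (90, 140), "RR": (12, 20),
          "SpO2": (94, 100), "Temp": (36.1, 37.8)}
patient = {"HR": 118, "SBP": 92, "RR": 24, "SpO2": 89, "Temp": 38.4}

def normalize(name):
    """Map a value into its normal range (0..1); clipped at zero for plotting."""
    lo, hi = normal[name]
    return max(0.0, (patient[name] - lo) / (hi - lo))

values = [normalize(p) for p in params]
theta = np.linspace(0, 2 * np.pi, len(params), endpoint=False)

# Close the polygon and draw it on polar axes; a mid-normal patient would
# trace a regular pentagon at radius 0.5.
theta_c = np.append(theta, theta[0])
values_c = np.append(values, values[0])
ax = plt.subplot(projection="polar")
ax.plot(theta_c, values_c, linewidth=2)
ax.fill(theta_c, values_c, alpha=0.3)
ax.set_xticks(theta)
ax.set_xticklabels(params)
plt.show()
```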
4.3 More Streamlined Approaches to Cognitive Work Analysis

A number of more streamlined variants of CWA have been developed for deriving cognitive support requirements and implications for the design of displays and decision aids. One example is Applied Cognitive Work Analysis (ACWA), which is tailored to design applications (Elm, Potter, Gualtieri, Roth & Easter, 2003; Elm, Gualtieri, et al., 2008). ACWA incorporates a series of design artifacts that provide traceable links starting from a work domain analysis, through the cognitive work requirements entailed, to the information needed to meet those cognitive work requirements, all the way through to the display elements and visual forms required to support the work requirements. ACWA has been applied to create innovative visualizations across a range of domains including military operations, intelligence analysis, and cyber defense (Elm et al., 2008).

Work-centered design is another streamlined variant of CWA that provides traceable links from the results of the cognitive analysis to the detailed display requirements (Eggleston, 2003; Evenson, Muller & Roth, 2008; Roth, DePass, Scott, Truxler, Smith, & Wampler, in press). The results of the cognitive analysis are used to define cognitive work requirements. Cognitive work requirements take the form of questions a user must be able to answer 'at a glance' without needing to traverse multiple screens, perform mental calculations, or integrate disparate pieces of information (Wampler, Roth, Whitaker, Conrad, Stilson, Thomas-Meyers & Scott, 2006). Detailed information and display requirements are then specified to satisfy the cognitive work requirements.

A simple example taken from a flight monitoring application illustrates the approach. Observations and interviews revealed that flight monitoring personnel often missed that flight missions were planned to take off from or land at an airfield during 'quiet hours', when take-offs and landings are prohibited. To mitigate this problem, the formal analysis specified as cognitive work requirements the need for mission monitoring personnel to be able to answer the following questions 'at a glance' (with the display requirements needed to enable 'at a glance' answers provided in brackets): When are the quiet hours, if the airfield has them? [Show quiet-hour duration as a horizontal bar on a timeline.] How do the quiet hours correlate with the mission itinerary? [Visually align the quiet-hours duration with ground takeoff/landing events on the timeline so the user can perform a rapid visual comparison.] Is this mission legal with respect to quiet hours? [Provide a prominent visual cue when quiet hours are violated.] By explicitly linking the results of the cognitive analysis to the information elements and display forms, it becomes easier to understand the rationale for particular display requirements and to make informed design tradeoffs when necessary.
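The logic that would drive the 'is this mission legal?' visual cue is simple interval containment. A minimal sketch, with a hypothetical airfield and itinerary:

```python
# Illustrative back-end logic for the quiet-hours cognitive work requirement:
# flag any takeoff/landing that falls inside an airfield's quiet-hour window
# so the display can render a prominent visual cue. Airfield, window, and
# itinerary are invented for the sketch.
from datetime import time

def in_window(event: time, start: time, end: time) -> bool:
    """True if event falls in [start, end); handles windows that cross midnight."""
    if start <= end:
        return start <= event < end
    return event >= start or event < end  # e.g., 22:00-06:00

quiet_hours = {"KXYZ": (time(22, 0), time(6, 0))}   # hypothetical airfield
itinerary = [("KXYZ", "takeoff", time(5, 30)), ("KXYZ", "landing", time(14, 10))]

violations = [(field, kind, t) for field, kind, t in itinerary
              if field in quiet_hours and in_window(t, *quiet_hours[field])]
print(violations)  # [('KXYZ', 'takeoff', datetime.time(5, 30))]
```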
4.4 Related Design Frameworks

Decision-centered design (Militello & Klein, 2013) is another cognitive engineering design method grounded in the analysis of work. It uses observation and interview methods to understand the cognitive requirements of the work and to identify barriers and facilitators. The cognitive analysis is then used to derive explicit cognitive support requirements that guide display design. For instance, Porat, Kostopoulou, Woolley, & Delaney (2016) used decision-centered design methods to develop design recommendations for a clinical decision-support system in primary care. Militello, Saleem, Borders, Sushereba, Haverkamp, Wolf, & Doebbeling (2016) adopted a decision-centered design approach to develop a colorectal cancer screening decision support tool. The study used an iterative design process that included field observations, cognitive task analysis interviews, and focus groups to understand the work context in which colorectal cancer screening is performed and the barriers that limit the ability of primary care providers (and their patients) to make appropriate colorectal cancer screening decisions. The results of the analyses were synthesized into design artifacts that linked the major cognitive activities that contribute to colorectal cancer screening decisions, the barriers that interfere with effective performance, and the cognitive support requirements that an effective colorectal cancer screening decision-support system would need to provide. Thus the use of decision-centered design enabled the authors to provide traceable links from the results of the cognitive analysis to the specific content and form of the resulting decision-support system. A formal evaluation study provided empirical evidence of the value of using this cognitive engineering approach (Militello et al., 2016). The evaluation compared the performance of physicians using the new application with their performance using the information system currently in place. The study showed that the application improved performance in terms of accuracy in answering questions about patients, the speed at which tasks were completed, and efficiency in finding necessary data. Subjective assessments of usability and workload were also improved.

A somewhat different analysis method, Goal-Directed Task Analysis (GDTA), provides a structured approach to identify the goals and decisions associated with a job, the situation awareness requirements entailed, the information needed, and how it is combined to support the different elements of situation awareness. Thus GDTA provides another example of a principled approach for deriving traceable links between cognitive analysis and design (Endsley, Bolte, & Jones, 2003). GDTA has been used to develop display designs to support situation awareness and decision-making of individuals and teams (Kaber, Segall, Green, Entzian, & Junginger, 2006; Humphrey & Adams, 2011).

A related framework, Contextual Design (Beyer & Holtzblatt, 1997), comes from the human-computer interaction design community and encompasses a number of activities that are similar to those described in the above methods, including gathering information about the work domain from observations and interviews, and documenting communication patterns, task steps, workplace layout, artifacts, and organizational factors through specific models. These include work flow models which, while not representing cognitive activities or work domain constraints, focus on communication and coordination activities among people.

4.5 Summary of the Cognitive Engineering Approach

To summarize, cognitive engineering methods provide a family of tools and techniques for deriving display and decision-support requirements, generating innovative designs tailored to the requirements of work, and performing user-in-the-loop evaluations that employ meaningful (user- and use-centered) tasks and performance metrics. While there is some overlap with methods commonly used in CI, such as focus groups, interviews, and field observations, there are also unique methods that have grown out of the cognitive engineering community that CI could profitably adopt. These include specific knowledge capture techniques such as the critical decision method, as well as formal frameworks for analyzing cognitive demands and deriving cognitive support requirements such as Cognitive Work Analysis, work-centered design, and decision-centered design. Benefits of cognitive engineering approaches include:
1. Providing a more complete picture of work demands and the activities of domain practitioners, thus reducing the possibility of a mismatch between the demands of the work and the capabilities provided by the system;
2. Enabling analysis and design for both individuals and teams;
3. Providing traceable links between the cognitive and collaborative requirements uncovered through knowledge acquisition activities and the particular forms of support incorporated in the design (i.e., the information provided and how it is displayed);
4. Incorporating iterative, user-in-the-loop evaluations that encourage development of systems that are more usable and useful, thus increasing the likelihood that they will be adopted.

5. Applying Cognitive Engineering to HIT Design: A Case Study in Emergency Medicine

In hospital emergency departments (EDs), as in other areas of health care, dry erase whiteboards used to track patient care are being replaced with electronic systems (e.g., emergency department information systems, or EDIS). As noted above, the electronic versions of these systems often do not support the same types of cognitive activities as the manual boards. Members of our research team conducted a multiphase project to address this design problem (Guarrera et al., 2015; McGeorge et al., 2015). We describe it here as a case study that both illustrates the cognitive engineering process (as depicted in Figure 1) and demonstrates the concrete performance improvements that result from adopting a cognitive engineering approach.

More specifically, the project applied cognitive engineering data collection methods (i.e., artifact analysis, observations, interviews, and focus groups) targeted at understanding ED goals, processes, and resources. The results were used to create abstraction hierarchy work domain models. As noted above, models of work domain goals, processes, and elements – separate from models of tasks or the mental models of experts – are a key differentiating component of cognitive engineering analyses. In particular, these models are useful in identifying the kinds of information needed to monitor the work system (and its components) to ensure it is functioning appropriately, and to make decisions relevant to system control (e.g., regarding resource use and allocation, problem-solving, etc.). The work domain model
was then translated into information needs and interface concepts based on principles of Ecological Interface Design (see Section 4, above). In our project, system monitoring and control translated to awareness of overall ED functioning, how resources were being used to care for patients, and identifying and resolving bottlenecks in patient care. The multiphase research proceeded as follows.

5.1 Gathering the Cognitive Needs of the User

Initial research encompassed observations and analysis of a manual ED dry erase board used to track patients, their chief complaints, assigned clinicians, and care plans (Bisantz et al., 2010). We performed a detailed analysis of the information shown on the whiteboard by extracting and then iteratively coding content from periodic photographs of the whiteboards. A similar analysis was performed on screen shots of the computerized EDIS that replaced the manual whiteboards at the participating hospital. Results showed that information fields could contain information that varied widely in function. For instance, the "chief complaint" field included information about patient-reported symptoms, clinical signs, hypothesized diagnoses, medical history, and mechanism of injury, among others. Similarly, the "comment" or "disposition" field included information regarding plans for care, predicted disposition, test or order status, and test results. Similar categories of information were represented across both the manual and electronic whiteboards, but the categories differed in frequency of use, due in part to the lack of flexibility in the electronic system to represent order status and progress through the care plan.

Subsequently, we conducted interviews and focus groups with ED personnel (physicians, nurses, technicians) that were tailored to elicit the information needed to develop an ED work domain model. Our questions focused on the goals, functions, processes, and constraints in the ED work domain and the associated information requirements. This contrasts with the types of questions that other user-centered approaches might focus on, such as practitioner workflows, knowledge, and strategies.

5.2 Applying Cognitive Engineering Methods

The resulting ED work domain model (shown in Figure 2) represented ED goals, constraints, processes, and physical system components and resources linked according to means-ends relationships, as described in Section 4 above (Guarrera et al., 2013). For instance, the model represented overall system goals or purposes such as "provide quality care" subject to constraints or priorities such as those related to the physical condition of the patient, available resources, and ethical care priorities (e.g., treating more critical patients first). Processes included those necessary to care for the patient (triaging, diagnosing, ordering, transporting, dispositioning, etc.); processes to maintain situation awareness over both patients (i.e., monitoring patient health status and bottlenecks in care) and the overall ED in terms of capacity, resources, and throughput; and processes for communication and care coordination. System components (i.e., resources and objects) required for these processes included the patients themselves; personnel; information and communication systems; facilities and equipment; and guidelines, policies, and regulations.

INSERT FIGURE 2 HERE
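In software terms, an abstraction hierarchy is a leveled graph whose means-ends links record which lower-level elements serve which higher-level purposes. The sketch below reproduces a deliberately tiny, simplified slice of the kind of model shown in Figure 2; the node names are paraphrased from the text, and the actual model is far richer:

```python
# A highly simplified sketch of the ED work domain model as an abstraction
# hierarchy: nodes grouped by level, with means-ends links pointing from each
# node to the higher-level node(s) it serves. Illustrative structure only.
levels = {
    "functional purpose": ["provide quality care"],
    "priorities & constraints": ["time constraints", "ethical treatment"],
    "general processes": ["triage patient", "monitor ED status"],
    "physical resources": ["ED beds", "clinical staff", "EDIS"],
}

# Means-ends links: (lower-level node, higher-level node it supports).
means_ends = [
    ("triage patient", "time constraints"),
    ("monitor ED status", "time constraints"),
    ("ED beds", "triage patient"),
    ("clinical staff", "triage patient"),
    ("EDIS", "monitor ED status"),
    ("time constraints", "provide quality care"),
    ("ethical treatment", "provide quality care"),
]

def supports(node):
    """Walk upward from a node to list the ends it serves (why it matters)."""
    ends = [hi for lo, hi in means_ends if lo == node]
    return ends + [e for end in ends for e in supports(end)]

print(supports("EDIS"))
# ['monitor ED status', 'time constraints', 'provide quality care']
```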
Following the processes outlined in Ecological Interface Design, key information elements and relationships associated with each element in the model were then identified by the research team. These elements included those needed to support overall awareness of ED status (in terms of measures such as average length of stay and number of patients waiting) as well as awareness about patients, utilization of ED resources (e.g., bed spaces, staff), and their progress through the ED care process. For example, Figure 3 shows the overall ED summary display, in which the values of eight different measures were integrated into a single object. These measures were based on identified information elements related to several nodes in the work domain model shown in Figure 2. For instance, measures of the number of patients in the ED, the number in the waiting room, and the percent of boarders can be used to understand demands (constraints and priority level) as well as the number of patients who may be admitted to the hospital ("gatekeeper for hospital" node). Time to first doctor, time to first medication, and average length of stay stem from the "time constraints" node, and pain level is related to "ethical treatment." The measures also contribute to understanding the degree to which the ED is "providing high quality care" and support overall situation awareness of ED functioning (at the general process level).
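Several of these summary measures are straightforward aggregations over per-patient records, as the following sketch illustrates; the field names and data are invented rather than taken from the actual EDIS:

```python
# Illustrative computation of a few of the ED summary measures behind the
# overview object in Figure 3, from hypothetical per-patient records.
from statistics import mean

patients = [
    {"id": 1, "location": "waiting room", "minutes_in_ed": 35,  "boarder": False},
    {"id": 2, "location": "bed 4",        "minutes_in_ed": 210, "boarder": True},
    {"id": 3, "location": "bed 9",        "minutes_in_ed": 95,  "boarder": False},
]

measures = {
    "patients in ED": len(patients),
    "in waiting room": sum(p["location"] == "waiting room" for p in patients),
    "percent boarders": 100 * sum(p["boarder"] for p in patients) / len(patients),
    "avg length of stay (min)": mean(p["minutes_in_ed"] for p in patients),
}
for name, value in measures.items():
    print(f"{name}: {value:.0f}")
```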
5.3 User-Centered Prototype Design

We then followed an iterative, user-centered design process to prototype and refine seven display areas which graphically represented: an overall summary of ED measures; information about patients in the waiting room; wait times and queue lengths for imaging and other tests; ED bed utilization and status (e.g., open, in use, requires housekeeping); trend information about the number of patients in various phases of care (e.g., in the waiting room, being evaluated, awaiting disposition); provider workload; and details about each patient. Heuristic evaluations (by teams of human factors engineers and subject matter experts) as well as usability tests (with ED physicians and nurses) were used to improve and finalize the display concepts (Clark et al., 2014; Guarrera et al., 2015).

INSERT FIGURE 3 HERE

The display concepts were evaluated in two ways. In one study, a human-in-the-loop experiment was conducted in a clinical simulation setting (McGeorge et al., 2015). Two-person (physician and nurse) teams interacted with the EDIS displays and treated mannequin patients throughout a 45-minute scenario in which the displays were populated with dynamic data about simulated patients. Teams used either the new displays resulting from the cognitive engineering process or displays that mimicked the current EDIS in use at their hospital. Participants rated the new displays significantly higher than the control displays in terms of cognitive support, and there was no significant increase in workload due to the novel nature of the new displays. Additionally, participants using the new displays showed improvements in situation awareness from the middle to the end of the experimental session. A separate study involved a structured usability assessment. Different types of health care providers (e.g., nurses, physicians, physician assistants) completed patient planning and orientation tasks using the displays and provided ratings of support for 19 different cognitive support objectives (e.g., identifying holdups in care; maintaining awareness of patient acuity) as well as ratings of the usability, usefulness, and likely frequency of use of specific display features. Overall, both nurse and provider participants gave high ratings regarding the degree to which the interface supported various cognitive objectives, and indicated that it was usable, useful, and would be frequently used during an ED shift. The results from both studies
demonstrated the value of a cognitive engineering approach for identifying the information elements necessary for successful system monitoring and decision-making. Our research team is currently implementing and testing these display concepts in a live clinical environment to demonstrate the utility of the methods described above in the care of patients.

Overall, this case study provides a concrete illustration of the cognitive engineering approach to system design. It highlights the use of multiple, complementary knowledge acquisition methods, including artifact analysis, field observations, interviews, and focus groups targeted at building work domain models; the identification of information requirements from these structured cognitive engineering models; the design of innovative visualizations based on principles of Ecological Interface Design; and iterative design and testing involving subject matter experts and potential users.

The case study also exemplifies how cognitive engineering techniques can be applied to the design of systems intended to support multi-disciplinary teams. Initial observations, interviews, and focus groups elicited the perspectives and information needs of the multiple disciplines involved in ED patient care (including nurses, physicians, and physician assistants). A multi-discipline team perspective also informed display evaluation: the human-in-the-loop evaluations of the prototype displays employed two-person teams (a physician and a nurse) functioning under realistic team performance conditions. The case study demonstrates that, while there are additional logistic challenges, cognitive engineering methods can be readily extended to the design of team displays and support systems.
6.0 Leveraging Informatics Methods to Enhance Cognitive Engineering Analyses

Despite numerous examples of the application of cognitive engineering methods to HIT design, methods drawn from the field of informatics have not yet been leveraged as input to cognitive engineering analyses. Due to resource constraints – both from the standpoint of access to subject matter experts and of available analysis time – cognitive engineering analyses are often based on relatively small samples of participants and observation sites. Informatics-inspired methods such as data mining, automatic text processing, and other data-analytic tools have the potential to dramatically expand the power of cognitive engineering analyses by enabling investigation of a larger corpus of data than is currently possible using traditional, labor-intensive data-gathering methods such as interviews and focus groups. Informatics methods can also be applied to smaller data sets that may still be beyond the capacity of "human" analysis. For instance, while it is clear that automated data mining and pattern recognition techniques are necessary for data sets containing a million, or even a few thousand, elements, they can also be usefully applied to much smaller data sets (e.g., searching a few hundred nursing notes for a key, safety-critical phrase).

Data-analytic methods applied to automatically collected data can be used to identify patterns and relationships directly related to the dual goals of cognitive engineering (outlined in Section 2 above): identifying challenging work domain demands, and identifying patterns of activities or system use from which strategies can be inferred. They can also be used to identify system complexities that lead to errors or inefficiencies. In the following sections, we outline how a number of informatics-related methods could be used to complement traditional cognitive engineering methods of knowledge-gathering, in order to identify work domain demands, expert knowledge and strategies, artifact use, and communication strategies or breakdowns.

6.1 Automatic Text and Natural Language Processing

Automatic text processing could be used to analyze HIT artifact use in order to identify components of practitioner expertise (key to cognitive engineering analyses), such as experts' patterns of system use, which may indicate strategies in patient care or methods to adapt to or work around system limitations. For example, most emergency department information systems allow entry of free text into a "comments" field used to communicate information, updates, and care plans associated with a specific patient; these data are typically recorded in the course of system use and often reflect communication between physicians and nurses. This corpus of data (which is much larger than could be gathered through direct observation of physician-nurse communication) could be analyzed to see how the content or function of the comments differed based on variables such as the type of provider, years of expertise, or the amount of time individuals on the team had worked together. Commonly occurring comments (e.g., requests for non-medication or test orders) might indicate the need for additional system functionality.

Similar natural language processing strategies could be applied to identify patterns of meaning in free-text narratives about safety events in order to identify systematic problems, work-arounds, or applications of expertise that once required manual reviews of patient charts and safety event reports (Fong et al., 2015). While the use of self-reported patient safety events from frontline healthcare providers has long been an ingrained part of healthcare, safety-related methods like Root Cause Analysis are traditionally applied only to a single incident (Hettinger et al., 2013). Natural language processing could be leveraged to identify common patterns across a much larger number of safety events so as to identify likely causes of safety events beyond 'human error', with actionable mitigations.

Informatics methods can also be used to narrow in on a corpus of data for more in-depth manual analysis, using advanced searching techniques to reduce the number of patient safety reports that require manual human review and thus vastly expanding the information available for analysis compared to more traditional cognitive engineering methods, which would typically involve either observations (which may catch only a few examples of a safety-critical event) or interviews (in which individuals might discuss a safety problem, but which would not yield objective data). Recent pilot work by our team regarding look-alike/sound-alike medication errors adopted this hybrid approach. A common medication error involves mix-ups among medications that look alike and/or have names that sound alike (Lambert et al., 1999). We used a list of high-risk medications and automatically searched patient safety reports for text that mentioned at least one pair of medications. This narrowed the search pool from over 80,000 reports to fewer than 400 reports, which could then be systematically reviewed by human experts, who ultimately were able to provide important insights regarding how the medication ordering system could be redesigned to support safer ordering of high-risk medications.
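A minimal sketch of this screening step is shown below. The confusable pairs and report texts are invented; the actual work used a curated list of high-risk medications and a corpus of more than 80,000 reports:

```python
# Illustrative hybrid screening step: reduce a large corpus of free-text
# safety reports to those mentioning both members of at least one
# look-alike/sound-alike medication pair, leaving a small pool for manual
# expert review. Pairs and reports are invented for the sketch.
import re

confusable_pairs = [("hydroxyzine", "hydralazine"),
                    ("clonidine", "klonopin")]

reports = [
    "Pt given hydroxyzine; order was for hydralazine per MD.",
    "Delay in transport to radiology, no medications involved.",
]

def mentions_pair(text, pairs):
    """True if the report text contains both members of any confusable pair."""
    lowered = text.lower()
    return any(
        re.search(rf"\b{re.escape(a)}\b", lowered) and
        re.search(rf"\b{re.escape(b)}\b", lowered)
        for a, b in pairs
    )

flagged = [r for r in reports if mentions_pair(r, confusable_pairs)]
print(len(flagged), "of", len(reports), "reports flagged for review")
```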
Similar natural language processing strategies could be applied to identify patterns of meaning in free-text narratives about safety events in order to identify systematic problems, work-arounds, or applications of expertise that once required manual reviews of patient charts and safety event reports (Fong et al., 2015). While the use of self-reported patient safety events from frontline healthcare providers has long been an ingrained part of healthcare, safety-related methods such as Root Cause Analysis are traditionally applied only to a single incident (Hettinger et al., 2013). Natural language processing could be leveraged to identify common patterns across a much larger number of safety events so as to identify likely causes of safety events, beyond 'human error', with actionable mitigations. Informatics methods can also be used to narrow in on a corpus of data for more in-depth manual analysis, using advanced searching techniques to reduce the number of patient safety reports that require manual human review, thus vastly expanding the information available for analysis compared to more traditional cognitive engineering methods that would typically involve either observations (which may catch only a few examples of a safety-critical event) or interviews (in which individuals might discuss a safety problem, but which would not yield objective data). Recent pilot work by our team regarding look-alike/sound-alike medication errors adopted this hybrid approach. A common medication error involves mix-ups among medications that look alike and/or have names that sound alike (Lambert et al., 1999). We used a list of high-risk medications and automatically searched patient safety reports for text that mentioned at least one pair of medications. This narrowed the search pool from over 80,000 reports to fewer than 400, which could then be systematically reviewed by human experts, who ultimately were able to provide important insights regarding how the medication ordering system could be redesigned to support safer ordering of high-risk medications.
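A minimal sketch of the screening step in this hybrid approach might look as follows. The medication pairs and report texts are invented for illustration; an actual analysis would use a published look-alike/sound-alike list and the full safety report database.

```python
import re

# Illustrative look-alike/sound-alike pairs; a real analysis would use a
# published high-risk medication list rather than this toy sample.
LASA_PAIRS = [
    ("hydroxyzine", "hydralazine"),
    ("clonidine", "klonopin"),
]

# Hypothetical free-text safety reports; in practice these number in the
# tens of thousands.
reports = [
    "Patient given hydralazine 25mg; order was for hydroxyzine 25mg.",
    "Wrong bed assignment corrected before medication administration.",
    "Klonopin dispensed; pharmacist caught clonidine on the MAR.",
]

def mentions_pair(text, pair):
    """True if both medication names in the pair appear in the report."""
    return all(re.search(rf"\b{re.escape(name)}\b", text, re.IGNORECASE)
               for name in pair)

# Keep only reports mentioning both members of at least one pair; this
# small candidate set then goes to expert review.
candidates = [r for r in reports
              if any(mentions_pair(r, p) for p in LASA_PAIRS)]
for r in candidates:
    print(r)
```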
6.2 Temporal Event Analysis

Temporal analysis of large numbers of recorded events is another method that could be used to augment cognitive engineering methods, particularly for understanding practitioner expertise and work-related complexities in using information systems. Identifying temporal relationships among user actions or system inputs could allow comparisons of expert vs. novice strategies in system use, or identification of common problems or errors that reflect system challenges. For example, some electronic health records now collect user data to help identify individuals who may need training targeted at specific system features. The same approach could be used to compare strategies across different types of practitioners or levels of expertise. As one example, logs of electronic health record use could be used to understand physicians' responses to different types of clinical reminders (Vashitz et al., 2009). Similar data could be leveraged to identify specific aspects of the HIT system that are not adequately supporting the cognitive needs of the user and would benefit from a cognitive engineering assessment and re-design. For instance, records of canceled electronic orders immediately followed by a new order could provide insight into problems with the HIT system or difficulties in team communication. Research by Adelman et al. has focused on finding specific patterns of wrong-patient order errors by looking for sequences of actions in a CPOE in which orders were entered for a patient, quickly retracted, and then quickly re-entered for a patient with a different name (Adelman et al., 2013). While this work was not specifically focused on cognitive engineering methods, it provides an excellent example of how informatics can be used in the service of cognitive engineering: in this case, to find "error signatures" indicating poor design of the HIT system. In another example, internal safety work at our institution linked reported patient safety events to a specific HIT interface design. The internal study sought to understand three near misses in which healthy patients were inappropriately prescribed rectal instead of oral acetaminophen. To order 650mg of acetaminophen, the provider had to navigate a list of 18 items: the 3rd item on the list corresponded to the rectal suppository, while the 6th item was the more typical oral selection (Figure 4). While there is no way to know whether all of these errors were caused by the relative placement of orders in the menu, temporal analyses could be applied to count the number of times rectal orders were canceled and replaced with orders for oral medications, thus determining the in situ rate of wrong-route orders in a CPOE system. Further, it would be possible to change the interface and see whether the pattern of activities changed.
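As a rough illustration of how such an "error signature" could be detected, the sketch below scans a hypothetical CPOE event log for orders canceled and re-entered for a different patient within a fixed time window. The event format and the 10-minute threshold are assumptions for illustration; this is not a re-implementation of Adelman et al.'s validated retract-and-reorder measure.

```python
from datetime import datetime, timedelta

# Hypothetical CPOE event log: (timestamp, clinician, patient, drug, action).
events = [
    (datetime(2016, 7, 13, 9, 0), "dr_a", "pt_1", "acetaminophen 650mg PO", "order"),
    (datetime(2016, 7, 13, 9, 2), "dr_a", "pt_1", "acetaminophen 650mg PO", "cancel"),
    (datetime(2016, 7, 13, 9, 4), "dr_a", "pt_2", "acetaminophen 650mg PO", "order"),
]

WINDOW = timedelta(minutes=10)  # illustrative threshold

# Flag cancellations followed, within the window and by the same clinician,
# by the same order on a *different* patient: a possible wrong-patient
# "error signature". The same scan, keyed on route (PR vs. PO) instead of
# patient, could estimate the wrong-route rate discussed above.
for t1, who1, pt1, drug1, act1 in events:
    if act1 != "cancel":
        continue
    for t2, who2, pt2, drug2, act2 in events:
        if (act2 == "order" and who2 == who1 and drug2 == drug1
                and pt2 != pt1 and timedelta(0) < t2 - t1 <= WINDOW):
            print(f"possible wrong-patient order: {who1} {drug1} "
                  f"{pt1} -> {pt2} at {t2:%H:%M}")
```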
6.3 Integration of Data Collection Devices

Because informatics methods allow analysis of large, complex data sets, they facilitate new research and data gathering methods that would have proved unwieldy with more manually intensive analysis. Eye-tracking is one data collection method that is facilitated by more automatic analytic techniques, and that can be valuable in understanding practitioner strategies for seeking and integrating information through sequential analysis of patterns of eye gaze. We used eye tracking to characterize information use strategies in the experimental emergency department information system described above (Fong et al., 2016). This technique provided insight into how experienced users interacted with the prototype interface when performing complex cognitive tasks, such as comparing the acuity of simulated patients. Other new data-generating technologies, such as proximity badges, can be deployed, for instance, to understand movement and possible communication patterns among hospital personnel over a large number of shifts.

INSERT FIGURE 4 HERE

7.0 Conclusions

Cognitive informatics and cognitive engineering share goals related to designing information systems that support the complex, situated work of users. Cognitive engineering provides structured data collection and analytic frameworks that illuminate two complementary aspects of work systems necessary for design: work domain complexities, and expert knowledge and strategies. There are a number of examples that demonstrate how cognitive engineering methods can be productively used in the design of HIT systems. Data analytic methods drawn from informatics provide a powerful new addition to cognitive engineering methods, allowing an understanding of work complexities, expert strategies, and system work-arounds based on a much larger corpus of data than is typically available to cognitive engineering analysts.
ACKNOWLEDGEMENTS

This project was supported by grant number R01HS02254201A1 from the Agency for Healthcare Research and Quality.

REFERENCES

Adelman, J. S., Kalkut, G. E., Schechter, C. B., Weiss, J. M., Berger, M. A., Reissman, S. H., ... & Southern, W. N. (2013). Understanding and preventing wrong-patient electronic orders: a randomized controlled trial. Journal of the American Medical Informatics Association, 20(2), 305-310. doi: 10.1136/amiajnl-2012-001055
Ashoori, M., & Burns, C. (2013). Team cognitive work analysis: structure and control tasks. Journal of Cognitive Engineering and Decision Making, 7(2), 123-140.
Bauer, D., Guerlain, S. A., & Brown, P. J. (2006). Evaluating the use of flowsheets in pediatric intensive care to inform design. In Proceedings of the Human Factors and Ergonomics Society 50th Annual Meeting (pp. 1054–1058). Santa Monica, CA: Human Factors and Ergonomics Society. doi: 10.1177/154193120605001011
Baxter, G. D., Monk, A. F., Tan, K., Dear, P. R. F., & Newell, S. J. (2005). Using cognitive task analysis to facilitate the integration of decision support systems into the neonatal intensive care unit. Artificial Intelligence in Medicine, 35, 243–257. doi: 10.1016/j.artmed.2005.01.004
Beuscart-Zéphir, M., Pelayo, S., & Bernonville, S. (2009). Example of a Human Factors Engineering approach to a medication administration work system: Potential impact. International Journal of Medical Informatics, 9, 43–57. doi: 10.1016/j.ijmedinf.2009.07.002
Beyer, H., & Holtzblatt, K. (1997). Contextual design: defining customer-centered systems. Elsevier.
Bisantz, A. M., & Burns, C. M. (Eds.). (2008). Applications of cognitive work analysis. CRC Press.
Bisantz, A. M., Pennathur, P., Guarrera, T. K., Fairbanks, R. J., Perry, S. J., Zwemer, F., & Wears, R. L. (2010). Emergency department status boards: A case study in information systems transition. Journal of Cognitive Engineering and Decision Making, 4(1), 39–68. doi: 10.1518/155534310X495582
Bisantz, A., Roth, E., & Watts-Englert, J. (2015). Study and Analysis of Complex Cognitive Work. In J. Wilson & S. Sharples (Eds.), Evaluation of Human Work. CRC Press.
Bowman, S. E., & Rogers, W. A. (2016). Understanding Decision Making Among Direct Care Workers in Assisted Living. Journal of Cognitive Engineering and Decision Making, 1555343416656952.
Burns, C. M., & Hajdukiewicz, J. (2004). Ecological Interface Design. CRC Press.
Christian, C. K., Gustafson, M. L., Roth, E. M., Sheridan, T., Gandhi, T. K., Dwyer, K., … Dierks, M. M. (2006). A prospective study of patient safety in the operating room. Surgery, 139(2), 159–173. doi: 10.1016/j.surg.2005.07.037
Clark, L. N., Guarrera, T. K., McGeorge, N. M., Hettinger, A. Z., Hernandez, A., LaVergne, D. T., … Bisantz, A. M. (2014). Usability evaluation and assessment of a novel emergency department IT system developed using a cognitive systems engineering approach. Proceedings of the International Symposium on Human Factors and Ergonomics in Health Care, 3(1), 76–80. doi: 10.1177/2327857914031011
Cook, R. I., & Woods, D. D. (1996). Adapting to new technology in the operating room. Human Factors, 38(4), 593–613. doi: 10.1518/001872096778827224
Crandall, B., Klein, G., & Hoffman, R. (2006). Working minds: A practitioner's guide to cognitive task analysis. MIT Press.
Declerck, G., & Aimé, X. (2014). Reasons (not) to Spend a Few Billions More on EHRs: How Human Factors Research Can Help. Yearbook of Medical Informatics, 9(1), 90.
Deschamps, M., Sanderson, P., Hinckfuss, K., Browning, C., Loeb, R. G., Liley, H., & Liu, D. (2016). Improving the detectability of oxygen saturation level targets for preterm neonates: A laboratory test of tremolo and beacon sonifications. Applied Ergonomics, 56, 160–169. doi: 10.1016/j.apergo.2016.03.013
Dierks, M. M., Christian, C. K., Roth, E. M., & Sheridan, T. B. (2004). Healthcare Safety: The Impact of Disabling "Safety" Protocols. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 34(6), 693–698. doi: 10.1109/TSMCA.2004.836785
Eggleston, R. G. (2003). Work-centered design: a cognitive engineering approach to system design. In Proceedings of the Human Factors and Ergonomics Society 47th Annual Meeting (pp. 263–267). Santa Monica, CA: Human Factors and Ergonomics Society. doi: 10.1177/154193120304700303
Elm, W. C., Gualtieri, J. W., McKenna, B. P., Tittle, J. S., Peffer, J. E., Szymczak, S. S., & Grossman, J. B. (2008). Integrating cognitive systems engineering throughout the systems engineering process. Journal of Cognitive Engineering and Decision Making, 2(3), 249. doi: 10.1518/155534308X377108
Elm, W. C., Potter, S. S., Gualtieri, J. W., Easter, J. R., & Roth, E. M. (2003). Applied Cognitive Work Analysis: A Pragmatic Methodology for Designing Revolutionary Cognitive Affordances. In E. Hollnagel (Ed.), Handbook of Cognitive Task Design (pp. 357–382). Mahwah, NJ: Lawrence Erlbaum Associates.
Endsley, M. R., Bolte, B., & Jones, D. G. (2003). Designing for Situation Awareness. Taylor and Francis.
Endsley, M. R., Hoffman, R., Kaber, D., & Roth, E. (2007). Cognitive Engineering and Decision Making: An Overview and Future Course. Journal of Cognitive Engineering and Decision Making, 1(1), 1–21. doi: 10.1177/155534340700100101
Evenson, S., Muller, M., & Roth, E. M. (2008). Capturing the Context of Use to Inform System Design. Journal of Cognitive Engineering and Decision Making, 2(3), 181–203. doi: 10.1518/155534308X377072
Flanagan, M. E., Saleem, J. J., Militello, L. G., Russ, A. L., & Doebbeling, B. N. (2013). Paper- and computer-based workarounds to electronic health record use at three benchmark institutions. Journal of the American Medical Informatics Association, 20(e1), e59–66. doi: 10.1136/amiajnl-2012-000982
Fong, A., Hoffman, D., Hettinger, A. Z., Fairbanks, R. J., & Bisantz, A. M. (2016). Identifying visual search patterns in eye gaze data: gaining insights into physician visual workflow. Journal of the American Medical Informatics Association. Published online April 2016. doi: 10.1093/jamia/ocv196
Fong, A., Hettinger, A. Z., & Ratwani, R. M. (2015). Exploring methods for identifying related patient safety events using structured and unstructured data. Journal of Biomedical Informatics, 58, 89–95. doi: 10.1016/j.jbi.2015.09.011
Guarrera, T. K., McGeorge, N. M., Clark, L. N., LaVergne, D. T., Hettinger, A. Z., Fairbanks, R. J., & Bisantz, A. M. (2015). Cognitive Engineering Design of an Emergency Department Information System. In A. M. Bisantz, C. M. Burns, & R. J. Fairbanks (Eds.), Cognitive Engineering in Health Care (pp. 43–74). CRC Press.
Guarrera, T., McGeorge, N., Stephens, R., Hettinger, A. Z., Clark, L., Hernandez, A., … Bisantz, A. (2013). Better Pairing of Providers and Tools: Development of an Emergency Department Information System using Cognitive Engineering Approaches. Proceedings of the International Symposium of Human Factors and Ergonomics in Healthcare, 2(1), 63–63. doi: 10.1177/2327857913021012
Gurses, A. P., Xiao, Y., & Hu, P. (2009). User-designed information tools to support communication and care coordination in a trauma hospital. Journal of Biomedical Informatics, 42(4), 667–677. doi: 10.1016/j.jbi.2009.03.007
Hettinger, A. Z., Fairbanks, R. J., Hegde, S., Rackoff, A. S., Wreathall, J., Lewis, V. L., … Wears, R. L. (2013). An evidence-based toolkit for the development of effective and sustainable root cause analysis system safety solutions. Journal of Healthcare Risk Management, 33(2), 11–20. doi: 10.1002/jhrm.21122
Hoffman, R., & Lintern, G. (2006). Eliciting and representing the knowledge of experts. In K. A. Ericsson, N. Charness, P. Feltovich, & R. Hoffman (Eds.), Cambridge Handbook of Expertise and Expert Performance (pp. 203–222). Cambridge University Press.
Hoffman, R. R., Coffey, J. W., & Ford, K. M. (2000). A case study in the research paradigm of human-centered computing: Local expertise in weather forecasting. Unpublished Technical Report, National Imagery and Mapping Agency.
Hu, Y.-Y., Arriaga, A. F., Peyre, S. E., Corso, K. A., Roth, E. M., & Greenberg, C. C. (2012). Deconstructing intraoperative communication failures. Journal of Surgical Research, 177(1), 37–42. doi: 10.1016/j.jss.2012.04.029
Humphrey, C., & Adams, J. (2011). Analysis of complex team-based systems: augmentations to goal-directed task analysis and cognitive work analysis. Theoretical Issues in Ergonomics Science. doi: 10.1080/14639221003602473
Johnson, C. M., Johnson, T. R., & Zhang, J. (2005). A user-centered framework for redesigning health care interfaces. Journal of Biomedical Informatics, 38, 75–87. doi: 10.1016/j.jbi.2004.11.005
Kaber, D. B., Segall, N., Green, R. S., Entzian, K., & Junginger, S. (2006). Using multiple cognitive task analysis methods for supervisory control interface design in high-throughput biological screening processes. Cognition, Technology & Work, 8(4), 237–252. doi: 10.1007/s10111-006-0029-9
Klein, G. A. (1998). Sources of Power: How people make decisions. MIT Press.
Klein, G., Armstrong, A., Woods, D., Gokulachandra, M., & Klein, H. A. (2000). Cognitive wavelength: The role of common ground in distributed replanning. Prepared for AFRL/HECA, Wright-Patterson AFB.
Klein, G. A., Calderwood, R., & MacGregor, D. (1989). Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics, 19(3), 462–472. doi: 10.1109/21.31053
Klinger, D. W., & Hahn, B. B. (2005). Team decision requirement exercise: Making team decision requirements explicit. In Handbook of Human Factors and Ergonomics Methods (pp. 495-501). CRC Press.
Koch, S. H., Weir, C., Haar, M., Staggers, N., & Agutter, J. (2012). Intensive care unit nurses' information needs and recommendations for integrated displays to improve nurses' situation awareness. Journal of the American Medical Informatics Association. doi: 10.1136/amiajnl-2011-000678
Lambert, B. L., Lin, S. J., Chang, K. Y., & Gandhi, S. K. (1999). Similarity as a risk factor in drug-name confusion errors: the look-alike (orthographic) and sound-alike (phonetic) model. Medical Care, 37(12), 1214-1225.
Lee, J. D., & Kirlik, A. (2013). The Oxford Handbook of Cognitive Engineering. Oxford University Press.
McEwen, T., Flach, J., & Elder, N. (2012, July). Ecological interface for assessing cardiac disease. In ASME 2012 11th Biennial Conference on Engineering Systems Design and Analysis (pp. 881-888). American Society of Mechanical Engineers. doi: 10.1115/ESDA2012-82974
McGeorge, N., Hegde, S., Berg, R. L., Guarrera-Schick, T. K., LaVergne, D. T., Casucci, S. N., … Bisantz, A. M. (2015). Assessment of Innovative Emergency Department Information Displays in a Clinical Simulation Center. Journal of Cognitive Engineering and Decision Making, 9(4), 39–346. doi: 10.1177/1555343415613723
Militello, L. G., & Hutton, R. J. B. (1998). Applied Cognitive Task Analysis (ACTA): A practitioner's toolkit for understanding task demands. Ergonomics, 41(11), 1618–1641. doi: 10.1080/001401398186108
P., & Doebbeling, B. N. (2016). Designing Colorectal Cancer Screening Decision Support: A Cognitive Engineering Enterprise. Journal of Cognitive Engineering and Decision Making, 10(1), 74–90. doi: 10.1177/1555343416630875 Militello, L., & Klein, G. (2013). Decision-centered design. The Oxford Handbook of Cognitive. Oxford University Press. Miller, A., & Militello, L. (2015). The role of cognitive engineering in improving clinical decision support. In A. Bisantz, C. Burns, & R. Fairbanks (Eds.), Cognitive Engineering in Health Care (pp. 7–26). CRC Press. Miller, A., & Xiao, Y. (2007). Multi-level strategies to achieve resilience for an organisation operating at capacity: a case study at a trauma centre. Cognition, Technology & Work, 9(2), 51-66. Mumaw, R. J., Roth, E. M., Vicente, K. J., & Burns, C. M. (2000). There is more to monitoring a nuclear power plant than meets the eye. Human Factors, 42(1), 36–55. doi: 10.1518/001872000779656651 Naikar, N. (2009). Beyond the design of ecological interfaces: Applications of work domain analysis and control task analysis to the evaluation of design proposals, team design, and training. Applications of cognitive work analysis, 15-47. Nemeth, C., Anders, S., Strouse, R., Grome, A., Crandall, B., Pamplin, J., … MannSlinas, E. (2016). Developing a Cognitive and Communications Tool for Burn Intensive Care Unit Clinicians. Military Medicine, 2015–13. doi: 10.7205/MILMEDD-15-00173 Norman, D. A., & Draper, S. W. (1986). Cognitive engineering. User Centered System Design, 31–61. L. Erlbaum Associates Inc. Nystrom, D. T., Williams, L., Paull, D. E., Graber, M. L., & Hemphill, R. R. (2016). A Theory-Integrated Model of Medical Diagnosis. Journal of Cognitive Engineering and Decision Making, 1555343415618965. Papautsky, E. L., Crandall, B., Grome, A., & Greenberg, J. M. (2015). A Case Study of Source Triangulation Using Artifacts as Knowledge Elicitation Tools in Healthcare Space Design. Journal of Cognitive Engineering and Decision Making, 9(4), 347358 Parush, A. (2015). Displays for health care teams: A conceptual framework and design methodology. Cognitive Systems Engineering in Health Care. CRC Press. Patel, V. L., & Kannampallil, T. G. (2015). Cognitive informatics in biomedicine and healthcare. Journal of Biomedical Informatics. doi: 10.1016/j.jbi.2014.12.007 Patterson, E. S., Cook, R. I., & Render, M. L. (2002). Improving patient safety by identifying side effects from introducing bar coding in medication administration. Journal of the American Medical Informatics Association, 9, 540–553. doi: 10.1197/jamia.M1061 Patterson, M. D., Militello, L. G., Bunger, A., Taylor, R. G., Wheeler, D. S., Klein, G., & Geis, G. L. (2016). Leveraging the Critical Decision Method to Develop SimulationBased Training for Early Recognition of Sepsis. Journal of Cognitive Engineering and Decision Making, 10(1), 36-56. Patterson, E. S., Rogers, M. L., Tomolo, A. M., Wears, R. L., & Tsevat, J. (2010). Comparison of extent of use, information accuracy, and functions for manual and electronic patient status boards. International Journal of Medical Informatics, 79(12), 817–823. doi: 10.1016/j.ijmedinf.2010.08.002 Peute, L. W., Aarts, J., Bakker, P. J. M., & Jaspers, M. W. M. (2009). Anatomy of a 31
Peute, L. W., Aarts, J., Bakker, P. J. M., & Jaspers, M. W. M. (2009). Anatomy of a failure: A sociotechnical evaluation of a laboratory physician order entry system implementation. International Journal of Medical Informatics, 79(4), e58–e70. doi: 10.1016/j.ijmedinf.2009.06.008
Pew, R., & Mavor, A. (2007). Human-system integration in the system development process: A new look. National Academies Press.
Pfautz, J. D., & Pfautz, S. L. (2009). Methods for the analysis of social and organizational aspects of the work domain. In Applications of Cognitive Work Analysis (pp. 175-228).
Porat, T., Kostopoulou, O., Woolley, A., & Delaney, B. C. (2016). Eliciting User Decision Requirements for Designing Computerized Diagnostic Support for Family Physicians. Journal of Cognitive Engineering and Decision Making, 10(1), 57-73.
Rasmussen, J. (1986). Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering. Elsevier Science Inc.
Roth, E., & Bisantz, A. (2013). Cognitive work analysis. In The Oxford Handbook of Cognitive Engineering (pp. 240–260). Oxford University Press.
Roth, E. M., DePass, B., Scott, R., Truxler, R., Smith, S., & Wampler, J. (in preparation). Designing collaborative planning systems: Putting Joint Cognitive Systems Principles to Practice. In P. Smith & R. R. Hoffman (Eds.), Cognitive Systems Engineering: The Future for a Changing World. Ashgate Publishing Limited.
Roth, E., & Eggleston, R. (2010). Forging new evaluation paradigms: Beyond statistical generalization. In Macrocognition Metrics and Scenarios (pp. 204–219).
Roth, E. M., Multer, J., & Raslear, T. (2006). Shared situation awareness as a contributor to high reliability performance in railroad operations. Organization Studies, 27(7), 967–987. doi: 10.1177/0170840606065705
Roth, E. M., & Patterson, E. S. (2005). Using observational study as a tool for discovery: Uncovering cognitive and collaborative demands and adaptive strategies. In H. Montgomery, R. Lipshitz, & B. Brehmer (Eds.), How Professionals Make Decisions (pp. 379–393). Lawrence Erlbaum Associates.
Roth, E. M., Patterson, E. S., & Mumaw, R. J. (2002). Cognitive Engineering: Issues in User-centered System Design. In J. J. Marciniak (Ed.), Encyclopedia of Software Engineering (2nd ed., pp. 163–179). Wiley Interscience, John Wiley and Sons.
Roth, E. M., Scott, R., Deutsch, S., Kuper, S., Schmidt, V., Stilson, M., & Wampler, J. (2006). Evolvable work-centered support systems for command and control: Creating systems users can adapt to meet changing demands. Ergonomics, 49(7), 688–705. doi: 10.1080/00140130600612556
Saleem, J. J., Flanagan, M. E., Russ, A. L., McMullen, C. K., Elli, L., Russell, S. A., … Wilck, N. (2014). You and me and the computer makes three: variations in exam room use of the electronic health record. Journal of the American Medical Informatics Association, 21(e1), e147–51. doi: 10.1136/amiajnl-2013-002189
Saleem, J. J., Patterson, E. S., Militello, L., Render, M. L., Orshansky, G., Asch, S. M., … Karson, A. (2005). Exploring barriers and facilitators to the use of computerized clinical reminders. Journal of the American Medical Informatics Association, 12(4), 438–47. doi: 10.1197/jamia.M1777
Sanderson, P. M., & Fisher, M. (1996). Exploratory Sequential Data Analysis: Foundations. Human-Computer Interaction, 9, 251–317.
Sanderson, P. M., Watson, M. O., Russell, W. J., Liu, D., Green, N., Llewelyn, K., … Krupenia, S. (2008). Advanced Auditory Displays and Head-Mounted Displays: Advantages and Disadvantages for Monitoring by the Distracted Anesthesiologist. Anesthesia & Analgesia, 106(6), 1787–1797. doi: 10.1213/ane.0b013e31817325cb
Seagull, F. J., & Sanderson, P. M. (2001). Anesthesia Alarms in Context: An Observational Study. Human Factors, 43(1), 66–78. doi: 10.1518/001872001775992453
Seagull, F. J., & Xiao, Y. (2001). Using eye-tracking video data to augment knowledge elicitation in cognitive task analysis. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 45, No. 4, pp. 400-403).
Shachak, A., Hadas-Dayagi, M., Ziv, A., & Reis, S. (2009). Primary Care Physicians' Use of an Electronic Medical Record System: A Cognitive Task Analysis. Journal of General Internal Medicine, 24(3), 341–348. doi: 10.1007/s11606-008-0892-6
Smith, S. W., & Koppel, R. (2014). Healthcare information technology's relativity problems: a typology of how patients' physical reality, clinicians' mental models, and healthcare information technology differ. Journal of the American Medical Informatics Association, 21(1), 117–31. doi: 10.1136/amiajnl-2012-001419
Thyvalikakath, T. P., Dziabiak, M. P., Johnson, R., Torres-Urquidy, M. H., Acharya, A., Yabes, J., & Schleyer, T. K. (2014). Advancing cognitive engineering methods to support user interface design for electronic health records. International Journal of Medical Informatics, 83(4), 292–302. doi: 10.1016/j.ijmedinf.2014.01.007
Vashitz, G., Meyer, J., Parmet, Y., Peleg, R., Goldfarb, D., Porath, A., & Gilutz, H. (2009). Defining and measuring physicians' responses to clinical reminders. Journal of Biomedical Informatics, 42(2), 317–326. doi: 10.1016/j.jbi.2008.10.001
Vicente, K. J. (1999). Cognitive Work Analysis. CRC Press.
Vicente, K. J. (2002). Ecological interface design: Progress and challenges. Human Factors: The Journal of the Human Factors and Ergonomics Society, 44(1), 62-78.
Vicente, K. J., & Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, SMC-22, 589–606. doi: 10.1109/21.156574
Vicente, K. J., Roth, E. M., & Mumaw, R. J. (2001). How do operators monitor a complex, dynamic work domain? The impact of control room technology. International Journal of Human-Computer Studies, 54(6), 831–856. doi: 10.1006/ijhc.2001.0463
Wampler, J., Roth, E., Whitaker, R., Conrad, K., Stilson, M., Thomas-Meyers, G., & Scott, R. (2006). Using Work-Centered Specifications to Integrate Cognitive Requirements into Software Development. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 50(3), 240–244. doi: 10.1177/154193120605000307
Wang, Y. (2008). The theoretical framework of cognitive informatics. International Journal of Cognitive Informatics and Natural Intelligence, 1(1), 1–27.
Watson, M. O., & Sanderson, P. M. (2007). Designing for Attention with Sound: Challenges and Extensions to Ecological Interface Design. Human Factors, 49(2), 331–346. doi: 10.1518/001872007X312531
Watson, M., & Sanderson, P. M. (2004). Sonification Supports Eyes-free Respiratory Monitoring and Task Time-Sharing. Human Factors, 46(3), 497–517. doi: 10.1518/hfes.46.3.497.50401
Wears, R. L., Perry, S. J., Wilson, S., Galliers, J., & Fone, J. (2007). Emergency department status boards: user-evolved artefacts for inter- and intra-group coordination. Cognition, Technology & Work, 9(3), 163–170. doi: 10.1007/s10111-006-0055-7
Woods, D. D. (1993). Process tracing methods for the study of cognition outside of the experimental psychology laboratory. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision-making in Action: Models and Methods (pp. 228–251). Ablex Publishers.
Woods, D. D., & Dekker, S. W. A. (2000). Anticipating the effects of technological change: a new era of dynamics for Human Factors. Theoretical Issues in Ergonomics Science, 1, 272–282. doi: 10.1080/14639220110037452
Woods, D. D., & Roth, E. M. (1988). Cognitive Engineering: Human Problem Solving with Tools. Human Factors: The Journal of the Human Factors and Ergonomics Society, 30(4), 415–430. doi: 10.1177/001872088803000404
Wright, A., Hickman, T. T., McEvoy, D., Aaron, S., Ai, A., Andersen, J. M., ... & Bates, D. W. (2016). Analysis of clinical decision support system malfunctions: a case series and survey. Journal of the American Medical Informatics Association. doi: 10.1093/jamia/ocw005
Xiao, Y. (2005). Artifacts and collaborative work in healthcare: methodological, theoretical, and technological implications of the tangible. Journal of Biomedical Informatics, 38, 26–33. doi: 10.1016/j.jbi.2004.11.004
Xiao, Y., Schenkel, S., Faraj, S., Mackenzie, C. F., & Moss, J. (2007). What Whiteboards in a Trauma Center Operating Suite Can Teach Us About Emergency Department Communication. Annals of Emergency Medicine, 50(4), 387–395. doi: 10.1016/j.annemergmed.2007.03.027
Table 1: Summary of Common Cognitive Engineering Knowledge Gathering Techniques

Common CE Approaches to Knowledge Capture | Brief Description | Notes
Interviews and Focus Groups | Semi-structured interviews with single individuals or multi-person groups | Used broadly to include input from Subject Matter Experts
Critical Decision Method (CDM) | Structured, multi-pass interviewing technique | Focuses on past experiences making critical decisions to elicit decision challenges and elements of expertise
Applied Cognitive Task Analysis (ACTA) | Structured, multi-pass interviewing technique | Focuses on tasks and related knowledge and skills
Concept Mapping | Graphical technique for representing relationships (links) between concepts (nodes) | Can be created in real time during an interview or focus group to support group consensus, or used post-interview to synthesize domain knowledge
Observations | Observations (often unstructured) of work in real contexts or high-fidelity simulations to identify complexities, challenges, work-arounds, and actual work practice | May be combined with other continuous data sources to create a process trace of unfolding events
Artifact Analysis | Analysis of the role that external aids (both high tech and low tech) play in work practice | Artifacts can be both formal (e.g., paper forms, IT systems) or practitioner generated (e.g., 'cheat sheets', sticky-notes)
Table 2: Summary of Cognitive Engineering Analytic Frameworks

Analytic Frameworks | Brief Description | Key Feature
Cognitive Work Analysis (CWA) | Five-phase method encompassing analysis of the work domain, control tasks, strategies, organizational structure, and required knowledge and skills | Used in design of control interfaces that can support unanticipated as well as proceduralized work
Ecological Interface Design (EID) | Interface design method which uses CWA outputs to create designs which support multiple levels of cognitive control | Often uses object or configural display components
Applied Cognitive Work Analysis (ACWA) | Streamlined CWA method that produces work domain analysis, cognitive work requirements, information needs, and design elements | Uses specific artifacts that allow design elements to be traced back through the analysis
Work Centered Design | The results of cognitive analyses are used to define cognitive work requirements that are then translated to display and decision support requirements | Cognitive work requirements take the form of questions that a user must be able to answer 'at a glance'
Decision Centered Design | Cognitive analyses are conducted to identify important decisions, cognitive requirements, barriers and facilitators; results are used to derive explicit cognitive support requirements that guide display design | CDM is often used to identify the critical decisions, barriers and facilitators
Goal-Directed Task Analysis | Structured approach to identifying information needs that decomposes tasks into goals and decisions | Includes a focus on information needed to support situation awareness
Contextual Design | Observations, interviews and artifact analyses are used to document communication patterns, task steps, workplace layout, artifacts, and organizational factors through specific models | This analysis framework comes from the human-computer interaction design community
Figure 1
Figure 1. A schematic depiction of the cognitive engineering approach to user-support, emphasizing the iterative-process nature of the stages along with common work products.
Figure 2
Figure 2. Condensed work domain model of the emergency department. Information requirements associated with shaded nodes were represented in the final set of displays. See Guarrera et al., 2015 for details. Copyright Ann Bisantz, 2015.
Figure 3
Figure 3: One of the seven ED displays created, showing an overall ED summary. If all variables were in the goal range, the display appeared as a green octagon; red or yellow spikes indicated variables that were out of range. See Guarrera et al., 2015 for details. Copyright Ann Bisantz, 2015.
Figure 4

325mg, Soln-Oral, PO, One Time, STAT, ED ONLY
120mg, Supp, PR, One Time, STAT, ED ONLY
650mg, Supp, PR, One Time, STAT, ED ONLY
325mg, Tab, PO, One Time, STAT, ED ONLY
500mg, Tab, PO, One Time, STAT, ED ONLY
650mg, Tab, PO, One Time, STAT, ED ONLY
1000mg, Tab, PO, One Time, STAT, ED ONLY
1000mg, Inj, IVPB, One Time, Indication: Other One time dose
325mg, Soln-Oral, PO, q6h PRN, pain/fever/headache, Indication: Other pain/fever/headache
650mg, Soln-Oral, PO, q6h PRN, pain/fever/headache, Indication: Other pain/fever/headache
325mg, Supp, PR, q6h PRN, pain/fever/headache, Indication: Other pain/fever/headache
650mg, Supp, PR, q6h PRN, pain/fever/headache, Indication: Other pain/fever/headache
325mg, Tab, PO, q4h PRN, pain/fever/headache, Indication: Other pain/fever/headache
650mg, Tab, PO, q4h PRN, pain/fever/headache, Indication: Other pain/fever/headache
650mg, Tab, PO, q4h PRN, pain/fever/headache, Indication: Other pain/fever/headache
650mg, Tab, PO, q6h PRN, pain/fever/headache, Indication: Other pain/fever/headache
650mg, Tab, PO, q6h PRN, pain/fever/headache, Indication: Other pain/fever/headache
650mg, Tab, PO, One Time, STAT, ED ONLY
Figure 4: Order statements when selecting acetaminophen in a computerized provider order entry system
Graphical abstract
Highlights

1. Cognitive engineering (CE) has roots in cognitive science and engineering.
2. It provides structured, analytic methods for data collection and analysis.
3. CE intersects with and complements methods of cognitive informatics.
4. Health information technology systems can benefit from CE-informed design.