Chapter 122

Clinical decision-making in complex healthcare delivery systems

Kristen E. Miller (a,b), Hardeep Singh (c), Ryan Arnold (d), Gary Klein (e)

(a) National Center for Human Factors in Healthcare, MedStar Health, Washington, DC, United States; (b) Georgetown University School of Medicine, Washington, DC, United States; (c) Michael E. DeBakey Veterans Affairs Medical Center, Baylor College of Medicine, Houston, TX, United States; (d) Drexel University College of Medicine, Philadelphia, PA, United States; (e) Shadowbox, LLC, Dayton, OH, United States

Situation

Clinical decision-making is a complex process involving information processing, evaluation of evidence, and application of relevant knowledge to select the appropriate interventions that provide high-quality care and reduce the risk of patient harm (Standing, 2007). Decision-making can range from fast, intuitive, or heuristic to carefully reasoned, analytical, or evidence-based. Clinical decision-making is a contextual, continuous, and evolving process, and a significant amount of research is dedicated to understanding the processes that produce better outcomes for a given situation based on evidence and experience (Tiffen et al., 2014). Challenges to clinical decision-making include the burden of exponentially expanding clinical knowledge, the timeliness and comprehensiveness of available data, and care and choice complexity. Most decision researchers believe that specialties characterized by a high degree of time pressure, data uncertainty, stress, and distractors have the highest incidence of errors (Graber et al., 2012). Decision science within medicine must account for uncertainty and the unknowable when assessing for errors in judgment. Some processes of care are amenable to rigorous decision support with the use of definitive targets and well-prescribed outcomes. Examples include stand-alone decision support systems (simple model-oriented systems based on powerful mainframe computers) that date back to 1959 (Ledley and Lusted, 1959) and checklists for central venous catheter placement to optimize a sterile environment in order to prevent catheter-related infections (Gawande, 2010). This stands in contrast to the variability in clinical diagnosis, diagnostic testing strategies, and treatment approaches for various clinical conditions

or diseases for which there is no unified consensus as to the optimal approach. Clinical decision-making involves (1) a balance of experience, awareness, knowledge, and information gathering; (2) the use of appropriate assessment tools and technologies, including dynamic interactions with colleagues and patients; and (3) the use of evidence-based practice to guide effective decisions that improve patient outcomes and health. To address the increasing complexity of decision-making in the modern era, health care requires new methods and strategies to improve decision-making, prevent errors, and enhance health and health care.

Clinical Engineering Handbook. https://doi.org/10.1016/B978-0-12-813467-2.00123-1. Copyright © 2020 Elsevier Inc. All rights reserved.

Background

Historically, a variety of analytical and intuitive conceptual models have sought to explain the cognitive processes underlying how decisions are made, in order to improve clinical practice. These multidimensional models involve the interplay between knowledge of preexisting pathological conditions, patient information, clinical care, and experiential learning, and include elements of critical thinking, reflection, clinical judgment, and problem solving (Tiffen et al., 2014). Such models include hypothetico-deductive reasoning, Rasmussen's skill, rule, and knowledge (SRK) framework of decision-making, and naturalistic decision-making (NDM). Under these frameworks, humans apply heuristics, mental models, and sensemaking under variability and complexity. One of the most influential models of decision-making is hypothetico-deductive reasoning (Dowding and Thompson, 2004; Elstein et al., 1978), which suggests that individuals go through a series of stages when processing information to make a judgment or diagnosis


(Dowie, 1993). The first stage (cue acquisition) is the gathering of clinical information about the patient. Following the collection of information, hypotheses are generated that provide possible explanations for the information; the information collected is then interpreted in the light of the hypotheses, before the hypothesis favored by the majority of the evidence or information is chosen. At this point decision makers may choose to collect more information if they feel that none of the original hypotheses fit the data. Jens Rasmussen's SRK model (Fig. 1) describes three different levels of cognitive activity during task performance and decision-making (Rasmussen, 1983). The skill-based level describes aspects of performance and decision-making that operate at the subconscious level using stored patterns of preprogrammed actions; here humans act out of habit, without conscious thought. People who make skill-based decisions are typically very experienced with the task at hand. The rule-based level applies when people are familiar with the task but lack deep experience; they look for cues or rules that they recognize from past experience to make a decision. These rules are accumulated through experience and training. The knowledge-based level applies when the task at hand is novel and people have no rules stored from past experience. They resort to analytical processing using conceptual information, which involves problem definition, solution generation, and determining the best course of action (planning) before making a decision.
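The staged cycle described above (cue acquisition, hypothesis generation, interpretation of cues, and choice of the best-supported hypothesis) can be sketched as a simple loop. The conditions, cues, and scoring below are invented purely for illustration and are not clinical guidance:

```python
# Toy knowledge base for illustration only: hypothetical conditions and the
# cues that would support them (invented, not clinical guidance).
KNOWLEDGE = {
    "influenza": {"fever", "cough", "myalgia"},
    "strep_pharyngitis": {"fever", "sore_throat", "tonsillar_exudate"},
}

def weigh(hypothesis, cues):
    """Interpret the collected cues in light of a hypothesis: count how many
    cues the hypothesis explains."""
    return len(KNOWLEDGE[hypothesis] & cues)

def hypothetico_deductive(initial_cues, further_cues):
    """Cue acquisition -> hypothesis generation -> interpretation -> choice,
    looping back to gather more information when no hypothesis fits."""
    cues = set(initial_cues)
    while True:
        hypotheses = [h for h in KNOWLEDGE if KNOWLEDGE[h] & cues]  # generation
        if hypotheses:
            # choose the hypothesis favored by the majority of the evidence
            return max(hypotheses, key=lambda h: weigh(h, cues))
        if not further_cues:
            return None  # no hypothesis fits and nothing more to collect
        cues.add(further_cues.pop(0))  # acquire more information

print(hypothetico_deductive({"fever", "myalgia"}, []))  # → influenza
```

The loop mirrors the model's back edge: when no hypothesis accounts for the data, the decision maker returns to cue acquisition rather than forcing a choice.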

NDM is defined as "the way people use their experience to make decisions in field settings" (Gordon et al., 1998). In the real-world environment, tasks that involve decision-making tend to have the following characteristics: ill-structured problems; uncertain, dynamic environments; information-rich environments where situational cues may change rapidly; cognitive processing that proceeds in iterative action/feedback loops; multiple shifting and/or competing individual and organizational goals; time constraints; high risk; and multiple team members involved in the decision (Klein et al., 1993). Gary Klein's recognition-primed decision (RPD) model is one example of an NDM model (Klein et al., 1986; Klein, 1989). It describes how decision makers can recognize a plausible course of action as the first one to consider (Fig. 2). This model has two main phases. The first is situation recognition, in which a person encountering a challenging condition may notice that it matches a familiar pattern. That recognition match provides the person with guidance about relevant cues to monitor, expectancies about what should happen next, plausible goals to pursue, and reasonable courses of action. The second phase is serial option evaluation, which involves considering the reasonable courses of action one at a time, ready to implement the first one that is acceptable. The option evaluation relies on mental simulation to see whether an option will be workable in the current situation. In most cases, a decision maker only needs to consider a single option. The RPD model is a blend of intuition and analysis: pattern matching is the intuitive part, and mental simulation is the conscious, deliberate, and analytical part (Klein, 2008).

FIG. 1  SRK model (Rasmussen, 1983). [Figure: three levels of cognitive control linking sensory input to actions. Knowledge-based behavior: identification, decision of task, and planning, mediated by symbols in light of goals. Rule-based behavior: recognition, association of state to task, and stored rules for tasks, mediated by signs. Skill-based behavior: feature formation and automated sensorimotor patterns, mediated by signals.]

SECTION | 13  Introduction to human factors

FIG. 2  Recognition-primed decision model (Klein et al., 1993). [Figure: flowchart. Experience the situation in a changing context; if the situation is not familiar, seek more information and reassess. If it is familiar, recognition has four aspects: goals, cues, expectancies, and actions 1..N. If expectancies are violated, reassess the situation; otherwise, mentally simulate each action in turn. If it will work, implement it; if it will work only with changes ("yes, but"), modify it; if it will not work, move to the next action.]

It is estimated that we make 35,000 remotely conscious decisions each day. These decisions are influenced by decision strategies, styles, and inclinations, which include factors such as pattern recognition and cognitive heuristics. A decision strategy is the platform for making choices that moves the clinician toward a certain goal. These subconscious and conscious strategies include, but are not limited to, impulsiveness, compliance, delegating, avoidance/deflection, balancing, and prioritizing/reflecting. Clinicians often make decisions, including diagnoses, by "pattern recognition," using compiled knowledge based on reading, experience, and expertise. A pattern-matching approach makes extensive use of cognitive shortcuts in place of statistical logic. Expert diagnostic reasoning is based on recognition of key or pivotal findings, refinement of hypotheses as more information is learned, early diagnostic hypothesis formation, and quasi-probabilistic reasoning using prevalence.

Heuristics are central to discussions about decision-making. Heuristics are seen as "cognitive shortcuts," rules of thumb, or simple information problem-solving methods, such as trial and error, that lead quickly to solutions. In 2002, the psychologist Daniel Kahneman won a Nobel prize for

his research (with Amos Tversky) systematically identifying and characterizing human decision behaviors. They used the term heuristics to describe decision behaviors as cognitive shortcuts used preferentially to reduce the cognitive cost of decision-making (Kahneman, 2003). Experts are seldom conscious of the heuristic cognitive pathways they use to make decisions. Rules of thumb can help people make quick decisions, and Kahneman and Tversky noted how valuable cognitive heuristics are. Their research also showed that it is possible to create conditions under which the heuristics lead to suboptimal and even incorrect answers, acting in those conditions as a source of bias and error.

Mental models in human thinking and reasoning explain a clinician's thought process and serve as a representation of the world and the relationships between concepts within it. Models help us organize information to accomplish the following: to clarify concepts and propose relationships; to provide a context for interpreting findings; to explain observations; to make research findings meaningful and generalizable; and to stimulate research and the extension of knowledge by providing both direction and impetus. According to Jakob Nielsen, a mental model is based on belief, not facts; that is, it is a model of what users know (or think they know) about a system (Nielsen, 2010). To generate mental models, we "chunk" a massive but finite


amount of fundamental, unchanging knowledge that can be used to evaluate the infinite number of unique scenarios that show up in the real world. We recall hundreds or thousands of these models daily to better understand our current environment and to quickly analyze and solve problems within a given context.

Originally described by Karl Weick, sensemaking is the process of giving structure to the unknown; it requires a person to place individual stimuli or data into a framework (Weick, 1995). The sensemaking approach argues that making a decision first requires an effort to understand an ongoing event, and that such an effort involves initial and evolving impressions, dynamic feedback, and attention shifting to identify and decipher pieces of information. Clinical sensemaking involves assimilating multiple streams of information and observations, testing hypotheses drawn from experience, using intuition to resolve gaps in knowledge, and accommodating (not necessarily resolving) ambiguity. The data-frame theory, an expansion of NDM, postulates that all sensemaking activities are focused on bringing structure to presented data (Klein et al., 2007). Data need to fit within a framework, and the data and frame together help make sense of the world. In data-frame sensemaking, the mental construction of the frame guides the interpretation of data (symptoms, signs, signals, information) while, at the same time, the interpretation of the data guides the selection, modification, and application of the frame, in order to size up complex, ambiguous, and dynamic conditions.
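The two-way relationship in data-frame sensemaking can be sketched as code: the data select a frame, and the selected frame in turn tells us which data to seek next. The frames, cues, and "seek next" lists below are invented for illustration only:

```python
# Toy sketch of data-frame sensemaking (frames, cues, and "seek_next" lists
# are invented for illustration, not clinical content). An anomalous cue
# that contradicts a frame forces reframing.
FRAMES = {
    "community_pneumonia": {
        "fits": {"fever", "cough", "focal_crackles"},
        "anomalous": {"bilateral_leg_edema"},
        "seek_next": ["chest_xray", "oxygen_saturation"],
    },
    "heart_failure": {
        "fits": {"dyspnea", "bilateral_leg_edema", "orthopnea"},
        "anomalous": {"fever"},
        "seek_next": ["bnp_level", "echocardiogram"],
    },
}

def frame_the_data(cues):
    """Select the frame that best accounts for the cues (data guide the frame),
    then report what that frame says to look for next (frame guides the data)."""
    viable = {n: f for n, f in FRAMES.items() if not (f["anomalous"] & cues)}
    if not viable:
        return None, []  # no frame fits: the situation must be reframed
    best = max(viable, key=lambda n: len(viable[n]["fits"] & cues))
    return best, viable[best]["seek_next"]

frame, next_data = frame_the_data({"fever", "cough"})
print(frame, next_data)  # → community_pneumonia ['chest_xray', 'oxygen_saturation']
```

Returning both the frame and the data it directs attention to captures the theory's central claim that interpretation and frame selection proceed together rather than sequentially.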

Assessment

Humans make medical decisions using the cognitive processes described in the models above, but these same cognitive processes can occasionally set us up for cognitive errors that contribute to delayed or incorrect diagnosis or treatment. Clinicians are expected to access, appraise, and incorporate research evidence into their professional judgment, all under the influence of cognitive, emotional, cultural, and environmental factors (Stiegler and Ruskin, 2012). Cognitive influences such as judgment and decision heuristics, paired with clinical strategies based on experience and expertise, can impact situational awareness (SA). Together, these processes enable impressive feats of diagnosis in the face of ambiguity, complexity, and time pressure. However, cognitive performance can never be perfect, and the healthcare community is concerned with the way these processes, especially the heuristics, may contribute to error (Kohn et al., 2000), unexplained practice variability (Brook et al., 2000; Reid et al., 2010; Schuster et al., 1998), and guideline noncompliance (McGlynn et al., 2003; Driskell et al., 2012). Cognitive biases refer to the human tendency to make systematic errors based on the use of heuristics rather than relying on evidence and on analytical reasoning strategies such as

probability theory and Bayesian statistics. The cognitive heuristics clinicians use are quite valuable, and attempts to have clinicians forgo heuristics in favor of completely analytical judgment and decision-making would be counterproductive, if not disastrous. A challenge in health care, therefore, is to find an effective way to inject the moving stream of clinical evidence into the everyday decision-making of practitioners.

SA is considered a predominant concern in systems operation, based on a descriptive view of decision-making. Mica Endsley defines SA as "knowing what's going on" (Endsley, 1995). But understanding is more than information gathering: it implies gathering the right information (all that is needed, but not too much), being able to analyze it, and making projections based on the analysis. Applied to clinical decision-making, the "right" information can be patient-centric or population-centric. Patient-centric SA would incorporate an individual patient's response to a treatment in the context of all of their coexisting medical conditions, such as an impaired immune system after chemotherapy. Population-centric SA would incorporate trends in endemic diseases as data are interpreted, such as evaluating a respiratory illness during influenza season. In the best of all worlds, it also means being able to make effective use of the information. SA is expressed as "high" or "low" and is a precursor to decision-making. Endsley describes SA as consisting of three levels: Level 1 is the perception of the elements in the environment within a volume of time and space, Level 2 is the comprehension of their meaning, and Level 3 is the projection of their status in the near future. Clinicians need to develop expertise to respond to challenging patient health problems, new technologies, and complex healthcare environments (AL-Dossary et al., 2014).
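The analytical benchmark mentioned earlier in this section, Bayesian updating of a diagnostic probability, can be made concrete with the standard odds-likelihood-ratio form. The pretest probability and test characteristics below are illustrative numbers, not drawn from the chapter:

```python
# Worked example of Bayesian diagnostic updating (illustrative numbers):
# convert pretest probability to odds, multiply by the test's likelihood
# ratio, and convert back to a posttest probability.
def posttest_probability(pretest_p, likelihood_ratio):
    """Apply a likelihood ratio to a pretest probability via the odds form."""
    pretest_odds = pretest_p / (1 - pretest_p)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

# Suppose a disease has a 10% pretest probability and a positive test result
# carries a likelihood ratio of 9 (e.g., sensitivity 0.9, false-positive
# rate 0.1). The posttest probability rises to 50%:
print(round(posttest_probability(0.10, 9.0), 2))  # → 0.5
```

The example illustrates why a "positive" result on a good test can still leave substantial uncertainty when the pretest probability is low, which is exactly the kind of calculation that intuitive pattern matching tends to approximate rather than compute.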
Experts are known for efficient and intuitive decision-making processes, while novices are known for more effortful and deliberate processes. Novices tend to use analytical models, characterized as more structured, slow, and often based on only a partial view of the total situation. In contrast, more experienced clinicians use more intuitive models, recognizing patterns and generating quick actions to solve complex problems (Benner, 1984; Bjork and Hamilton, 2011). The transition from novice to expert is characterized by a vast repertoire of pattern recognition and motor programming stored in memory, and is marked by a gradual "off-loading" of control from effortful, self-conscious thought to effortless, implicit processes (Kahneman and Klein, 2009). Research demonstrates that experts reason more efficiently than novices: they have a greater store of compiled knowledge, a wider array of strategic approaches, and an awareness of the diagnostic "weight of evidence" in hypothesis formation. Errors in patient diagnosis can lead to delayed or incorrect testing or treatment and to patient harm. Because


clinicians make decisions in real-world environments that are often time-pressured and chaotic, diagnostic errors have both cognitive and systems origins. The act of diagnosis is a decision made about the patient's presenting problem. Estimates of the incidence of diagnostic error vary, with some concluding that the diagnosis is wrong 10%–15% of the time (Higgs and Elstein, 1995). Aggregate outpatient estimates put the diagnostic error rate at 5.08%, or approximately 12 million US adults annually (Singh et al., 2014). Researchers estimate that about half of these errors could potentially be harmful. In some estimates, more than two-thirds of missed or delayed diagnoses are caused in part by cognitive errors in decision-making (Singh et al., 2012). Research suggests healthcare organizations should integrate decision support tools and bolster error-detection approaches such as trigger tools, voluntary or prompted reports from patients, and physician error reports, in order to learn how clinical decision-making processes unfold in these situations (Graber et al., 2005; Bhise et al., 2018).
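One common style of trigger tool screens records for patterns associated with possible diagnostic error, such as an unplanned return visit shortly after an initial visit. The sketch below is a minimal illustration of that idea; the record fields and the 14-day window are assumptions for demonstration, not the published algorithms:

```python
from datetime import date

# Illustrative e-trigger sketch (fields and the 14-day window are assumed,
# not taken from the cited studies): flag patients with an unplanned return
# encounter shortly after a prior encounter, a pattern used to screen
# records for possible diagnostic error.
def flag_unplanned_returns(encounters, window_days=14):
    flagged = []
    last_seen = {}  # patient id -> date of most recent prior encounter
    for enc in sorted(encounters, key=lambda e: e["date"]):
        prev = last_seen.get(enc["patient"])
        if prev and enc["unplanned"] and (enc["date"] - prev).days <= window_days:
            flagged.append(enc["patient"])
        last_seen[enc["patient"]] = enc["date"]
    return flagged

visits = [
    {"patient": "A", "date": date(2020, 1, 2), "unplanned": False},
    {"patient": "A", "date": date(2020, 1, 10), "unplanned": True},  # 8 days later
    {"patient": "B", "date": date(2020, 1, 3), "unplanned": False},
    {"patient": "B", "date": date(2020, 2, 20), "unplanned": True},  # outside window
]
print(flag_unplanned_returns(visits))  # → ['A']
```

In practice, a flagged record is not itself evidence of error; it selects charts for human review, which is why such triggers are paired with the reporting channels mentioned above.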

Recommendation(s)

Rapid advancements in research, scientific discovery, and health information technology (HIT) are generating innovative decision-making strategies and technologies to support clinician behavior. Solutions must be collaboratively developed across stakeholders to address key challenges and to accelerate the availability of reliable information resources that are smoothly and affordably incorporated into patient-centered, clinician-friendly workflows. Clinical decision-making in modern, patient-centered, HIT-enabled environments requires us to reimagine and rethink decision-making amid this complexity. Additional factors related to decision-making must be considered, including diagnostic uncertainty and lack of knowledge. Diagnostic uncertainty is defined as a "subjective perception of an inability to provide an accurate explanation of the patient's health problem." Simply put, there are some clinical presentations in which the underlying disease is unknowable using current technologies. Methodological advancements in measuring diagnostic uncertainty can improve our understanding of diagnostic decision-making and inform interventions to reduce diagnostic errors and the overuse of healthcare resources (Bhise et al., 2018). There is increasing recognition of the need for a better understanding of medical decision-making: how to incorporate the known limitations inherent to human cognition into solutions while preserving the considerable strengths that experienced clinicians bring, and how to accelerate expertise. When designed appropriately, HIT interventions can improve patient outcomes (Chaudhry et al., 2006). Clinical trigger tools, which identify deteriorating patients, can provide clinical decision support (CDS) with knowledge and person-specific

information, intelligently filtered and presented at appropriate times, to enhance healthcare delivery (Osheroff et al., 2007). Algorithms and clinical guidelines can help clinicians organize the way they assemble and interpret information. At least, that is the opportunity created by such tools. Nevertheless, the healthcare community also needs to be mindful of the difficulties of making HIT tools work in practice. When CDS is applied effectively, it has been shown to enhance health outcomes, prevent adverse events, improve efficiency, reduce costs, and boost provider and patient satisfaction (Umscheid et al., 2015). When inappropriately applied, however, CDS can oversimplify a complex presentation: it may appear to provide clear direction but narrow one's focus too quickly, limiting thoughts and options that otherwise would have been considered. Clinicians can also turn to publicly available tools such as ePocrates, UpToDate, DynaMed, EvidenceCare, and VisualDx to access synthesized evidence databases (Alper et al., 2005).

Although patients are far more informed than they were 20 or 30 years ago, some patients express frustration and dissatisfaction with their care because they do not feel they have adequate (if any) input into the decisions that clinicians are making about their health care. There is considerable evidence that many patients want more information and greater involvement in decision-making in partnership with their doctors (Deber et al., 1996). Shared decision-making is a key component of patient-centered health care. Patients and their families increasingly expect to be given information on the patient's condition and treatment options, and they want the clinical team to take their individual preferences into account, reflecting the patient's life circumstances, socioeconomic status, health insurance coverage, work schedule, support structure, and religious and cultural preferences.
Complicating the decision-making process is the fact that some decisions about preventive testing, diagnostic workups, and treatment options are driven by physicians' preferences (shaped by medical training, local norms, and personal experience) rather than by scientific evidence. As a result, we see tremendous and well-documented variations in care. The concept of shared or negotiated decision-making, with the patient as an active partner, brings with it issues of informed consent and comprehensible risk communication. Training and education of healthcare providers are imperative to meeting the increasing challenges and demands of today's healthcare system. The "ShadowBox" training approach allows novices to "see the world through the eyes of the experts" without the experts being present (Klein and Borders, 2016). The approach can build the cognitive and perceptual skills needed for adaptive decision-making. ShadowBox presents challenging scenarios and intersperses decision points at which the participant is asked to rank a given set of options. These may be options about


which course of action to choose, which goal to prioritize, which cues to monitor carefully, or which pieces of information to gather. The participant's responses are compared to those of experts recorded earlier. Once the participant provides rankings and a rationale, he or she sees what the panel of experts ranked, along with the experts' rationale, thereby noticing what he or she had missed. Through this approach, participants see how experts would react and learn what the experts were thinking about, so that novices begin to see situations as the experts would. This reflection fosters the development of insights that aid the evolution from novice to expert thinking and the identification of potential improvements in the clinical decision-making process, in a learner-centered manner.
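The comparison step in ShadowBox-style training, scoring a participant's option ranking against the expert panel's, could be implemented with any rank-distance measure. The normalized Spearman footrule used below, and the option labels, are illustrative choices for this sketch, not the published scoring method:

```python
# Illustrative scoring of a trainee's option ranking against an expert
# ranking, using 1 minus the normalized Spearman footrule distance
# (1.0 = identical order, 0.0 = maximally displaced). The option labels
# are invented; the scoring scheme is an assumption, not ShadowBox's own.
def ranking_agreement(trainee, expert):
    """Both arguments are orderings of the same set of options."""
    assert sorted(trainee) == sorted(expert), "rankings must cover the same options"
    expert_pos = {option: i for i, option in enumerate(expert)}
    displacement = sum(abs(i - expert_pos[opt]) for i, opt in enumerate(trainee))
    max_displacement = len(trainee) ** 2 // 2  # footrule worst case
    return 1 - displacement / max_displacement

expert_panel = ["order chest CT", "start antibiotics", "observe", "discharge"]
trainee_pick = ["start antibiotics", "order chest CT", "observe", "discharge"]
print(ranking_agreement(trainee_pick, expert_panel))  # → 0.75
```

A single number like this supports learner-centered feedback: the trainee sees not just the experts' ordering and rationale, but how far their own ordering was from it at each decision point.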

References

AL-Dossary, R., Panagiota, K., Maddox, P.J., 2014. The impact of residency programs on new nurse graduates' clinical decision-making and leadership skills: a systematic review. Nurse Educ. 34 (6), 1024–1028.
Alper, B.S., White, D.S., Ge, B., 2005. Physicians answer more clinical questions and change clinical decisions more often with synthesized evidence: a randomized trial in primary care. Ann. Fam. Med. 3, 507–513.
Benner, P., 1984. From Novice to Expert: Excellence and Power in Clinical Nursing Practice. Addison-Wesley Publishing Company, Menlo Park, CA.
Bhise, V., Rajan, S.S., Sittig, D., et al., 2018. Defining and measuring diagnostic uncertainty in medicine: a systematic review. J. Gen. Intern. Med. 33 (1), 103–115.
Bjork, I.T., Hamilton, G.A., 2011. Clinical decision making of nurses working in hospital settings. Nurs. Res. Pract. 1–8.
Brook, R.H., McGlynn, E.A., Shekelle, P.G., 2000. Defining and measuring quality of care: a perspective from US researchers. Int. J. Qual. Health Care 12, 281–295.
Chaudhry, B., Wang, J., Wu, S., et al., 2006. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann. Intern. Med. 144, 742–752.
Deber, R.B., Kraetschmer, N., Irvine, J., 1996. What role do patients wish to play in treatment decision making? Arch. Intern. Med. 156 (13), 1414–1420.
Dowding, D., Thompson, C., 2004. Using judgment to improve accuracy in decision making. Nurs. Times 100 (22), 42.
Dowie, J., 1993. Clinical decision analysis: background and introduction. In: Llewelyn, H., Hopkins, A. (Eds.), Analysing How We Reach Clinical Decisions. Royal College of Physicians, London.
Driskell, O.J., Holland, D., Hanna, F.W., et al., 2012. Inappropriate requesting of glycated hemoglobin (HbA1c) is widespread: assessment of prevalence, impact of national guidance, and practice-to-practice variability. Clin. Chem. 58, 906–915.
Elstein, A.S., Shulman, L., Sprafka, S., 1978. Medical Problem Solving: An Analysis of Clinical Reasoning. Harvard University Press, Cambridge, MA.
Endsley, M.R., 1995. Toward a theory of situation awareness in dynamic systems. Hum. Factors 37, 32–64.
Gawande, A., 2010. The Checklist Manifesto: How to Get Things Right. Metropolitan Books, New York.
Gordon, S.E., Liu, Y., Wickens, C.D., 1998. An Introduction to Human Factors Engineering. Longman, New York.
Graber, M.L., Franklin, N., Gordon, R., 2005. Diagnostic error in internal medicine. Arch. Intern. Med. 165, 1493–1499.
Graber, M.L., Kissam, S., Payne, V.L., et al., 2012. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual. Saf. 21, 535–557.
Higgs, J., Elstein, A., 1995. Clinical reasoning in medicine. In: Higgs, J. (Ed.), Clinical Reasoning in the Health Professions. Butterworth-Heinemann, Oxford, pp. 49–59.
Kahneman, D., 2003. A perspective on judgment and choice: mapping bounded rationality. Am. Psychol. 58, 697–720.
Kahneman, D., Klein, G., 2009. Conditions for intuitive expertise: a failure to disagree. Am. Psychol. 64 (6), 515–526.
Klein, G., 1989. Recognition-primed decisions. In: Rouse, W.B. (Ed.), Advances in Man-Machine Systems Research. JAI Press, Greenwich, CT, pp. 47–92.
Klein, G., 2008. Naturalistic decision making. Hum. Factors 50 (3), 456–460.
Klein, G., Borders, J., 2016. The ShadowBox approach to cognitive skills training. J. Cogn. Eng. Decis. Mak. 10 (3), 268–280.
Klein, G., Calderwood, R., Clinton-Cirocco, A., 1986. Rapid decision making on the fire ground. In: Proceedings of the Human Factors and Ergonomics Society 30th Annual Meeting, vol. 1, pp. 576–580.
Klein, G., Orasanu, J., Calderwood, R., Zsambok, C., 1993. Decision Making in Action: Models and Methods. Ablex Publishing, Westport, CT.
Klein, G., Phillips, J.K., Rall, E.L., Peluso, D.A., 2007. A data-frame theory of sensemaking. In: Expertise Out of Context: Proceedings of the Sixth International Conference on Naturalistic Decision Making, pp. 113–155.
Kohn, L.T., Corrigan, J., Donaldson, M.S., 2000. To Err Is Human: Building a Safer Health System. National Academy Press, Washington, DC.
Ledley, R.S., Lusted, L.B., 1959. Reasoning foundations of medical diagnosis: symbolic logic, probability, and value theory aid our understanding of how physicians reason. Science 130 (3366), 9–21.
McGlynn, E.A., Asch, S.M., Adams, J., et al., 2003. The quality of health care delivered to adults in the United States. N. Engl. J. Med. 348, 2635–2645.
Nielsen, J., 2010. Mental Models. Nielsen Norman Group.
Osheroff, J.A., Teich, J.M., Middleton, B., et al., 2007. A roadmap for national action on clinical decision support. J. Am. Med. Inform. Assoc. 14, 141–145.
Rasmussen, J., 1983. Skills, rules, and knowledge: signals, signs, and symbols, and other distinctions in human performance models. IEEE Trans. Syst. Man Cybern. 3, 257–266.
Reid, R.O., Friedberg, M.W., Adams, J.L., et al., 2010. Associations between physician characteristics and quality of care. Arch. Intern. Med. 170, 1442–1449.
Schuster, M.A., McGlynn, E.A., Brook, R.H., 1998. How good is the quality of health care in the United States? Milbank Q. 76, 517–563.
Singh, H., Giardina, T., Forjuoh, S., et al., 2012. Electronic health record-based surveillance of diagnostic errors in primary care. BMJ Qual. Saf. 21 (2), 93–100.
Singh, H., Meyer, A., Thomas, E., 2014. The frequency of diagnostic errors in outpatient care: estimations from three large observational studies involving US adult populations. BMJ Qual. Saf. 23 (9), 727–731.
Standing, M., 2007. Clinical decision-making skills on the developmental journey from student to registered nurse: a longitudinal inquiry. J. Adv. Nurs. 60 (3), 257–269.
Stiegler, M.P., Ruskin, K.J., 2012. Decision-making and safety in anesthesiology. Curr. Opin. Anaesthesiol. 25, 724–729.
Tiffen, J., Corbridge, S., Slimmer, L., 2014. Enhancing clinical decision making: development of a contiguous definition and conceptual framework. J. Prof. Nurs. 20 (5), 399–405.
Umscheid, C.A., Betesh, J., VanZandbergen, C., et al., 2015. Development, implementation, and impact of an automated early warning and response system for sepsis. J. Hosp. Med. 10 (1), 26–31.
Weick, K.E., 1995. Sensemaking in Organizations. Sage.