Social Science & Medicine 66 (2008) 403–413
The meaning of justice in safety incident reporting

Bryan Jeffrey Weiner, Cherri Hobgood, Megan A. Lewis
Department of Health Policy and Administration, University of North Carolina at Chapel Hill, 1102-C McGavran-Greenberg Hall, Chapel Hill, NC 27510, USA
Office of Educational Development, School of Medicine, 322 MacNeider Building, CB 7530, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-7530, USA
RTI International, 3040 Cornwallis Road, P.O. Box 12194, Research Triangle Park, NC 27709-2194, USA

Available online 18 October 2007
Corresponding author: B.J. Weiner, tel.: +1 919 966 7375; fax: +1 919 966 6961. E-mail addresses: [email protected] (B.J. Weiner), [email protected] (C. Hobgood), [email protected] (M.A. Lewis).
doi:10.1016/j.socscimed.2007.08.013
Abstract

Safety experts contend that to make incident reporting work, healthcare organizations must establish a "just" culture—that is, an organizational context in which health professionals feel assured that they will receive fair treatment when they report safety incidents. Although healthcare leaders have expressed keen interest in establishing a just culture in their institutions, the patient safety literature offers little guidance as to what the term "just culture" really means or how one goes about creating a just culture. Moreover, the safety literature does not indicate what constitutes a just incident reporting process in the eyes of the health professionals who provide direct patient care. This gap is unfortunate, for knowing what constitutes a just incident reporting process in the eyes of front-line health professionals is essential for designing useful information systems to detect, monitor, and correct safety problems. In this article, we seek to clarify the conceptual meaning of just culture and identify the attributes of incident reporting processes that make such systems just in the eyes of health professionals. To accomplish these aims, we draw upon organizational justice theory and research to develop a conceptual model of perceived justice in incident reporting processes. This model could assist those healthcare leaders interested in creating a just culture by clarifying the multiple meanings, antecedents, and consequences of justice.
© 2007 Elsevier Ltd. All rights reserved.

Keywords: Patient safety; Just culture; Incident reporting; Organizational justice
Introduction

To improve patient safety, healthcare organizations are implementing incident reporting systems in order to collect information directly from health professionals about near misses, medical errors, and
adverse events—hereafter referred to generically as safety incidents (Barach & Small, 2000; Wald & Shojania, 2001). Obtaining such information is considered crucial for identifying risky situations, analyzing underlying causes, taking corrective action, and implementing prevention efforts (Institute of Medicine, 2000). Those directly involved in patient care are said to possess important safety-related information that cannot be obtained through retrospective peer review or computerized surveillance systems (Barach & Small, 2000; Olsen et al., 2007; O'Neil et al., 1993; Reason,
2000; Wald & Shojania, 2001; Welsh, Pedot, & Anderson, 1996).

Although healthcare organizations have expended substantial effort to promote incident reporting, studies suggest that underreporting is pervasive (Cullen et al., 1995; Kopp, Erstad, Allen, Theodorou, & Priestley, 2006; Wald & Shojania, 2001). Many observers attribute underreporting to the punitive ("name and blame") approach that many healthcare organizations have taken with regard to safety incidents. By inculcating a sense of fear, the punitive approach discourages reporting and, in doing so, prevents organizational learning and improvement (Barach & Small, 2000; Blegen et al., 2004; Kadzielski & Martin, 2002; Kingston, Evans, Smith, & Berry, 2004; Manasse, Eturnbull, & Diamond, 2002; Wakefield et al., 2001, 1999). By comparison, the "non-punitive" approach that the airline industry has taken with regard to incident reporting is seen as a significant contributing factor to the industry's impressive safety record (Marx, 2001; Reason, 2000).

To make incident reporting work, safety experts contend, healthcare organizations must establish a "just" culture—that is, an organizational context in which health professionals feel assured that they will receive fair treatment when they report safety incidents (Beyea, 2004; Institute of Medicine, 2003; Kizer, 1999; Marx, 2001). In the United States, healthcare leaders have expressed considerable interest in establishing a just culture in their institutions (Agency for Healthcare Research and Quality, 2005; Emery Center on Health Outcomes in Quality, 2004; No Author, 2002; O'Leary, 2003; University of Michigan Medical School, 2005). Yet, many are grappling with the questions of what the term "just culture" really means and how one goes about creating a just culture (Beyea, 2004). At present, the patient safety literature provides little guidance on either issue.

Importantly, the patient safety literature does not indicate what constitutes a just (or fair) incident reporting process in the eyes of health professionals who work on the "sharp end" of the delivery system. This lacuna is unfortunate, since incident reporting processes work only if those on the front-lines perceive the design and operation of such processes as just (or fair). This is true for both voluntary and mandatory incident reporting systems (Barach & Small, 2000; Reason, 1997). Knowing what constitutes a just incident reporting process in the eyes of health professionals is therefore essential for designing useful information
systems for detecting, monitoring, and correcting safety problems.

In this article, we seek to clarify the conceptual meaning of just culture and identify the attributes of incident reporting processes that make such systems just (or fair) in the eyes of health professionals. To accomplish these aims, we draw upon organizational justice theory and research to develop a conceptual model of perceived justice in incident reporting processes. This model could assist those healthcare leaders interested in creating a "just culture" by clarifying the meaning, antecedents, and consequences of justice.

The concept of a just culture

A just culture is seen by some experts as an integral aspect of a broader culture of safety (Institute of Medicine, 2003; Kizer, 1999). Indeed, Reason (1997) considers it the foundation of a culture of safety. Surprisingly, despite the importance ascribed to it, no concise definition of just culture exists. The more general term "organizational culture" refers to the shared pattern of beliefs, assumptions, and expectations that are held by organizational members and that shape their interaction with each other and with stakeholders outside the organization (Bowditch & Buono, 2001). A just culture, then, is one in which the beliefs, assumptions, and expectations that govern behavior in an organization conform to generally held principles of moral conduct.

Although the term "just culture" can be construed broadly, it is often used more narrowly to refer to the beliefs, assumptions, and expectations that govern accountability and discipline for unsafe acts (e.g., near misses, medical errors, and adverse events). A just culture, experts say, is a "non-punitive" environment in which individuals can report errors or close calls without fear of reprimand, rebuke, or reprisal (Blegen et al., 2004; Karadeniz & Cakmakci, 2002; Kingston et al., 2004; Pizzi, Goldfarb, & Nash, 2001; Wakefield et al., 1999; Wild & Bradley, 2005). At the same time, they assert, a just culture is not an environment wherein no accountability exists (Beyea, 2004). Failing to discipline those who commit unsafe acts due to incompetence or recklessness is just as much a violation of widely accepted moral principles as is punishing those who commit honest mistakes. A just culture, therefore, stands between a "blaming" or punitive culture, on the one hand, and a
"no-blame" or "anything-goes" culture, on the other. This view reflects the connotation of balance typically associated with the terms "just" or "fair."

In a just culture, safety experts say, health professionals feel assured that they will receive fair treatment when they report safety incidents (Beyea, 2004; Institute of Medicine, 2003; Kizer, 1999; Marx, 2001). The expectation of fair treatment is important because it engenders a sense of trust, or as Edmondson (2002) calls it, psychological safety. Reporting safety-related information creates a sense of uncertainty and vulnerability, both for those implicated in an incident and for those who report it. Establishing a work environment in which people perceive that they will receive fair treatment enhances trust by providing some degree of predictability about the process and the outcome that follows incident reporting, as well as some assurance about the intentions and motives of those involved in incident investigation, resolution, and remediation. Research in healthcare and other industries indicates that people are reluctant to report safety incidents or other organizational problems if they perceive that doing so exposes them to retaliation, ostracism, or other unjustified negative consequences (Edmondson, 2002; Elder, Graham, Brandt, & Hickner, 2007; Jeffe et al., 2004; Milliken, Morrison, & Hewlin, 2003). Conversely, organizational members are more likely to report information about problems, issues, and concerns if they feel that they will be supported and protected from undeserved (i.e., unfair) treatment by managers, peers, and subordinates (Ashford, Rothbard, Piderit, & Dutton, 1998; Dutton, Ashford, Lawrence, & Miner-Rubino, 2002; Dutton, Ashford, Oneill, Hayes, & Wierba, 1997).

A just culture, in sum, seeks to balance the need to learn from mistakes and the need to take disciplinary action where appropriate (Marx, 2001). A just culture supports and encourages incident reporting by reinforcing core tenets of a culture of safety, namely that: (a) human error is inevitable, even in the best organizations; (b) errors do not occur randomly, but rather display recurrent patterns; (c) latent conditions, which represent "resident pathogens" in work systems, generate recurrent error patterns; and (d) effective risk management and organizational improvement depend on open communication about safety incidents (Institute of Medicine, 2003; Kizer, 1999). At the same time, a just culture recognizes that, depending on the circumstances, protecting patients
from future harm sometimes requires disciplinary action.

Creating a just culture: some unmet needs

Although the idea of a just culture has been endorsed by the US Institute of Medicine and other proponents of patient safety (Agency for Healthcare Research and Quality, 2005; Institute of Medicine, 2003; Marx, 2001; Reason, 2000), the safety literature offers little guidance on how to create a just culture. The safety climate literature, for example, has focused primarily on the question of whether safety compliance, defined as "adherence to safety procedures and carrying out work in a safe manner" (Clarke, 2006; Neal, Griffin, & Hart, 2000, p. 101), is associated with safety performance. Studies in this area indicate that employees' shared perceptions of safety policies and procedures (i.e., safety climate) influence safety compliance which, in turn, affects employee injuries, unsafe acts, and accident involvement (Clarke, 2006). Although less studied, evidence suggests that employees' shared perceptions of safety policies and procedures also influence their level of safety participation, defined as "helping coworkers, promoting the safety program within the workplace, and putting effort into improving patient safety" (Clarke, 2006, p. 101; Neal et al., 2000). In a recent study, Naveh, Katz-Navon, and Stern (2006) found that physicians' and nurses' shared perceptions of safety procedures were positively associated with their willingness to report treatment errors. However, the study focused on physicians' and nurses' shared perceptions of the relevance and suitability of safety procedures for their departments' routine work flow, not their shared perceptions of the justice (or fairness) of the safety procedures. Undoubtedly a connection exists between safety climate and just culture. To date, however, safety climate research has not theoretically articulated or empirically explored this connection.

Writing in a more prescriptive vein, safety experts have offered a few guiding principles and "lessons learned" from other industries about how to create a just culture. For example, Reason (1997, 2000) and Marx (2001) have emphasized the importance of establishing general agreement on where to draw the line between culpable and non-culpable unsafe acts. Likewise, others have stressed the importance of confidentiality in error reporting processes and rapid action in response to reported events (Barach
& Small, 2000). These, and other principles and lessons, represent valuable initial contributions to the knowledge base. However, three limitations exist that create a significant knowledge gap and practical need.

First, as noted earlier, the term "just culture" has not been clearly defined or fully articulated by safety experts. Theory and research in other social science fields, however, suggest that justice is a multidimensional construct. Establishing agreement on where to draw the line between culpable and non-culpable unsafe acts, for example, is only one aspect of justice (namely, distributive justice). It may not even be the most important aspect in the eyes of health professionals. This issue is not a trivial one. Conceptual ambiguity about what constitutes justice not only stymies meaningful dialog among safety experts and healthcare professionals, but also thwarts organizational efforts to build a just culture.

Second, the current knowledge base lacks detail about what constitutes a just incident reporting process, especially from the viewpoint of health professionals directly providing care. Justice is a perceived attribute, not an objective feature of decision-making processes (Emery Center on Health Outcomes in Quality, 2004; Marx, 2001; Singer et al., 2003). Creating a just culture depends on knowing not only what attributes define a just incident reporting process in the eyes of health professionals, but also whether health professionals value some attributes more than others. Such information is vital for the design and operation of robust incident reporting processes.

Finally, an underlying assumption exists in the patient safety literature that a just incident reporting process and, by implication, a just culture, looks the same for all health professionals. It may be the case, however, that health professionals hold different views on what constitutes a just incident reporting process depending on the culture of the profession or the specialty into which they were socialized. To the extent that systematic variation exists, a "one-size-fits-all" approach to creating a just culture may not work.

In the sections that follow, we develop a conceptual model of justice in incident reporting processes that draws upon the extensive body of theory and research on organizational justice found in the general management literature. This conceptual model could help guide empirical research on error reporting and patient safety, as well as assist practical efforts to create a just culture in healthcare organizations.
Organizational justice

Organizational justice refers to people's perceptions of fairness in the workplace (Greenberg, 1987). Informed by social psychology, the organizational justice literature encompasses hundreds of studies examining the meaning, antecedents, and consequences of justice in a variety of workplace settings (e.g., schools, manufacturing plants, service firms, and joint ventures), across a wide range of decision-making contexts (e.g., performance evaluation, strategy implementation, recruiting, pay raises, promotions, and layoffs), and through both experimental and non-experimental methods (Cohen-Charash & Spector, 2001; Colquitt, Greenberg, & Zapata-Phelan, 2005; Viswesvaran & Ones, 2002).

This extensive body of research supports the following claims. First, justice is a matter of perception. People may differ in what they consider fair. [Justice scholars use the terms "justice" and "fairness" interchangeably.] Second, justice is a pervasive concern in workplace settings. As Colquitt and his colleagues note, people value organizational justice for a variety of reasons:

Perceptions of fairness reinforce the perceived trustworthiness of authorities, reducing fears of exploitation while providing an incentive to cooperate with one's coworkers. On a more personal level, fairness satisfies several individual needs, such as the need for control and the needs for esteem and belonging (Colquitt et al., 2005, pp. 5–6).

Third, justice is a multidimensional construct. People care not only about the fairness of the outcomes that they receive, but also about the fairness of the procedures used to decide the outcomes. Moreover, people care about the way they are treated by decision makers (i.e., the social aspects of decision-making processes). Finally, organizational justice matters. That is, people's perceptions of justice affect their attitudes and behavior with respect to a variety of personal, social, and organizational outcomes (Cohen-Charash & Spector, 2001; Colquitt, Conlon, Wesson, Porter, & Ng, 2001; Viswesvaran & Ones, 2002).

Justice scholars have developed several theories to explain why justice matters. These theories can be grouped into three broad perspectives (Gillespie & Greenberg, 2005). The instrumental perspective holds that people care about justice because it provides control over outcomes, which serves to
maximize the favorability of outcomes received. For example, social exchange theory holds that people maintain the norm of reciprocity in order to accrue and maintain valued benefits (Blader & Tyler, 2005). The relational perspective suggests that people care about justice because it enhances their feelings of self-worth and acceptance by others. For example, the group engagement model suggests that people derive a sense of pride, respect, and belongingness based on the justice they experience in organizations (Blader & Tyler, 2005; Tyler & Blader, 2003). Finally, the moral virtues perspective proposes that people care about justice because it provides basic respect for human dignity and worth. The moral virtues model, for example, holds that people view justice as a moral issue and seek to belong to, work on behalf of, or disassociate from organizations based on whether they see those organizations as respecting this moral imperative (Cropanzano, Byrne, Bobocel, & Rupp, 2001a, 2001b). Justice scholars have noted that these broad explanatory perspectives (and the specific theories that they encompass) are not mutually exclusive and may, in fact, be complementary (Blader & Tyler, 2005).

Organizational justice applied to incident reporting

Fig. 1 (below) depicts a conceptual model of the role of justice in incident reporting. The model offers a distillation and "translation" of current organizational justice theory and research as it applies, or could apply, to incident reporting and patient safety. Briefly, the model posits that health professionals' perceptions of justice depend on three characteristics of incident reporting processes: distribution rules, procedural rules, and social treatment.
Justice perceptions, in turn, generate a variety of affective and behavioral reactions, some of which relate directly to patient safety (e.g., employee silence/underreporting), while others relate more indirectly and have broader organizational implications (e.g., trust in supervisors). As the dashed lines indicate, professional and specialty culture might or might not influence health professionals' sense of what constitutes a fair incident reporting process. Exploring this uncertainty is an important task for future research.

Justice perceptions: Organizational justice theory and research recognize four dimensions (or aspects) of justice (Colquitt, 2001; Colquitt et al., 2001, 2005). Distributive justice refers to the perceived fairness of the distribution of resources, rewards, penalties, and other outcomes. Procedural justice refers to the perceived fairness of the procedures that lead to these outcomes. Interpersonal justice refers to the perceived fairness of the interpersonal treatment that people receive in the enactment of organizational procedures. Informational justice refers to the perceived fairness of the amount, accuracy, and timeliness of information provided about the procedures used and the outcomes received. Interpersonal justice and informational justice differ from procedural justice in that both emphasize the social—as opposed to structural—aspects of decision-making processes. While often highly correlated, these four dimensions of justice represent distinct constructs that are empirically distinguishable from one another, linked to different antecedent conditions, and correlated in varying degrees with different outcomes (Cohen-Charash & Spector, 2001; Colquitt, 2001; Colquitt et al., 2001; Viswesvaran & Ones, 2002).

Fig. 1. Proposed antecedents and consequences of perceived justice in incident reporting processes. [Figure: the Incident Reporting Process (distribution rules, procedural rules, social aspects) shapes Justice Perceptions (distributive, procedural, interpersonal, and informational justice), which in turn generate Affective Reactions (decision/outcome satisfaction, job satisfaction, trust in supervisors, perceived legitimacy of decision makers and institutions, perceived organizational support, organizational commitment) and Behavioral Reactions (decision/outcome compliance, job performance, absenteeism/turnover, employee silence, counterproductive work behavior, helping behavior, teamwork, organizational citizenship behavior); dashed lines link Cultural Influences (professional culture, specialty culture) to these relationships.]
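As a compact restatement of Fig. 1, the minimal sketch below encodes the model's elements as simple data structures. The type and field names (JusticeDimension, IncidentReportingProcess, Reactions) are illustrative labels added here for reference only; they are not constructs defined in the organizational justice literature or components of any measurement instrument.

```python
# Illustrative restatement of Fig. 1 (not an implementation of any instrument):
# antecedent attributes of the reporting process, the four justice perceptions,
# and the reactions they are posited to generate.
from dataclasses import dataclass
from enum import Enum
from typing import List


class JusticeDimension(Enum):
    DISTRIBUTIVE = "fairness of outcomes (e.g., disciplinary decisions)"
    PROCEDURAL = "fairness of the procedures that produce those outcomes"
    INTERPERSONAL = "fairness of treatment by decision makers"
    INFORMATIONAL = "fairness of the information and explanations provided"


@dataclass
class IncidentReportingProcess:
    """Antecedents: attributes of the process that shape justice perceptions."""
    distribution_rules: List[str]   # e.g., proportionality, legality
    procedural_rules: List[str]     # e.g., consistency, bias suppression, correctability
    social_aspects: List[str]       # e.g., respect, propriety, truthfulness, justification


@dataclass
class Reactions:
    """Consequences: affective reactions are posited to mediate behavioral ones."""
    affective: List[str]    # e.g., trust in supervisors, organizational commitment
    behavioral: List[str]   # e.g., decision compliance, employee silence (underreporting)
```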
Antecedents of justice perceptions: Organizational justice perceptions vary as a function of both the formal attributes of decision-making processes and the informal (social) aspects of decision-making (see Fig. 1). For example, research shows that people perceive greater distributive justice when the distribution rules for allocating outcomes match implicit norms for allocation. Three common distribution rules are the contributions rule (e.g., those contributing more should receive higher outcomes), the needs rule (e.g., those with greater needs should receive higher outcomes), and the equality rule (e.g., people should receive the same outcomes regardless of needs or contributions) (Leventhal, 1980). In the incident reporting context, the contributions rule can be viewed as a proportionality rule. That is, people perceive greater distributive justice if: (a) the likelihood of disciplinary action depends on the level of perceived culpability, and (b) the severity of disciplinary action seems proportional to the seriousness of the unsafe act. If formal policies exist that describe the disciplinary consequences for particular unsafe acts, the legality rule may also apply. That is, people perceive greater distributive justice if disciplinary actions (i.e., outcomes) are consistent with existing laws, policies, and regulations.

Similarly, people perceive greater procedural justice when they experience procedural rules indicative of fair play (Leventhal, 1980; Thibaut & Walker, 1975). As suggested above, people may value "fair play" rules because they allow some control over the allocation of outcomes (an instrumental explanation) or because they signal the value of belonging to, or working on behalf of, the organization (a relational explanation). Procedural rules include the following (an illustrative measurement sketch follows the list):
• Process control (e.g., ability to voice one's own views during the procedure);
• Decision control (e.g., ability to influence the actual outcome itself);
• Consistency (e.g., the process is applied consistently across people and time);
• Bias suppression (e.g., decision makers are neutral);
• Accuracy of information (e.g., procedures are not based on inaccurate information);
• Correctability (e.g., appeal procedures exist);
• Representation (e.g., all subgroups affected by the decision are heard from);
• Ethicality (e.g., the process upholds personal standards of ethics and morality).
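As a minimal sketch of how these procedural rules might be operationalized in survey research on incident reporting, the hypothetical example below maps each rule to a single agreement item and averages the ratings into a summary score. The items, variable names, and scoring are illustrative assumptions only; they are not an instrument proposed here, and validated multi-item justice measures already exist (e.g., Colquitt, 2001).

```python
# Hypothetical operationalization (illustrative only): one survey item per
# procedural rule, rated on a 1-5 agreement scale, averaged into a single
# perceived procedural justice score for a respondent.
from statistics import mean

PROCEDURAL_RULE_ITEMS = {
    "process_control": "I could voice my views during the review of the incident.",
    "decision_control": "I could influence the outcome of the review.",
    "consistency": "The review process is applied the same way to everyone, every time.",
    "bias_suppression": "The people who review incidents are neutral.",
    "accuracy": "Decisions are based on accurate information about what happened.",
    "correctability": "There is a way to appeal the outcome of an incident review.",
    "representation": "The groups affected by the decision have a say in the process.",
    "ethicality": "The review process upholds my standards of ethics and morality.",
}

def procedural_justice_score(ratings):
    """Average 1-5 agreement ratings across the eight procedural rules."""
    missing = set(PROCEDURAL_RULE_ITEMS) - set(ratings)
    if missing:
        raise ValueError(f"Missing ratings for: {sorted(missing)}")
    return mean(ratings[rule] for rule in PROCEDURAL_RULE_ITEMS)

# Example respondent: generally positive, but rates consistency and
# correctability low (e.g., no appeal process exists).
example = {rule: 4 for rule in PROCEDURAL_RULE_ITEMS}
example.update(consistency=2, correctability=1)
print(round(procedural_justice_score(example), 2))  # 3.38
```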
Finally, social aspects of decision-making systems influence perceptions of interpersonal and informational justice (Bies, 2005; Bies & Moag, 1986). People perceive greater interpersonal justice when they have been treated with respect (e.g., decision makers are polite rather than rude) and propriety (e.g., decision makers keep promises and refrain from making prejudicial statements or pejorative remarks). Similarly, people perceive greater informational justice when they have received truthful information (e.g., decision makers are candid and accurate) and adequate justification (e.g., decision makers provide timely and specific information about the basis for decisions). The moral virtues model holds that people expect others to act in accordance with general principles of moral conduct. Such conduct includes treating people with dignity, giving fair warning of anticipated unfavorable outcomes, and explaining or accounting for unexpected outcomes or deviation from standard procedures (Bies, 2005).

Organizational justice theory and research suggest, therefore, that health professionals judge the fairness of incident reporting systems based not just on the outcomes that they receive (distribution rules), but also on the procedures used to decide the outcomes (procedural rules) and the way that they are treated by decision makers (the social aspects of decision-making processes). This multidimensional view of justice contrasts with the narrow focus on distributive justice evidenced in prominent writings about just culture (Marx, 2001; Reason, 1997).

Cultural influences: Meta-analyses suggest that the justice dimensions and rules identified above are highly robust with respect to differences in organizational context (e.g., structure and culture) (Cohen-Charash & Spector, 2001; Colquitt et al., 2001; Viswesvaran & Ones, 2002). Because so little justice research has been conducted in healthcare settings, however, it remains unclear whether the perceived salience of justice dimensions and rules varies as a function of cultural differences between or within professions. It is possible, for example, that nurses give greater weight than do physicians to interpersonal and procedural dimensions of justice given the heavy emphases placed in the nursing profession on the value of caring and on the importance of documentation. Likewise, it is possible that differences in perceived teamwork, authority gradients, and safety climates among internal medicine, critical care, and surgical specialties (Naveh et al., 2006; Sexton, Thomas, & Helmreich, 2000) could
translate into differences in what these health professionals consider fair or unfair with respect to incident reporting. Given the paucity of research bearing on the subject, we can only speculate as to whether different types of health professionals have different ideas about what constitutes a just incident reporting process, either in terms of the attributes that they identify or the importance they ascribe to those attributes. Such a high level of uncertainty underscores the necessity and value of exploratory descriptive research.

Justice-related outcomes: Organizational justice research indicates that people's justice perceptions are linked to several affective and behavioral reactions that seem globally relevant to patient safety. Affective reactions include decision/outcome satisfaction, job satisfaction, trust in supervisors, perceived legitimacy of decision makers and institutions, perceived organizational support, and organizational commitment (Cohen-Charash & Spector, 2001; Colquitt et al., 2001; Viswesvaran & Ones, 2002). Behavioral reactions include decision compliance, job performance, absenteeism and turnover, employee silence, counterproductive work behavior, helping behavior, teamwork, and organizational citizenship behavior (Cohen-Charash & Spector, 2001; Colquitt et al., 2001; Viswesvaran & Ones, 2002). Like some justice scholars (Moorman & Byrne, 2005), we see affective reactions as likely mediators of the relationship between justice perceptions and behavioral reactions because they provide plausible explanations for how and why justice perceptions influence people's actions.

Here, we call attention to three behavioral reactions that seem particularly relevant to incident reporting and patient safety: decision compliance, employee silence, and organizational citizenship behavior. Briefly, health professionals are more likely to accept—and therefore comply with—the decisions or outcomes of incident reporting processes, even if unfavorable, if they perceive that: (a) the proportionality rule has been followed (e.g., the "punishment fits the crime"); (b) the rules of fair play have been followed (e.g., they haven't been singled out); and (c) they have been treated with dignity and respect. Alternatively, health professionals are more likely to reject the decisions or outcomes of incident reporting processes if they perceive that the outcomes, procedures, or social aspects of the incident reporting process were unfair. Decision or outcome rejection, in turn, may provoke active forms of resistance (e.g., complaining, appealing, or
litigating) or passive forms of resistance (e.g., making derogatory remarks, "working to rule," or behaving uncooperatively).

Similarly, health professionals' perceptions of justice in incident reporting may enhance or diminish their trust in supervisors, their sense of obligation to follow incident reporting rules, or their identification with the organization. These affective reactions, in turn, might positively or negatively influence their future reporting behavior (underreporting can be seen as a form of employee silence). Finally, health professionals' affective reactions to perceived justice in incident reporting might also affect their willingness to serve on patient safety committees, participate in quality improvement efforts, or otherwise engage in desired, but often unrewarded, forms of "good citizenship" behavior. Although it is possible that some of these affective reactions exhibit a stronger relationship than others to particular behavioral reactions, Moorman and Byrne (2005) note that investigating the relative contributions of these potential mediators is difficult because these affective reactions are often highly correlated. For healthcare leaders interested in establishing a just culture in their institutions, specifying the exact causal pathways through which just incident reporting processes lead to desirable health professional behavior may be of secondary concern.

Discussion

In his testimony before the US Congress, Dr. Lucian Leape, one of the foremost experts on medical error, stated, "The single greatest impediment to error prevention is that we punish people for making mistakes" (Leape, 1999). Since that time, experts have urged healthcare organizations to drop the punitive ("name and blame") approach to incident reporting and instead establish a just culture, wherein health professionals feel assured that they will receive fair treatment when they report incidents or provide other safety-related information.

In this article, we drew upon organizational justice theory and research to clarify the multiple meanings of justice and describe plausible antecedents and consequences of perceived justice in incident reporting systems. We believe that organizational justice theory and research can be useful for understanding health professionals' attitudes and behavior in incident reporting and for developing fair incident reporting systems. However, because
few studies have examined organizational justice in healthcare settings (Agho, 1993; Carson, Carson, Yallapragada, & Roe, 2001; Davis et al., 1999; Kluska, Laschinger, & Kerr, 2004; Konovsky & Pugh, 1994; Laschinger & Finegan, 2005; Laschinger, 2004; Sutinen, Kivimaki, Elovainio, & Virtanen, 2002), the possibility exists that the justice dimensions and rules identified in other studies do not adequately capture the distinctiveness of the healthcare setting generally or the incident reporting context specifically. Incident reporting in healthcare involves professionals (as opposed to non-professional employees), providing information that could result in sanctions to themselves or to others (as opposed to rewards or resources), in a litigious atmosphere. While none of these features is unique to healthcare, their combination represents a distinctive configuration not found in other sectors or industries. We suspect that the justice dimensions and rules identified in other studies reflect general, albeit culture-bound, principles of moral conduct, and that health professionals judge the fairness of incident reporting processes using these dimensions and rules. Rather than assume this is the case, however, we encourage future research to assess whether the justice dimensions and rules described above adequately capture the attributes of a fair incident reporting process as perceived by health professionals, or whether new justice dimensions and rules are needed to reflect the distinctive context of healthcare delivery.

We also encourage future research to examine whether health professionals assign greater value to some justice dimensions (e.g., distributive versus procedural justice) or justice rules (e.g., consistency versus correctability) than to others. Likewise, we encourage future research to examine whether the value that health professionals assign to different justice dimensions or rules depends on provider type (e.g., nurses versus physicians), specialty training (e.g., medicine versus surgery), or organizational context (e.g., acute care versus long-term care). Meta-analyses of justice studies suggest that the justice dimensions and rules described above are robust with respect to individual differences in gender, age, race/ethnicity, job tenure, organizational tenure, and employment setting (Cohen-Charash & Spector, 2001; Colquitt et al., 2001; Viswesvaran & Ones, 2002). At the present time, we simply do not know whether health professionals hold different views on what constitutes a just incident reporting process depending on the professional training they have
received, the role that they play in healthcare delivery, or the context in which they work. Moreover, we do not know whether such differences, if they exist, actually make a difference in terms of how health professionals respond in reporting situations. Answers to such questions are important, however, to those who must decide whether to implement standardized or tailored incident reporting procedures in healthcare settings.
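As a minimal sketch of what an exploratory comparison of this kind might look like, the example below simulates importance ratings for a handful of justice rules and compares group means across two provider types. The data, group labels, and rule names are hypothetical placeholders; an actual study would use validated items, sampled respondents, and appropriate inferential tests rather than this descriptive illustration.

```python
# Illustrative only: simulated 1-5 importance ratings for selected justice rules,
# compared across two hypothetical provider groups. Real data and formal tests
# (e.g., models accounting for clustering by unit) would be needed in practice.
import random
from statistics import mean

random.seed(0)

RULES = ["proportionality", "consistency", "bias_suppression", "correctability", "respect"]
GROUPS = ["nurse", "physician"]

def simulate_respondent(group):
    """Return one respondent's simulated importance ratings for each rule."""
    return {"group": group, **{rule: random.randint(1, 5) for rule in RULES}}

sample = [simulate_respondent(g) for g in GROUPS for _ in range(50)]

for rule in RULES:
    group_means = {g: mean(r[rule] for r in sample if r["group"] == g) for g in GROUPS}
    diff = group_means["nurse"] - group_means["physician"]
    print(f"{rule:>16}: nurse={group_means['nurse']:.2f} "
          f"physician={group_means['physician']:.2f} diff={diff:+.2f}")
```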
Conclusion

Given the pervasive underreporting of safety incidents that occurs in healthcare delivery (Joshi, Anderson, & Marwaha, 2002), some despair that healthcare organizations can ever create cultures that support robust incident reporting. Some hope that healthcare organizations can lessen their reliance on voluntary reporting by adopting sophisticated clinical information systems (Agency for Healthcare Research and Quality, 2001; Institute of Medicine, 2003). Yet, lessons from other industries suggest that electronic systems for detecting, recording, and signaling safety incidents can only complement voluntary reporting processes; they cannot replace them. Others hope to reduce dependence on voluntary reporting by instituting mandatory reporting requirements. In the US, 23 states have instituted mandatory event reporting systems for hospitals; yet few states can point to actual successes using these systems to improve safety in hospitals (Gaba, 2000; Wood & Nash, 2005). Moreover, states continue to struggle with "compliance" (i.e., underreporting) (Wood & Nash, 2005). Experience with mandated reporting systems in other industries indicates that all reporting, to some extent, is voluntary (Barach & Small, 2000). If health professionals do not feel that they can expect fair treatment when they report safety incidents, mandatory reporting will only increase the level of fear and drive valuable safety-related information underground. Indeed, some suggest that a key factor influencing the level of mandatory external reporting is providers' perception of their own hospital's culture of safety (Wood & Nash, 2005).

If, as safety experts believe, voluntary reporting is the lifeblood of an "informed culture" (Reason, 1990), then greater knowledge of what constitutes a just incident reporting process in the eyes of health professionals should prove useful to those who wish to create a culture of safety in healthcare
organizations and, in doing so, provide safer, higher quality care.
References

Agency for Healthcare Research and Quality. (2001). Making health care safer: A critical analysis of patient safety practices. Prepared by University of California at San Francisco, Stanford University Evidence-Based Practice Center, Contract no. 290-97-0013.
Agency for Healthcare Research and Quality. (2005). 30 Safe practices for better health care. Fact Sheet (http://www.ahrq.gov). Agency for Healthcare Research and Quality.
Agho, A. O. (1993). The moderating effects of dispositional affectivity on relationships between job characteristics and nurses' job satisfaction. Research in Nursing & Health, 16(6), 451–458.
Ashford, S. J., Rothbard, N. P., Piderit, S. K., & Dutton, J. E. (1998). Out on a limb: The role of context and impression management in selling gender-equity issues. Administrative Science Quarterly, 43(1), 23–57.
Barach, P., & Small, S. D. (2000). Reporting and preventing medical mishaps: Lessons from non-medical near miss reporting systems. British Medical Journal, 320(7237), 759–763.
Beyea, S. C. (2004). Creating a just safety culture. AORN Journal, 79(2), 412–414.
Bies, R. J. (2005). Are procedural justice and interactional justice conceptually distinct? In J. Greenberg, & J. A. Colquitt (Eds.), Handbook of organizational justice (pp. 85–112). Mahwah, NJ: Lawrence Erlbaum Associates.
Bies, R. J., & Moag, J. S. (1986). Interactional justice: Communication criteria for fairness. In R. J. Lewicki, B. H. Sheppard, & M. H. Bazerman (Eds.), Research on negotiation in organizations (pp. 43–55). Greenwich, CT: JAI Press.
Blader, S. L., & Tyler, T. R. (2005). How can theories of organizational justice explain the effects of fairness? In J. Greenberg, & J. Colquitt (Eds.), Handbook of organizational justice (pp. 329–354). Mahwah, NJ: Lawrence Erlbaum Associates.
Blegen, M. A., Vaughn, T., Pepper, G., Vojir, C., Stratton, K., Boyd, M., et al. (2004). Patient and staff safety: Voluntary reporting. American Journal of Medical Quality, 19(2), 67–74.
Bowditch, J. L., & Buono, A. F. (2001). A primer on organizational behavior. New York: Wiley.
Carson, K. D., Carson, P. P., Yallapragada, R., & Roe, C. W. (2001). Teamwork or interdepartmental cooperation: Which is more important in the health care setting? Health Care Management (Frederick), 19(4), 39–46.
Clarke, S. (2006). The relationship between safety climate and safety performance: A meta-analytic review. Journal of Occupational Health Psychology, 11(4), 315–317.
Cohen-Charash, Y., & Spector, P. E. (2001). The role of justice in organizations: A meta-analysis. Organizational Behavior and Human Decision Processes, 86(2), 278–321.
Colquitt, J. A. (2001). On the dimensionality of organizational justice: A construct validation of a measure. Journal of Applied Psychology, 86(3), 386–400.
Colquitt, J. A., Conlon, D. E., Wesson, M. J., Porter, C., & Ng, K. Y. (2001). Justice at the millennium: A meta-analytic review of 25 years of organizational justice research. Journal of Applied Psychology, 86(3), 425–445.
Colquitt, J. A., Greenberg, J., & Zapata-Phelan, C. P. (2005). What is organizational justice? A historical overview. In J. Greenberg, & J. A. Colquitt (Eds.), Handbook of organizational justice. Mahwah, NJ: Lawrence Erlbaum Associates.
Cropanzano, R., Byrne, Z. S., Bobocel, D. R., & Rupp, D. E. (2001a). Emerging justice concerns in an era of changing psychological contracts. In R. Cropanzano (Ed.), Justice in the workplace: From theory to practice (pp. 245–265). Mahwah, NJ: Lawrence Erlbaum Associates.
Cropanzano, R., Byrne, Z. S., Bobocel, D. R., & Rupp, D. E. (2001b). Moral virtues, fairness heuristics, social entities, and other denizens of organizational justice. Journal of Vocational Behavior, 58(2), 164–209.
Cullen, D. J., Bates, D. W., Small, S. D., Cooper, J. B., Nemeskal, A. R., & Leape, L. L. (1995). The incident reporting system does not detect adverse drug events: A problem for quality improvement. Joint Commission Journal on Quality Improvement, 21(10), 541–548.
Davis, D., O'Brien, M. A., Freemantle, N., Wolf, F. M., Mazmanian, P., & Taylor-Vaisey, A. (1999). Impact of formal continuing medical education: Do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? Journal of the American Medical Association, 282(9), 867–874.
Dutton, J. E., Ashford, S. J., Lawrence, K. A., & Miner-Rubino, K. (2002). Red light, green light: Making sense of the organizational context for issue selling. Organization Science, 13(4), 355–369.
Dutton, J. E., Ashford, S. J., Oneill, R. M., Hayes, E., & Wierba, E. E. (1997). Reading the wind: How middle managers assess the context for selling issues to top managers. Strategic Management Journal, 18(5), 407–423.
Edmondson, A. C. (2002). The local and variegated nature of learning in organizations: A group-level perspective. Organization Science, 13(2), 128–146.
Elder, N. C., Graham, D., Brandt, E., & Hickner, J. (2007). Barriers and motivators for making error reports from family medicine offices: A report from the American Academy of Family Physicians National Research Network (AAFP NRN). Journal of the American Board of Family Medicine, 20(2), 115–123.
Emery Center on Health Outcomes in Quality. (2004). Interventions to improve safety culture. Georgia Hospital Association.
Gaba, D. M. (2000). Structural and organizational issues in patient safety: A comparison of health care to other high-hazard industries. California Management Review, 43(1), 83–102.
Gillespie, J. Z., & Greenberg, J. (2005). Are the goals of organizational justice self-interested? In J. Greenberg, & J. Colquitt (Eds.), Handbook of organizational justice (pp. 179–214). Mahwah, NJ: Lawrence Erlbaum Associates.
Greenberg, J. (1987). A taxonomy of organizational justice theories. Academy of Management Review, 12(1), 9–22.
Institute of Medicine. (2000). To err is human. Washington, DC: National Academy Press.
Institute of Medicine. (2003). Patient safety: Achieving a new standard for care. Washington, DC: National Academy of Sciences.
Jeffe, D. B., Dunagan, W. C., Garbutt, J., Burroughs, T. E., Gallagher, T. H., Hill, P. R., et al. (2004). Using focus groups to understand physicians' and nurses' perspectives on error reporting in hospitals. Joint Commission Journal on Quality and Patient Safety, 30(9), 471–479.
Joshi, M. S., Anderson, J. F., & Marwaha, S. (2002). A systems approach to improving error reporting. Journal of Healthcare Information Management, 16, 40–45.
Kadzielski, M. A., & Martin, C. (2002). Assessing medical error in health care. Developing a "culture of safety". Health Progress, 83(6), 31–35.
Karadeniz, G., & Cakmakci, A. (2002). Nurses' perceptions of medication errors. International Journal of Clinical Pharmacology Research, 22(3–4), 111–116.
Kingston, M. J., Evans, S. M., Smith, B. J., & Berry, J. G. (2004). Attitudes of doctors and nurses towards incident reporting: A qualitative analysis. Medical Journal of Australia, 181(1), 36–39.
Kizer, K. W. (1999). Large system change and a culture of safety. Enhancing patient safety and reducing errors in health care. Chicago: National Patient Safety Foundation.
Kluska, K. M., Laschinger, H. K., & Kerr, M. S. (2004). Staff nurse empowerment and effort-reward imbalance. Canadian Journal of Nursing Leadership, 17(1), 112–128.
Konovsky, M. A., & Pugh, S. D. (1994). Citizenship behavior and social exchange. Academy of Management Journal, 37(3), 656–669.
Kopp, B. J., Erstad, B. L., Allen, M. E., Theodorou, A. A., & Priestley, G. (2006). Medication errors and adverse drug events in an intensive care unit: Direct observation approach for detection. Critical Care Medicine, 34(2), 415–425.
Laschinger, H. K., & Finegan, J. (2005). Using empowerment to build trust and respect in the workplace: A strategy for addressing the nursing shortage. Nursing Economics, 23(1), 6–13.
Laschinger, H. K. S. (2004). Hospital nurses' perceptions of respect and organizational justice. Journal of Nursing Administration, 34(7–8), 354–364.
Leape, L. L. (1999). Testimony. Washington, DC: United States Congress House Committee on Veterans' Affairs.
Leventhal, G. S. (1980). What should be done with equity theory? In K. J. Gergen, M. S. Greenberg, & R. H. Willis (Eds.), Social exchange: Advances in theory and research. New York: Plenum Press.
Manasse, H. R., Jr., Eturnbull, J., & Diamond, L. H. (2002). Patient safety: Review of the contemporary American experience. Singapore Medical Journal, 43(5), 254–262.
Marx, D. (2001). Patient safety and the 'Just Culture': A primer for health care executives. New York: Trustees of Columbia University.
Milliken, F. J., Morrison, E. W., & Hewlin, P. F. (2003). An exploratory study of employee silence: Issues that employees don't communicate upward and why. Journal of Management Studies, 40(6), 1453–1476.
Moorman, R. H., & Byrne, Z. S. (2005). How does organizational justice affect organizational citizenship behavior? In J. Greenberg, & J. Colquitt (Eds.), Handbook of organizational justice (pp. 355–382). Mahwah, NJ: Lawrence Erlbaum Associates.
Naveh, E., Katz-Navon, T., & Stern, Z. (2006). Readiness to report medical treatment errors: The effects of safety procedures, safety information, and priority of safety. Medical Care, 44(2), 117–123.
Neal, A., Griffin, M. A., & Hart, P. M. (2000). The impact of organizational climate on safety climate and individual behavior. Safety Science, 34(1–3), 99–109.
No Author. (2002). Hospitals honored for promoting a culture of safety. BWH Press Room.
O'Leary, D. (2003). Patient safety: Instilling hospitals with a culture of continuous improvement. Washington, DC: Committee on Governmental Affairs.
Olsen, S., Neale, G., Schwab, K., Psaila, B., Patel, T., Chapman, E. J., et al. (2007). Hospital staff should use more than one method to detect adverse events and potential adverse events: Incident reporting, pharmacist surveillance and local real-time record review may all have a place. Quality and Safety in Health Care, 16(1), 40–44.
O'Neil, A. C., Petersen, L. A., Cook, E. F., Bates, D. W., Lee, T. H., & Brennan, T. A. (1993). Physician reporting compared with medical-record review to identify adverse medical events. Annals of Internal Medicine, 119(5), 370–376.
Pizzi, L. T., Goldfarb, N. I., & Nash, D. B. (2001). Promoting a culture of safety. In K. G. Shojania, B. W. Duncan, K. M. McDonald, & R. M. Wachter (Eds.), Making health care safer: A critical analysis of patient safety practices (Evidence report/technology assessment no. 43) (pp. 447–457). Rockville, MD: Agency for Healthcare Research and Quality.
Reason, J. (1990). Human error. Cambridge: Cambridge University Press.
Reason, J. (1997). Managing the risks of organizational accidents. Brookfield, VT: Ashgate Publishing Company.
Reason, J. (2000). Human error: Models and management. British Medical Journal, 320(7237), 768–770.
Sexton, J. B., Thomas, E. J., & Helmreich, R. L. (2000). Error, stress, and teamwork in medicine and aviation: Cross sectional surveys. British Medical Journal, 320(7237), 745–749.
Singer, S. J., Gaba, D. M., Geppert, J. J., Sinaiko, A. D., Howard, S. K., & Park, K. C. (2003). The culture of safety: Results of an organization-wide survey in 15 California hospitals. Quality and Safety in Health Care, 12(2), 112–118.
Sutinen, R., Kivimaki, M., Elovainio, M., & Virtanen, M. (2002). Organizational fairness and psychological distress in hospital physicians. Scandinavian Journal of Public Health, 30(3), 209–215.
Thibaut, J., & Walker, L. (1975). Procedural justice: A psychological analysis. Hillsdale, NJ: Erlbaum.
Tyler, T. R., & Blader, S. L. (2003). The group engagement model: Procedural justice, social identity, and cooperative behavior. Personality and Social Psychology Review, 7(4), 349–361.
University of Michigan Medical School. (2005). Improving patient safety in hospitals: Turning ideas into action.
Viswesvaran, C., & Ones, D. S. (2002). Examining the construct of organizational justice: A meta-analytic evaluation of relations with work attitudes and behaviors. Journal of Business Ethics, 38(3), 193–203.
Wakefield, B. J., Blegen, M. A., Uden-Holman, T., Vaughn, T., Chrischilles, E., & Wakefield, D. (2001). Organizational culture, continuous quality improvement, and medication administration error reporting. American Journal of Medical Quality, 16(4), 128–134.
Wakefield, D. S., Wakefield, B. J., Uden-Holman, T., Borders, T., Blegen, M., & Vaughn, T. (1999). Understanding why medication administration errors may not be reported. American Journal of Medical Quality, 14(2), 81–88.
Wald, H., & Shojania, K. G. (2001). Incident reporting. In K. G. Shojania, B. W. Duncan, K. M. McDonald, & R. M. Wachter (Eds.), Making health care safer: A critical analysis of patient safety practices (Evidence report/technology assessment no. 43, prepared by the University of California at San Francisco, Stanford Evidence-based Practice Center under contract no. 290-97-0013; AHRQ publication no. 01-E058). Rockville, MD: Agency for Healthcare Research and Quality.
Welsh, C. H., Pedot, R., & Anderson, R. J. (1996). Use of morning report to enhance adverse event detection. Journal of General Internal Medicine, 11(8), 454–460.
Wild, D., & Bradley, E. H. (2005). The gap between nurses and residents in a community hospital's error-reporting system. Joint Commission Journal on Quality and Patient Safety, 31(1), 13–20.
Wood, K. E., & Nash, D. B. (2005). Mandatory state-based error-reporting systems: Current and future prospects. American Journal of Medical Quality, 20(6), 297–303.