Decision Support Systems 44 (2007) 235 – 250 www.elsevier.com/locate/dss
Efficiency of critical incident management systems: Instrument development and validation

Jin Ki Kim a,⁎, Raj Sharman b, H. Raghav Rao b,c, Shambhu Upadhyaya c

a Department of Business Administration, Korea Aerospace University, Korea
b Management Science and Systems, School of Management, University at Buffalo, The State University of New York, United States
c Computer Science and Engineering, School of Engineering and Applied Sciences, University at Buffalo, The State University of New York, United States
Received 11 July 2006; received in revised form 14 March 2007; accepted 8 April 2007 Available online 19 April 2007
Abstract

There is much literature in the area of emergency response management systems. Even so, there is in general a lacuna of literature that deals with the issue of measuring the effectiveness of such systems. The aim of this study is to develop and validate an instrument to measure the critical factors that contribute to the efficiency of decision support in critical incident management systems (CIMS). The instrument presented in this paper has been developed using a CIMS efficiency model that is based on an adaptation of media richness theory, aspects of the national incident management system (NIMS) and interviews with experts on emergency management. The instrument has been validated through a pretest, followed by a pilot test and, finally, a main field test which includes a survey of 76 experts. The final instrument consists of 28 statistically relevant question items, which form eight constructs. The instrument allows communities to assess both strengths and weaknesses of existing systems. This allows communities to better prepare for disasters as it informs both policies and practice on areas of weakness that need addressing.
© 2007 Elsevier B.V. All rights reserved.

Keywords: Critical incident management system (CIMS); Decision support; Emergency response systems; Instrument; Measurement; Media richness theory; National incident management systems (NIMS); Validation
1. Introduction

There is much literature in the area of emergency response management systems and critical incident management systems (CIMS). Even so, there is in general a lacuna of literature that deals with the issue of measuring the effectiveness of such systems. Clearly, a poor assessment of CIMS efficiency limits the possibility of mitigating future problems [17]. The issue of assessment of critical incident management systems (CIMS) is an understudied area. Through the rest of this paper we use the term emergency response management systems synonymously with critical incident management systems. This paper provides an insight into the critical factors that affect decision support in the context of emergency response management. Further, this paper develops and validates an instrument to measure the factors that contribute to the efficiency of decision support by critical incident management systems (CIMS). This instrument can be used to capture the strengths and
weaknesses of existing systems. This is useful from a preparedness perspective as it allows communities to address weaknesses prior to an event. Therefore this research will contribute both to policy and practice.

The contributions of this paper are twofold. First, we develop a conceptual CIMS efficiency model which draws input from media richness theory as well as from the practice of emergency management. The model adapts concepts from media richness theory [12,66], references several components of the national incident management system (NIMS) [67] and draws on interviews with experts [3,5,29,30,33,41–43,50–52] who are first-responders. Second, this paper presents the instrument to measure the efficiency of the CIMS for decision support. This instrument has been refined through a pretest, a pilot test, and several discussions with first-responders, and it has also been validated by a field test.

This paper is structured into the following sections. Section 2 provides background on CIMS and also includes literature on disaster response management. The research foundation, which includes a discussion of media richness theory, NIMS, and interviews with experts, forms the content of Section 3. Section 4 highlights the efficiency model of CIMS; it is on the basis of this model that we have developed the instrument to measure the efficiency of CIMS. The research methodology is explained in Section 5. The results are detailed in Section 6. Discussion and conclusions are presented in Section 7.

2. Background

Most emergency management systems have their own objectives, features, characteristics, and structures. In 1971, the Emergency Management Information System and Reference Index (EMISARI), initially built to support the federal wage-price freeze, was used to deal effectively with transportation strikes, coal strikes, shortages of petroleum, chlorine, and natural gas, and more severe natural disasters. The system design focused on the group communication process and on how humans gather, contribute, and utilize data in a time-urgent manner [64]. DisasterLAN (http://www.disasterlan.com/) is a system for managing all aspects of an Emergency Operations Center (EOC). It is centered upon a core set of tools that can be used to manage incidents of any size. Core functions include a call center, an incident status board, chat with integrated message broadcasting, asset management, contact management, financial tracking, a dynamic reference and resource database, streaming video display, configuration and security management, and numerous reporting and task management tools. E Team (http://www.eteam.com/index.html) is an emergency management system that provides those involved in
responding to an actual emergency with the ability to collaborate and manage their efforts, across multiple organizations, from a single common view and coordination point. The system also provides a secure internet portal and is easy to implement, use, and scale. Emergency Medical Services (EMS) systems are generally composed of four elements: mayday call, routing and dispatch, response, and treatment [32]. Emergency Response Management Information Systems (ERMIS) integrate emergency response processes with continuous decision process auditing [65].

Though there are many systems currently in use, none of these systems comprehensively addresses all three issues: people, process and technology. Also, there is no assessment of the performance of these systems or of the metrics that should be used to measure the performance of such systems. To our knowledge, there is also no academic study that deals with the assessment of the critical factors that influence the performance or the efficiency of decision support for critical incident management systems. This paper attempts to fill that gap by developing a conceptual model of critical factors that affect the efficiency of decision support for a critical incident management system. The paper also develops an instrument based on the model that allows for the assessment of existing systems. In the remainder of this section we present the general structure of CIMS. Several related works on disaster research are also discussed in the context of emergency management systems.

2.1. CIMS

A critical incident management system (CIMS) is a system that utilizes people, processes, and technologies for managing critical incidents. CIMS, also known as emergency management systems, deal with natural and human-made disasters. Fig. 1 outlines general information about CIMS, including the flow of action underlying the system. Typically, when a witness to an accident or disaster calls a 911 call center, the center notifies the main (dispatch) office, which is usually a fire department. The fire department forwards the information to the unified command center (UCC) when the incident requires the cooperation of both the police and medical systems. The UCC then estimates the number of first-responders and the resource capacity of each department (i.e. fire, police, and emergency medicine) that would be needed to mitigate the incident. The UCC gives administrative directions to each department, which then performs its tasks independently. The UCC will also interact with the Federal Emergency Management Agency (FEMA) and the National Guard/military if the magnitude of the event is beyond its scope and limitations.

The nature of the incident determines the primary department of the UCC. For example, in the case of biological incidents, the emergency medicine system takes charge. Likewise, given the occurrence of crimes such as theft, sabotage or terrorist attacks, the police department plays the leading role. The FBI becomes the lead agency in the case of most terrorist-related events.
Fig. 1. Information and action/direction flow of the CIMS.
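To make the flow in Fig. 1 concrete, the following minimal Python sketch routes an incident report to a lead agency and escalates to federal support when the event exceeds local scope. It is purely illustrative and not part of the paper's model: the incident categories, severity scale, and escalation threshold are our own assumptions.

    # Illustrative sketch (assumptions, not from the paper): routing an incident
    # through the flow of Fig. 1 -- 911 call -> dispatch -> unified command center (UCC).
    from dataclasses import dataclass

    LEAD_AGENCY = {            # the nature of the incident determines the lead department
        "fire": "fire",
        "biological": "emergency medicine",
        "crime": "police",
        "terrorism": "FBI",
    }

    @dataclass
    class Incident:
        kind: str              # e.g. "fire", "biological", "crime", "terrorism"
        severity: int          # hypothetical 1-10 scale used only for this sketch
        multi_agency: bool     # does mitigation need police, fire and EMS together?

    def route(incident: Incident) -> list[str]:
        """Return the chain of organizations notified for this incident."""
        chain = ["911 call center", "dispatch (fire department)"]
        if incident.multi_agency:
            chain.append("unified command center (UCC)")
        chain.append("lead agency: " + LEAD_AGENCY.get(incident.kind, "fire"))
        if incident.severity >= 8:  # hypothetical "beyond local scope" threshold
            chain += ["FEMA", "National Guard / military"]
        return chain

    print(route(Incident(kind="biological", severity=9, multi_agency=True)))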
2.2. Literature review

The management of a disaster response includes, but is not limited to, activities such as incident management, law enforcement, public communication, debris removal management, managing resources, managing infrastructure interdependencies, volunteer management systems, shelter provisioning, evacuation, surges in public health systems, cross-border issues, etc. The literature on disaster studies covers several research streams, such as process and scenario analysis, modeling and simulation, and qualitative studies [47,49].

Establishing how to prepare for disasters or incidents is a primary issue. Several researchers have studied how to improve the process of incident management and how to evaluate different scenarios in order to prevent future
disasters. Killian [36] identifies functional time phases such as warning, impact, emergency, and recovery. Pelfrey [48] proposes a cycle of preparedness which consists of prevention, awareness, response, consequence management and recovery. Sunstein [59] points out that regulators and government officials should identify worst-case scenarios; he also elaborates on an ideal approach which would eliminate the worst aspects of an impending catastrophe. Kim et al. [37,38] propose a conceptual model to explain the efficiency of CIMS.

A number of modeling and simulation tools have been developed for incident management applications. A set of tools that evaluate the coordinated efforts needed to deal with multiple aspects of a disaster can substantially improve incident management capabilities. French and Niculae [27] explored the issue of how to model emergency management process predictions. Jain and McLean [34] proposed a framework to facilitate the application of modeling and simulation to incident management. The framework addresses incident management on three axes: incident, domain and lifecycle phase. It is designed to allow the use of modeling and simulation across the incident management lifecycle, which includes prevention, preparedness, response, recovery and mitigation. Winterfeldt [72] focuses on cost-benefit and multiattribute methods to evaluate alternative proposals for protecting New Orleans against hurricanes [73].
Chengalur-Smith et al. [8] point out that the components and linkages for systems can be defined through the identification of parallels with disaster-management scenarios.

There have been some studies on risk communication and risk perception. Lombardi [40] identifies a gap between statistical evidence and the perceived interpretation of risk by using the psychometric matrix of Slovic [55]. Daniels et al. [18] emphasize the importance of advance assessment of the risk associated with a disaster in the context of gauging how the risk is perceived and how people will react to the risk. From a social capital perspective, Dynes [23] recognizes disaster as a threat to existing capital.

Many studies deal with prediction and prevention in terms of emergency management. However, there is little research from the perspective of first-responders or local incident management systems. This study aims to address this gap.

3. Research foundation

The conceptual model developed in this paper is based on the incorporation and adaptation of concepts drawn from media richness theory, analysis of the existing NIMS system, and discussions with experts in the field of local incident management systems.

3.1. CIMS efficiency model

Media richness theory has been used in the information systems field with much success [39]. Media richness theory, also known as information richness theory, proposes that task performance will be improved when task needs are matched to a medium that can effectively convey information [11,12,14,53]. Media richness theory argues that certain media are more efficient in transmitting information, depending on whether the information is used in situations of uncertainty or equivocality [11,12,15]. It also suggests that effective managers make rational choices regarding a particular communication medium for a specific task or objective, and that they make their choice according to the degree of richness required by that task [12,13,21,24,35,63].

Media richness theory supports several concepts which can be used to explain the efficiency of the CIMS. For example, the information processing requirements to mitigate uncertainty and equivocality depend on the type of technology, the degree of interdepartmental integration, and the nature of the environment that is being scrutinized [12]. These influential factors are adapted in this paper to best explain the efficiency of the CIMS.
3.1.1. Technology

Technology, in this context, includes the knowledge, tools, and techniques used to transform inputs into organizational outputs [11]. Communication is one of the critical technologies for emergency response systems. For example, rapid and clear communication increases the quality of decisions [35]. Communication media differ in their ability to facilitate understanding [15], and different media can overcome various situational constraints such as time, location, permanence, distribution and distance [16,53]. Communication effectiveness and richness of perception are strongly bounded by a channel user's communication experiences [6]. Choosing one single medium for any task may prove less effective than choosing a set of media which the group uses at different times in performing the task, depending on the current communication process [21]. The conveyance of information depends on the dissemination of diverse information from many sources, information which was not previously known to participants. The goal in such a conveyance is to disseminate and obtain as much relevant information as possible to aid in understanding the situation at hand [21].

3.1.2. Processes

People approach a problem with different experiences, cognitive elements, goals, values, and priorities [12]. Similarly, interdependence in this context means the extent to which departments depend upon each other to accomplish their tasks [61]. Each department develops its own functional specialization, time horizon, goals, frame of reference and jargon. Bridging wide differences across departments is a problem of equivocality reduction [12]. The amount of task interdependence that exists between different subunits is associated with the need for effective joint problem solving [66].

3.1.3. External factors—regulation

The ways in which internal organizational communication creates shared interpretations of information, and the concomitant use of communication media, are influenced by characteristics of the environment and the methods used to acquire and process information [53]. The environment is a major factor in organizational structure and internal processes. As an open system, an organization cannot seal itself off from the environment. The organization must have mechanisms to learn about and interpret external events [11]. Given several alternative means to achieve greater information processing capacity between subunits, complete specification of the most appropriate set of coordinating and control
mechanisms, such as formal systems, lateral relations, or both, will be contingent on the nature of the data and other organizational conditions [66]. Differences in environments generate dissimilar information processing requirements as manifested in specific organizational tasks [53].

3.1.4. Efficiency of decision support

In media richness theory, effectiveness [12,66] and performance [20] are considered to be the dependent variables. Effectiveness is a function of matching information processing capacities with information processing requirements. Performance is measured through communication satisfaction and perceptions of richness, equivocality and task complexity. A performance can be judged successful if it results in the following three outcomes: the ability to make better decisions, establish shared systems of meaning, and make better use of time [63]. Decision quality is measured using expert judgments [35]. For the purposes of this study, we focus only on the system and are thus more concerned with efficiency rather than effectiveness.

3.2. National Incident Management System (NIMS)

On February 28, 2003, the President of the United States issued Homeland Security Presidential Directive (HSPD) 5, Management of Domestic Incidents, which directs the Secretary of Homeland Security to develop and administer a national incident management system (NIMS). The NIMS provides a consistent nationwide approach for responding to all kinds of incidents regardless of scope and size. The NIMS is an emergency management organization system that integrates several component parts. These components include preparedness, resource management, communications and information management, and supporting technologies. For each component, concepts, principles and guidelines are presented.

Based on the composition of NIMS, we can narrow down the major factors that influence incident management systems. Efficient command systems and effective management provide a platform for managing emergencies properly. For the various incidents that may arise, well-organized pre-planning helps first-responders to react to a disaster rapidly and correctly. Communications and information are the basic infrastructure for first-responders' activities. The efficient management of communication and information directly impacts the performance of incident management systems. In addition, supporting technologies such as GIS-enabled devices and sensors are required to improve the efficiency of the response.
In this paper we adopt the components of NIMS as the critical factors that influence the efficiency of the CIMS.

3.3. Experts' opinions

The role of first-responders is critical, especially because their knowledge and experience is a consequence of lessons learned from real situations. In order to solicit expert opinion we met with senior first-responders who had over a decade of experience as incident commanders. This was done with a view to integrating their perspective into our research [22,62].

All the experts we interviewed felt that one of the most important issues plaguing emergency management practice is the availability and use of communication channels [29,42,43,50,51]. Communication channels between agencies help first-responders to communicate their needs and priorities as the emergency situation unfolds on-site. Problems in communication make conditions for first-responders more hazardous and critically hamper performance. Experts also felt that the performance of first-responders often depends on interdepartmental and interagency information sharing. Mutual cooperation among first-responders is essential. For efficient cooperation, information should be transmitted rapidly and accurately along all communication channels. First-responder feedback that is given after completing a task may be an essential input to the next task that may need to be performed by another group of first-responders belonging to a different agency [3,5,43]. Inadequate or incorrect responses to an incident may also occur because of barriers to information sharing due to guidelines on privacy and security [41,50]. According to the experts, excessive regulation on privacy and security could adversely influence efficient information sharing.

According to first-responders, cooperation among different departments at an incident site is also vital. Such cooperation is essential as normally the task of each department is classified and designated in advance; the tasks are interlinked and sometimes they may also overlap. As further evidence of the need for cooperation, domain experts feel that an efficient response during disasters may require access to different agencies and to national resources such as HazMat and ChemTrac [5,42,43,51].

4. Conceptual model development

In this section we provide a brief discussion of the conceptual model. We detail the various factors involved in the model and discuss the dependent variable.
Certain processes and technologies are intrinsic to the framework of any emergency management system. In terms of the CIMS, technology means sufficient communication channels, early detection, and rapid response. Processes have three components: information sharing between first-responders, reliability of the information channel used for information sharing, and access to external agencies. External factors such as regulation, politics and governmental directives that are beyond the control of CIMS could also affect the efficiency of the system as per the conceptual model; excessive regulation on security and privacy could also prevent first-responders from reacting to incidents promptly and appropriately.

4.1. Technologies

4.1.1. Communication capacity

It has long been recognized that communication plays a central role in all the activities and functions of the management of an organization [24,66]. Specifically, during emergency situations it has been noted that witnesses and victims tend to make many phone calls to first-responders and emergency agencies. These calls should be treated as urgent; otherwise the extent of damage and the number of victims could increase. (For example, during Hurricane Katrina, the absence of adequate communication facilities hampered urgent emergency activities of first-responders [19].) Communication facilities in the U.S. are built on a model that is commercially viable when there is no disaster. During disasters, the circuits are quickly overloaded due to high call volume to and from the disaster areas. Interoperability issues arise as a consequence of a multitude of factors, ranging from the inability to use allocated communication channels (such as the 800 MHz band) due to outdated equipment, to non-compliant standards being in use. Usually, 911 systems and dispatch systems are exposed to call flooding. Communication channels that convey information to the emergency response systems and to the public should have sufficient load capacity, as such capacity is important for information sharing. Emergency situations require distributed, and probably dispersed, groups of people to track and coordinate their activities. The Emergency Management Information System and Reference Index (EMISARI) is an incident management system that focuses on group communication processes and helps to gather, contribute, and utilize data in a time-urgent manner [64].

There are two very distinct issues involved with communication. First and foremost, it is important that the communication channels can ensure that all relevant and accurate information is transmitted to different agencies in a timely fashion. Second, the content
of signals, dependent and independent, should be merged into useful information [46]. In order to ensure a smooth flow of communication, all communication channels must have sufficient capacity to overcome call flooding during emergencies. A backup call handling facility for the 911 call center and dispatch system is also a must.

4.1.2. Early detection and response

Early detection of an incident and rapid response often serve to lower the cost to human life and property during disasters. Preparedness to detect an incident early and to respond to it rapidly is a critical step in emergency management. Early detection is essential in order to accurately and consistently address the most important information. Early detection includes identification, characterization, evaluation and dissemination of information on possible damages. Emergency response teams have various tools, such as sensors and detectors, to monitor emergency situations. These tools transmit signals to the command center when they detect unusual events or changes. To detect incidents early and respond properly, the appropriate deployment of sensors and detectors is necessary. These tools should be checked for functionality periodically and be updated continuously.

4.2. Processes

4.2.1. Information sharing

Information sharing is at the core of all coordination and cooperation during any kind of group activity. Information must be shared in a timely fashion in order to effectively act and make decisions during disasters [54]. In order to improve the availability of information and facilitate better sharing, the provision of information at the right time and at the right place is necessary. Technical interfaces are necessary to share information among agencies.

4.2.2. Reliable information channels

In situations of uncertainty, as media richness theory explains, the medium's ability to convey information impacts one's ability to perform the task [11,12]. Additionally, for those who are involved in the emergency situation, information about the status of the situation is essential. During an emergency, the surge of calls can paralyze the communications network. The failure of radio communications between all involved parties responding to a disaster could complicate the situation. Also, communications facilities which have access to the Internet are
vulnerable to cyber attack [28]. In order to ensure that information gets shared or made available to first-responders, a reliable information channel is a necessity [19]. Along with sufficient information capacity, guaranteeing reliable information channels is also critical for operational emergency management. Procedures to provide information to the public, and the ability to safeguard and communicate accurate information to the public, are critical. In order to provide sufficient and accurate information to first-responders and the public, it is necessary to keep communication channels clear and to have identification systems and intrusion detection devices protect the communication channels.

4.2.3. Access to external agencies

Emergency response systems are usually operated by local agencies. Localized response systems are efficient in the case of small local incidents. Local response teams generally consist of responders who have ample knowledge of the site they operate on. However, during catastrophic incidents, or in the event of simultaneous occurrences of multiple incidents, local response systems might prove incapable. They usually do not have sufficient resources for incidents of such magnitude. Federal agencies usually have varied and comprehensive information on certain types of emergency situations; efficient information input from these agencies leads to a proper response to such situations. To streamline the access of local agencies to external agencies, an effective operational interface between the different agencies is essential. For example, in order to prepare for emergencies that are beyond their capacity, local systems must interact with several federal agencies that hold databases such as ChemTrac or HazMat.

4.3. External factors—regulations

Regulations on security and privacy have their own purpose and rationale. These regulations are considered
external to the system that is in place for mitigating disasters. It is important to note that excessive regulation could hamper proper emergency activities [70]. That is, strict regulation on privacy and security hinders the sharing of information, which is essential while responding to emergency situations. According to media richness theory, environmental aspects affect the amount and richness of information processing [12]. The amount and richness in turn impact the information processing requirements. For CIMS, security and privacy are rather critical environmental issues.

4.3.1. Excessive regulation on security

Security is a common concern of all departments. In the case of human-instigated incidents, crucial information should be kept secure to protect against its potential misuse, which in turn could result in an increase in victims and damages. However, excessive regulation hampers first-responders from getting useful information from other agencies. For example, security information could be misused because of varying political relationships among agencies. (That is, we have heard anecdotes about how the FBI and the Secret Service are often at loggerheads with each other.) In short, regulations regarding security could be misused and misunderstood, which could lead to an inefficient emergency response [4].

4.3.2. Excessive regulation on privacy

Sharing of information on victims or their families could infringe on their privacy. Certain basic regulations on privacy should obviously be imposed during emergencies, but then again excessive regulation could make the first-responders' activities inefficient. Overemphasis on protecting privacy discourages first-responders from gathering and sharing information on victims and their families. As a result, excessive protection leads to the inefficient working of an emergency response team.
Table 1
Components of CIMS efficiency — adaptation of media richness theory/NIMS and interviews with experts

Media richness theory [12,66]; NIMS [67]; interviews with experts: Concepts (References) | CIMS efficiency model: Categories and Constructs
Technology [3,5,6,11,13,14,21,29,35,42,43,50,51,53] | Technology: Communication capacity; Early detection and response
Interdepartmental relationships [5,13,43,51,66,69] | Processes: Information sharing; Reliable information channels; Access to external agencies
Environment [13,15,41,50,53] | External factors—regulation: Excessive regulation on security; Excessive regulation on privacy
Effectiveness [13,20,35] | Efficiency of decision support (dependent variable)
Fig. 2. Decision support model in CIMS.
It is necessary to seek an optimal level of regulation on privacy such that the necessary activities of the first-responder will not be hampered and, simultaneously, the clauses underlined by regulations will also be satisfied.

4.4. Efficiency of decision support

We assume that the incident commander will make the right decision given the correct set of circumstances, such as timely information. Efficiency can be measured by the time it takes to make a decision, establish a systematic approach to reach a decision, and find the resources to efficiently analyze a situation. Emergency situations require early detection and urgent responses. Subsequently, incident commanders need to make decisions about those situations quickly. For incident commanders who have to make decisions on emergency situations, supporting decision support modules are necessary to help them make decisions quickly and efficiently. In addition, resources that can be used to analyze situations assist first-responders in making decisions efficiently and responding to the emergency quickly.

4.5. CIMS efficiency model

Using the background of media richness theory, major components of the NIMS and several interviews with experts, we propose several constructs that explain the dependent variable (see Table 1). Fig. 2 shows the relationship between the constructs in Table 1 and the dependent variable, efficiency of decision support in CIMS.

5. Instrument development

In order to measure the efficiency of CIMS using the constructs developed in the paper, it is necessary that
instrument validation should precede other core empirical validation [10]. Instrument validation refers to the adequacy with which the questionnaire addresses the parameters that are to be measured [24]. Previous literature on information systems has recommended a step-by-step process of instrument validation in order to develop new multi-item scales having high reliability and validity [9,44,56,57,71]. The instrument development process usually includes three steps: item creation, scale development, and instrument testing [44]. Instrument validation is executed as follows: there is first a pretest, then technical validation (technical validation for construct validity and technical validation for reliability), followed by a pilot test of reliability and construct validity and, finally, a full-scale survey [57,71]. Fig. 3 is a schematic representation of the outline of the process.

Fig. 3. Instrument development process.

Table 2
Constructs (initial version)

Categories | Constructs | Abbreviation | Number of question items
Technologies | Communication capacity | CC | 3
Technologies | Early detection and response | ED | 3
Processes | Information sharing | IS | 6
Processes | Reliable information channels | RC | 7
Processes | Access to national resources | NR | 4
External factors | Excessive regulation | ER | 6
Efficiency | Efficiency of decision support | EF | 6
Total | | | 35

Table 4
Summary of the number of respondents and the actions taken

Steps: Detailed process | Pretest | Pilot test | Main field test
Survey: Responses | 30 | 35 | 76
Analysis: Question items | 35 | 39 | 36
Analysis: Excluded | 2 | 4 | 8
Analysis: Included (constructs) | 33 (8) | 35 (6) | 28 (8)
Discussion with experts: Excluded | – | 1 |
Discussion with experts: Reentered | 2 | 2 |
Discussion with experts: Split | 3 | – |
Discussion with experts: Added new | 1 | – |
Items for next version | 39 | 36 |

5.1. Item creation

The first step in our research involved the creation of parameters to measure the efficiency of the CIMS. Item creation was done by reviewing the previously available literature. In this step, content validity was checked [9]. Content validity refers to the adequacy of the content of a measuring instrument and its relation to established literature [24]; content validity involves interviews with experts in the field of interest. Not surprisingly, experts are more aware of the nuances in the constructs when compared to a layperson [7,25]. Also, content validity is based on the extent to which a measurement reflects the specific intended domain of content [22,62].
Table 3
Demographics of participants for the main field test

Profession | Number of participants | Percentage of participants | Composition of first responders in the nation [68]
Police | 23 | 30.26% (33.82%) | 39.68%
Fire | 23 | 30.26% (33.82%) | 35.71%
Medicine | 22 | 28.95% (32.35%) | 24.60%
Emergency managers | 8 | 10.53% | –
Total | 76 | 100.00% | 100.00%
( ): Percentage of participants excluding emergency managers.

Table 5
Constructs in each stage

Pretest | Pilot test | Main test
[1] Communication capacity | [1] Communication capacity | [1] Communication capacity
[2] Early detection and response | [2] Early detection and response | [2] Early detection and response
[3] Information sharing–process | [3] Information sharing | [3] Information sharing
[4] Information sharing–technical interface | – | –
[5] Reliable information channel | [4] Reliable information channel | [4] Reliable information channel
[6] Access to external agencies | [5] Access to external agencies | [5] Access to external agencies
[7] Excessive regulation | – | [6] Excessive regulation on security
– | – | [7] Excessive regulation on privacy
[8] Efficiency of decision support | [6] Efficiency of decision support | [8] Efficiency of decision support
Table 6 Rotated factor loadings
R: reverse-coded.
First, a draft version of the instrument was developed. Using the draft instrument, we held interviews with seven practitioners in a series of five meetings. We selected professionals who had emergency management work experience of fifteen or more years; we explained the purpose of the research and showed the interviewees our instrument. The interviewees gave us suggestions for improving the instrument and, based on their recommendations, we modified the instrument. The interviewees included the director of homeland security for a major city, an incident commander, a police chief, a fire department chief and the C.E.O. of the ambulance department of a medical organization. Each interview lasted two hours. Based on these interviews, additional questions were added, some were rewritten, and some were eliminated. After the interviews, the first version of the instrument, which had thirty-five question items, was prepared. It had seven constructs, which included the efficiency of CIMS decision support as the dependent variable. In this version, a seven-point Likert scale was used to record answers. Six of the question items were reverse-coded because the questions were asked in the negative [58,71]. Table 2 shows the initial categories and corresponding constructs for the instrument as well as the number of question items in each construct.

5.2. Pre-testing and pilot testing

A pretest and a pilot test were then executed. The pretest was administered to the participants at the 15th World Conference on Disaster Management, held in Toronto, Canada in July 2005. Participants, who were mostly domain experts, were encouraged to fill out a survey. Thirty participants returned their surveys. This group consisted of a large number of Canadians (about 50%). The remainder was mostly from the United States
of America (USA). All the participants were professionals in the field of emergency management. Exploratory factor analysis was done using the data collected from the first version of the survey. The cut-off criterion was a factor loading of 0.5. The analysis was done using a stepwise approach: the question item that had the lowest maximum factor loading was removed and, if that lowest maximum factor loading was less than 0.5, the factor analysis was repeated until the lowest maximum factor loading was greater than 0.5 (a code sketch of this pruning loop is shown below). Four items were finally omitted. The first version of the instrument, which had thirty-three question items, was thus validated. The instrument had eight constructs after the test.

The results of the first version of the instrument were discussed with five practitioners in a series of three meetings. During that process, the instrument was further modified. During the discussion process, two question items, which had been removed during the factor analysis because of their lower factor loadings, were re-introduced. Three question items were split into six questions because they had multiple meanings. In addition, a new item was developed and included in the second version of the instrument.

The second version of the instrument was distributed among first-responders and emergency directors from different colleges and universities in the north-eastern United States. The second version of the instrument had thirty-nine items, which were measured using five-point Likert scales. During the pilot test, thirty-five responders filled out the survey. The exploratory factor analysis with a cut-off criterion of 0.5 on the factor loadings was done yet again. As per the results, four items were eliminated. The second version of the instrument had six constructs after the test. Before the main field survey, we again discussed the results of the second version of the instrument with six practitioners in a series of five meetings. Consequently, one item was removed and two items which had low factor loading values were re-introduced.

5.3. Main field testing

The third version of the instrument for the main field survey had thirty-six question items. Seventy-six first-responders participated in the main field test. Table 3 shows the demographics of participants for the main field test. The first-responder community consists of local police, firefighters, and emergency medical professionals. There are over 1 million firefighters in the U.S., of which approximately 750,000 are volunteers. Local police departments have an estimated 556,000 full-time employees, including about 436,000 sworn enforcement personnel. Sheriffs' offices reported about 291,000 full-time employees, including about 186,000 sworn personnel. In addition, there are over 155,000 nationally registered emergency medical technicians (EMTs). City, town, and county emergency managers also participated in this study. Non-probability sampling [1] was utilized because of the availability of first-responders in the local region who were interested in the result of the study. (The proportion of first-responders who responded to the surveys is similar to the proportions in the national first-responder community.) Emergency managers also helped to distribute surveys to members of emergency management committees in the area. These members represent departments such as police, fire, emergency medicine, and the emergency office.
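The stepwise pruning rule used after the pretest and pilot surveys (Section 5.2) can be sketched as follows. The sketch assumes the survey responses sit in a pandas DataFrame with one column per question item and uses the open-source factor_analyzer package; it is our reconstruction of the described procedure, not the authors' original code.

    # Sketch (assumed data layout and library): drop the weakest item and refit
    # until every remaining item loads at least 0.5 on some rotated factor.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    def prune_items(responses: pd.DataFrame, n_factors: int, cutoff: float = 0.5) -> pd.DataFrame:
        """Repeatedly remove the item whose largest loading is smallest, until every
        remaining item has a maximum absolute loading of at least `cutoff`."""
        items = responses.copy()
        while True:
            fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
            fa.fit(items)
            # maximum absolute loading of each item across the rotated factors
            max_load = pd.Series(abs(fa.loadings_).max(axis=1), index=items.columns)
            if max_load.min() >= cutoff:
                return items                                     # all items meet the criterion
            items = items.drop(columns=[max_load.idxmin()])      # drop weakest item, refit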
Table 7
Reliability check for constructs

Categories | Constructs | Number of items | Cronbach's alpha
Technologies | Communication capacity (C2) | 4 | 0.895
Technologies | Early detection and response (C4) | 3 | 0.832
Processes | Information sharing (C6) | 3 | 0.825
Processes | Reliable information channel (C1) | 6 | 0.843
Processes | Access to external agencies (C3) | 3 | 0.924
External factors — regulation | Excessive regulation on security (C5) | 4 | 0.751
External factors — regulation | Excessive regulation on privacy (C8) | 2 | 0.914
Efficiency (dependent variable) | Efficiency of decision support (C7) | 3 | 0.754
Total | | 28 |
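Reliability coefficients such as those in Table 7 follow the standard Cronbach's alpha formula. The short sketch below assumes the item responses for one construct are columns of a pandas DataFrame; the column names in the usage comment are hypothetical.

    # Sketch of the reliability computation behind Table 7 (assumed data layout).
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)        # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale score
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # e.g. for the four communication-capacity items (hypothetical column names):
    # cronbach_alpha(responses[["CC_1", "CC_2", "CC_3", "CC_4"]])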
Before running exploratory factor analysis and reliability checks, we checked whether the data satisfied the assumptions for factor analysis. The following three checks were performed: the correlation coefficients among question items, Bartlett's test of sphericity, and the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy (MSA) [31,60]. The critical assumptions underlying factor analysis are more conceptual than statistical. Only normality is necessary if a statistical test is applied to the significance of the factors. The data matrix should, however, contain sufficient correlations to justify the application of factor analysis, and examination of the correlation matrix is recommended to determine the appropriateness of factor analysis [31,60]. In this study, we have twenty-eight question items and 756 pair-wise correlations between the question items. Two hundred and thirty-one pair-wise correlation coefficients showed values of 0.3 or greater (note that, according to the statistics literature [31,60], this corresponds to acceptable values). Thus we can conclude that the data satisfy the assumptions for factor analysis.
Table 8
Instrument for measuring the factors contributing to efficiency of CIMS
a Reverse-coded.
Bartlett's test of sphericity, a statistical test for the presence of correlations among the variables, is yet another measure [31]. The result of Bartlett's test of sphericity in this study shows that Sig. (p) = 0.000 < α (= 0.05) (χ2 = 1076.176, df = 378). This result implies that the hypothesis that the correlation matrix is an identity matrix can be rejected.
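For reference, the correlation screen, Bartlett's test, and the KMO measure of sampling adequacy (discussed in the next paragraph) can be computed as sketched below. The DataFrame name and the use of the factor_analyzer package are our assumptions, not the authors' original code.

    # Sketch of the three data-adequacy checks, assuming the 28 retained items are
    # columns of a pandas DataFrame `items`.
    import numpy as np
    import pandas as pd
    from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

    def adequacy_checks(items: pd.DataFrame) -> dict:
        corr = items.corr().to_numpy()
        upper = corr[np.triu_indices_from(corr, k=1)]          # unique item pairs
        chi_square, p_value = calculate_bartlett_sphericity(items)
        _, kmo_overall = calculate_kmo(items)
        return {
            "pairs_with_r_ge_0.3": int((np.abs(upper) >= 0.3).sum()),
            "bartlett_chi2": chi_square,                       # reported: 1076.176, df = 378
            "bartlett_p": p_value,                             # reported: p < 0.05
            "kmo_msa": kmo_overall,                            # reported: 0.662 (>= 0.6)
        }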
The other measure that quantifies the degree of intercorrelation among the variables, and hence the appropriateness of factor analysis, is the Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy (MSA) [31]. Values of 0.6 and above are recommended for factor analysis [2,35,56,60]. The KMO MSA in this study revealed a value of 0.662. Thus the KMO and Bartlett's tests both confirm that the data in this study are suitable for factor analysis. In this step, we used a higher factor-loading cut-off criterion of 0.6 to eliminate question items. Eight items were removed, and twenty-eight items finally remained. The instrument had eight constructs.

In the remainder of this section we provide a summary of the three stages in the process of instrument development. Table 4 shows the number of respondents who participated in each stage, and the results of the analysis and of the discussions with the domain experts. The number of constructs varies from stage to stage. We have summarized this variation in Table 5. The different number of constructs at each stage is a consequence of deleting, adding, and splitting constructs that were manifest in the previous stage.

After the pretest, Construct 3 (information sharing–process) and Construct 4 (information sharing–technical interface) were merged because respondents were not able to differentiate between the two based on the items in each construct. Further, after the pilot test, the construct excessive regulation was removed because of low factor loadings. This is possibly due to the effect of respondent demographics. The participants in the pretest were domain experts attending the 15th World Conference on Disaster Management. This group consisted of a large number of Canadians. Regulations vary from country to country, and attitudes towards regulations also differ. Because of this variance in culture, some of the constructs scored low on the factor loadings. It appears that first-responders outside the USA are not as concerned with regulatory issues during emergencies. However, based on input from domain experts within the first-responder community in the USA, we re-added two constructs: Construct 7 (excessive regulation with regard to security) and Construct 8 (excessive regulation with regard to privacy). The experts felt that this was an important issue in the USA. During the main test, the questionnaire stabilized with these two constructs included in the instrument. This behavior is likely because of cultural differences between responders from the USA and those from other countries. Cultural and political factors do affect instrument design. However, a more detailed examination of the impact of cultural issues needs further investigation and is an area for future research.
The pilot test and the main test were conducted in the USA. This helped reduce cultural bias and resulted in a more stable instrument.

6. Results

To identify the constructs, an exploratory factor analysis was performed. For each construct, reliability was checked using Cronbach's alpha.

6.1. Factor analysis and reliability check

Table 6 shows the results of the exploratory factor analysis in the final stage. Eight constructs have eigenvalues greater than 1.0 resulting from the varimax rotation (items ER-1R, ER-2R, ER-3R, ER-4R, ER-5R, and ER-6R are reverse-scored). These eight constructs explained a total variance of 77.64%. We named the factors based on the results of the exploratory factor analysis. To determine internal consistency, we performed a reliability check with Cronbach's alpha. It is recommended that factors have values greater than 0.7 in established research and 0.6 in exploratory research [26,35,45]. Table 7 shows the names of the constructs and their corresponding numbers of items. It shows that all constructs have Cronbach's alpha values greater than 0.7.

The final instrument is shown in Table 8. The instrument has eight constructs which consist of twenty-eight question items. Seven of the eight constructs are explanatory variables: communication capacity, early detection and response, information sharing, reliable information channels, access to external agencies, excessive regulation on security, and excessive regulation on privacy. The last construct is the dependent variable, the efficiency of decision support in CIMS.

7. Conclusion and discussion

The instrument items were created from a framework which had three components: technologies, processes, and external factors. Based on the three components, question items were developed. For the technology factor, two constructs were developed: one was ‘Early detection and response' and the other ‘Communication capacity'. Early detection and rapid response are essential to lessen the extent of damage to both life and property. Various instruments such as detectors and sensors are necessary, and their proper deployment and allocation is crucial to increase the efficiency of the CIMS. To ensure smooth operation of emergency activities, sufficient communication channels should be available. Communication capacity is
designed so that telecommunication companies can operate profitably by satisfying peak demands for daily activities. During disasters, the call volume is quite extraordinary and the demand for circuits exceeds the installed capacity. This makes it difficult for victims and eyewitnesses to call 911 or seek help. Further, first-responders need additional circuits to ensure that the response and recovery process proceeds efficiently. The issue of communication channels needs resolution not only in terms of capacity but also in terms of the interoperability of radio channels and the development of interoperability standards such as EXML and GJXML for device communication.

For the process factor, three constructs were proposed and validated: information sharing, reliable information channel, and access to external agencies. Information sharing between first-responders, and among the different agencies that assist first-responders, is important for the purposes of coordination and decision support. From the perspective of information flow, the reliability of the information channel is a necessary condition. Supporting systems such as identification systems or intrusion detection systems are also essential. To get better information, access to external agencies and national resources such as HazMat and ChemTrac helps first-responders respond to emergencies efficiently. Communications and information sharing are very important for an effective response. In a civilian organization it is important for all of the parties to understand the common operating picture and share a common vision. This common understanding and shared vision go a long way in fostering good coordination and hence good outcomes.

Besides the above internal factors, several external factors play an important role. The excessive enforcement of regulations is often highly criticized as it prevents the streamlining of activities at incident sites. Though security and privacy are essential aspects which should be protected, overemphasis on them hampers the information gathering and sharing that are essential for providing quick and adequate help to victims while protecting responders from exposure to victims who may be contagious. Overemphasis on privacy and security could prevent first-responders from necessary information gathering and sharing. In this regard, the U.S. Department of Health and Human Services Office for Civil Rights (OCR) [http://www.hhs.gov/ocr/hipaa/] has issued a bulletin providing additional guidance regarding how the HIPAA privacy rule may apply to the disclosure of protected health information to those who are treating evacuees in shelters, particularly where business associates or their agents are involved in facilitating the provision of this information.
To summarize, this paper develops a conceptual model to explain variations in the efficiency of decision support. Further, an instrument (which was tested and validated) is also developed, based on the conceptual model, to assess the strengths and weaknesses of critical incident management systems. The constructs are justified using media richness theory and input from domain experts who have had over fifteen years of experience. The instrument allows communities to get better prepared by addressing the weaknesses of their systems, in that the instrument informs both policy and practice.

Nevertheless, this study has some limitations. It is important to point out that the study focuses only on the systems component, and not on the people who operate the system or use it. This is being addressed as an area for further research. At the incident site, first-responders may encounter unexpected situations which may be beyond the scope of the system. To respond to critical incidents, first-responders must be trained. Experts have pointed out that training is essential for effective execution and well-organized pre-planning [33,42,43]. A holistic system must also consider psychological aspects of the first-responders. This was observed to be one of the major issues that affected mitigation and recovery efforts in the aftermath of Hurricane Katrina [33]. Psychological aspects of first-responders are being incorporated into this model as part of future research.

Acknowledgements

We would like to thank Mr. Gregg G. Blosat (Captain of the Buffalo Police Department, NY), Mr. Roger Lander (Director of Homeland Security, Buffalo City, NY), Mr. Thomas J. Maxian (C.E.O. and General Counsel of Twin City Ambulance), Lt. Stephen McGonagle (Incident Commander, Amherst Police Department, NY), Mr. Dean A. Messing (Deputy Commissioner of Emergency Services, Erie County, NY), and Mr. James M. Quigley (First Assistant Chief of the East Amherst Fire Department). Interviews with these experts helped us significantly. We would also like to thank the three anonymous referees for their insightful comments and the editor-in-chief for his encouragement of this project. This research is supported in part by a UB 2020 IRDF Grant, DoD IASP Grant No. H98230-04-1-022 and NSF Grant SGER 0705292.

References

[1] E. Babbie, The Practice of Social Research, Wadsworth Thomson Learning, Belmont, CA, 2001.
[2] R.P. Bagozzi, Y. Yi, On the evaluation of structural equation models, Academy of Marketing Science 16 (1) (1988).
[3] J.F. Barker, Interview with Mr. John F. Barker, University at Buffalo, Buffalo, NY, 2005.
[4] T.A. Birkland, Natural disasters and disaster relief, in: J. Ciment (Ed.), Social Issues: An Encyclopedia of Controversies, History, and Debates, East River Books, 2006.
[5] G.G. Blosat, J.C. Volkosh, Interview with Mr. Gregg G. Blosat and Mr. James C. Volkosh, University at Buffalo, Buffalo, NY, 2005.
[6] J.R. Carlson, R.W. Zmud, Channel expansion theory and the experiential nature of media richness perceptions, Academy of Management Journal 42 (2) (1999).
[7] E.G. Carmines, R.A. Zeller, Reliability and Validity Assessment, Sage Publications, Beverly Hills, California, 1979.
[8] I. Chengalur-Smith, S. Belardo, H. Pazer, Adopting a disaster-management-based contingency model to the problem of ad hoc forecasting: toward information technology-based strategies, IEEE Transactions on Engineering Management 46 (2) (1999).
[9] D.R. Compeau, C.A. Higgins, Computer self-efficacy: development of a measure and initial test, MIS Quarterly 19 (2) (1995).
[10] T.T. Cook, D. Campbell, Quasi-Experimentation: Design and Analysis Issues for Field Settings, Houghton Mifflin Company, Boston, Massachusetts, 1979.
[11] R.L. Daft, R.H. Lengel, Information richness: a new approach to managerial behavior and organizational design, in: B.M. Staw, L.L. Cummings (Eds.), Research in Organizational Behavior, JAI Press, Greenwich, Connecticut, 1984.
[12] R.L. Daft, R.H. Lengel, Organizational information requirements, media richness and structural design, Management Science 32 (5) (1986).
[13] R.L. Daft, K.E. Weick, Toward a model of organizations as interpretation systems, Academy of Management Review 9 (2) (1984).
[14] R.L. Daft, J.C. Wiginton, Language and organization, Academy of Management Review 4 (2) (1979).
[15] R.L. Daft, R.H. Lengel, L.K. Trevino, Message equivocality, media selection, and manager performance: implications for information systems, MIS Quarterly 11 (3) (1987).
[16] J. D'Ambra, R.E. Rice, M. O'Connor, Computer-mediated communication and media preference: an investigation of the dimensionality of perceived task equivocality and media richness, Behaviour & Information Technology 17 (3) (1998).
[17] R.J. Daniels, D.F. Kettl, H. Kunreuther, Introduction, in: R.J. Daniels, D.F. Kettl, H. Kunreuther (Eds.), On Risk and Disaster, University of Pennsylvania Press, Philadelphia, PA, 2006.
[18] R.J. Daniels, D.F. Kettl, H. Kunreuther, On Risk and Disaster: Lessons from Hurricane Katrina, University of Pennsylvania Press, 2006.
[19] P. Davidson, Compatible radio systems would cost billions, USA Today (December 28, 2005).
[20] A.R. Dennis, S.T. Kinney, Testing media richness theory in the new media: the effects of cues, feedback, and task equivocality, Information Systems Research 9 (3) (1998).
[21] A.R. Dennis, J.S. Valacich, Rethinking media richness: towards a theory of media synchronicity, Proc. of the Hawaii International Conference on System Sciences, 1999.
[22] G. Dhillon, G. Torkzadeh, Value focused assessment of information system security in organizations, Information Systems Journal 16 (3) (2006).
[23] R.R. Dynes, The importance of social capital in disaster response, University of Delaware Disaster Research Center, 2002.
[24] M. El-Shinnawy, M.L. Markus, The poverty of media richness theory: explaining people's choice of electronic mail vs. voice mail, International Journal of Human-Computer Studies 46 (4) (1997).
[25] A. Fink, The Survey Handbook, SAGE Publications, 2002.
[26] C. Fornell, D.F. Larcker, Evaluating structural equation models with unobservable variables and measurement error, Journal of Marketing Research 18 (1) (1981).
[27] S. French, C. Niculae, Believe in the model: mishandle the emergency, Journal of Homeland Security and Emergency Management 2 (1) (2005).
[28] P.H. Gilbert, J. Isenberg, G.B. Faecher, L.T. Papay, L.G. Spielvogel, J.B. Woodard, E.V. Badolato, Infrastructure issues for cities—countering terrorist threat, Journal of Infrastructure Systems 9 (1) (2003).
[29] J.M. Grela, Interview with Mr. John M. Grela, University at Buffalo, Buffalo, NY, 2005.
[30] J.H. Guy, Interview with Mr. James H. Guy, University at Buffalo, Buffalo, NY, 2005.
[31] J.F. Hair, R.E. Anderson, R.L. Tatham, W.C. Black, Multivariate Data Analysis, Prentice-Hall, Inc., Upper Saddle River, New Jersey, 1998.
[32] T.A. Horan, D. McCabe, R. Burkhard, B. Schooley, Performance information systems for emergency response: field examination and simulation of end-to-end rural response systems, Journal of Homeland Security and Emergency Management 2 (1) (2005).
[33] D. Humbert, Interview with Mr. Dave Humbert, University at Buffalo, Buffalo, NY, 2006.
[34] S. Jain, C.R. McLean, An integrating framework for modeling and simulation for incident management, Journal of Homeland Security and Emergency Management 3 (1) (2006).
[35] S.S. Kahai, R.B. Cooper, Exploring the core concepts of media richness theory: the impact of cue multiplicity and feedback immediacy on decision quality, Journal of Management Information Systems 20 (1) (2003).
[36] L.M. Killian, An introduction to methodological problems of field studies in disasters, in: R.A. Stallings (Ed.), Methods of Disaster Research, Xlibris Corporation, Philadelphia, PA, 2002.
[37] J.K. Kim, R. Sharman, H.R. Rao, S. Upadhyaya, An investigation of risk management issues in the context of emergency response systems, Proc. of Eleventh Americas Conference on Information Systems (AMCIS), Association for Information Systems, Omaha, Nebraska, USA, 2005.
[38] J.K. Kim, R. Sharman, H.R. Rao, S. Upadhyaya, Framework of analyzing Critical Incident Management Systems (CIMS), Proc. of Thirty-Ninth Hawaii International Conference on System Sciences (HICSS-39), Kauai, Hawaii, USA, 2006.
[39] A.S. Lee, Electronic mail as a medium for rich communication: an empirical investigation using hermeneutic interpretation, MIS Quarterly 18 (2) (1994).
[40] M. Lombardi, Media studies, in: R.A. Stallings (Ed.), Methods of Disaster Research, Xlibris Corporation, Philadelphia, PA, 2002.
[41] T.J. Maxian, Interview with Mr. Thomas J. Maxian, University at Buffalo, Buffalo, NY, 2005.
[42] S. McGonagle, Interview with Mr. Stephen McGonagle, University at Buffalo, Buffalo, NY, 2005.
[43] D.A. Messing, Interview with Mr. Dean A. Messing, University at Buffalo, Buffalo, NY, 2005.
[44] G.C. Moore, I. Benbasat, Development of an instrument to measure the perceptions of adopting an information technology innovation, Information Systems Research 2 (3) (1991).
[45] J.C. Nunnaly, Psychometric Theory, McGraw Hill, New York, 1976.
[46] E. Paté-Cornell, Fusion of intelligence information: a Bayesian approach, Risk Analysis 22 (3) (2002).
[47] W.G. Peacock, Cross-national and comparative disaster research, in: R.A. Stallings (Ed.), Methods of Disaster Research, Xlibris Corporation, Philadelphia, PA, 2002.
[48] W.V. Pelfrey, The cycle of preparedness: establishing a framework to prepare for terrorist threats, Journal of Homeland Security and Emergency Management 2 (1) (2005).
[49] B. Phillips, Qualitative methods and disaster research, in: R.A. Stallings (Ed.), Methods of Disaster Research, Xlibris Corporation, Philadelphia, PA, 2002.
[50] J.M. Quigley, Interview with Mr. James M. Quigley, University at Buffalo, Buffalo, NY, 2005.
[51] J.T. Raab, Interview with Mr. Joseph T. Raab, University at Buffalo, Buffalo, NY, 2005.
[52] J.H. Reger, Interview with Mr. James H. Reger, University at Buffalo, Warsaw, NY, 2006.
[53] R.E. Rice, D.E. Shook, Relationships of job categories and organizational levels to use of communication channels, including electronic mail: a meta-analysis and extension, Journal of Management Studies 27 (2) (1990).
[54] S.Y. Shen, M.J. Shaw, Managing coordination in emergency response systems with information technologies, Proc. of the Tenth Americas Conference on Information Systems, Association for Information Systems, New York, NY, 2004.
[55] P. Slovic, Perception of risk, Science 236 (4799) (1987).
[56] H.J. Smith, S.J. Milberg, S.J. Burke, Information privacy: measuring individuals' concerns about organizational practices, MIS Quarterly 20 (2) (1996).
[57] D.W. Straub, Validating instruments in MIS research, MIS Quarterly 13 (2) (1989).
[58] K.S. Suh, Impact of communication medium on task performance and satisfaction: an examination of media-richness theory, Information & Management 35 (5) (1999).
[59] C.R. Sunstein, Laws of Fear, Cambridge, New York, 2005.
[60] B.G. Tabachnick, L.S. Fidell, Using Multivariate Statistics, Allyn and Bacon, Boston, USA, 2000.
[61] J.D. Thompson, Organizations in Action: Social Science Bases of Administrative Theory, McGraw-Hill, New York, 1967.
[62] G. Torkzadeh, G. Dhillon, Measuring factors that influence the success of Internet commerce, Information Systems Research 13 (2) (2002).
[63] L.K. Trevino, R.L. Daft, R.H. Lengel, Understanding managers' media choices: a symbolic interactionist perspective, in: J. Fulk, C.W. Steinfield (Eds.), Organizations and Communications Technology, Sage, Newbury Park, CA, 1990.
[64] M. Turoff, Past and future emergency response information systems, Communications of the ACM 45 (4) (2002).
[65] M. Turoff, M. Chumer, S.R. Hiltz, R. Klashner, M. Alles, M. Vasarhelyi, A. Kogan, Assuring homeland security: continuous monitoring, control & assurance of emergency preparedness, Journal of Information Technology Theory and Application (JITTA) 6 (3) (2004).
[66] M.L. Tushman, D.A. Nadler, Information processing as an integrating concept in organizational design, Academy of Management Review 3 (3) (1978).
[67] U.S. Department of Homeland Security, National Incident Management System, 2004.
[68] U.S. Department of Homeland Security, Emergencies & Disasters: About First Responders, 2006.
[69] A.H. Van de Ven, A.L. Delbecq, R. Koenig Jr., Determinants of coordination modes within organizations, American Sociological Review 41 (2) (1976).
[70] A. Waikar, P. Nichols, Aviation safety: a quality perspective, Disaster Prevention and Management: An International Journal 6 (2) (1997).
[71] J. Webster, J.J. Martocchio, Microcomputer playfulness: development of a measure with workplace implications, MIS Quarterly 16 (2) (1992).
[72] D.v. Winterfeldt, Using risk and decision analysis to protect New Orleans against future hurricanes, in: R.J. Daniels, D.F. Kettl, H. Kunreuther (Eds.), On Risk and Disaster, University of Pennsylvania Press, Philadelphia, PA, 2006.
[73] A.M.J. Yezer, The economics of natural disaster, in: R.A. Stallings (Ed.), Methods of Disaster Research, Xlibris Corporation, Philadelphia, PA, 2002.

Dr. Jin Ki Kim is an assistant professor of Business Administration at the Korea Aerospace University. He received his Ph.D. in Management Information Systems from the University at Buffalo, The State University of New York. He has an M.S. and a B.S. from Hanyang University in Seoul, Korea. Prior to joining the doctoral program, he worked at the Korea Information Society Development Institute (KISDI) in Korea as a research fellow. His interests are emergency management systems, telecommunications policy and management, digital convergence, offshore outsourcing, and the diffusion of new telecom technologies and services. His research has appeared in the following conferences: Hawaii International Conference on System Sciences (HICSS), Americas Conference on Information Systems (AMCIS), Telecommunications Policy Research Conference (TPRC), and International Telecommunications Society (ITS).

Dr. Raj Sharman is an assistant professor in the School of Management at the State University of New York at Buffalo. His research interests are primarily in the fields of information assurance, disaster response management, medical informatics, conceptual modeling and ontology, Internet performance, and data mining. He received his Bachelor's degree in Engineering and Master's degree in Management from the Indian Institute of Technology, Bombay, India. He also earned a Master's degree in Industrial Engineering and a Doctorate in Computer Science from Louisiana State University. Dr. Sharman serves as an associate editor for the Journal of Information Systems Security.

Professor H. Raghav Rao graduated from the Krannert Graduate School of Management at Purdue University. His interests are in the areas of management information systems, decision support systems, e-business, emergency response management systems, and information assurance. He has chaired sessions at international conferences and presented numerous papers. He has also co-edited four books, of which one is on Information Assurance in Financial Services. He has authored or co-authored more than 150 technical papers, of which more than 75 are published in archival journals. His work has received best paper and best paper runner-up awards at AMCIS and ICIS. Dr. Rao has received funding for his research from the National Science Foundation, the Department of Defense, and the Canadian Embassy, and he has received the University's prestigious Teaching Fellowship. He also received a Fulbright fellowship in 2004. He is a co-editor of a special issue of The Annals of Operations Research and of the Communications of the ACM, an associate editor of Decision Support Systems, Information Systems Research, and IEEE Transactions on Systems, Man and Cybernetics, and co-Editor-in-Chief of Information Systems Frontiers. Dr. Rao also has a courtesy appointment with Computer Science and Engineering as an adjunct professor.

Dr. Shambhu Upadhyaya is an associate professor of Computer Science and Engineering at the University at Buffalo, The State University of New York. His research interests are information assurance, computer security, fault diagnosis, fault-tolerant computing, and VLSI testing. His research has been supported by NSF, DARPA, NSA, and AFRL.