A framework with reasoning capabilities for crisis response decision–support systems

Engineering Applications of Artificial Intelligence. Journal homepage: www.elsevier.com/locate/engappai

Nady Slam (a), Wenjun Wang (a), Guixiang Xue (b), Pei Wang (c)

(a) College of Computer Science and Technology, Tianjin University, Tianjin, China
(b) School of Computer Science and Software, Hebei University of Technology, Tianjin, China
(c) Department of Computer and Information Sciences, Temple University, Philadelphia, USA

Article info

Keywords: Crisis response decision–support system; Non-Axiomatic Reasoning System; Non-Axiomatic Logic; Artificial General Intelligence

Abstract

This paper reviews the methods used in decision–support systems for crisis management. While much research has been conducted in this field, little emphasis has been placed on the uncertainty representation, reasoning, learning, and real-time decision-making capabilities of such systems. The purpose of this paper is to explore the basic assumptions behind constructing an intelligent decision–support system for crisis response management. A novel framework for a crisis response decision-making system is proposed under the assumptions of openness to various kinds of uncertainty, and of reasoning and learning with real-time response. We apply Non-Axiomatic Logic to represent and reason on uncertain knowledge in the framework, and demonstrate the reasoning and learning mechanisms of the framework through a case study in the field of urban firefighting. The results show that the framework provides a suitable model for intelligent crisis response decision–support systems.

© 2015 Published by Elsevier Ltd.

1. Introduction

The definition of "crisis" is critical to research on crisis management. However, no generally accepted definition of "crisis" exists. From a general perspective, the term "crisis" applies to situations that are unforeseen, urgent, and cause widespread uncertainty (Rosenthal et al., 2001; Stern and Sundelius, 2002). In this paper a "crisis" is defined, following Rosenthal et al. (1989), as "a serious threat to the basic structures or the fundamental values and norms of a social system, which, under time pressure and highly uncertain circumstances, necessitates making critical decisions." Crisis management usually consists of four basic stages: anticipation and preparation, rapid response, follow-through, and post-event evaluation (Patrick and Benjamin, 2012). The response stage of crisis management faces the challenge of how to deal with uncertain information gathered from a crisis and how to make effective decisions under time constraints. In the crisis response stage, a decision–support system can support and enhance human judgement in the performance of tasks. During the past several years, a growing number of researchers have taken an interest in decision–support systems for crisis response, in order to assist the decision-making process of crisis management. Roughly speaking, the literature in this field falls into three categories: the first is based on Artificial Intelligence (AI); the second depends on information technologies; and the third emphasizes the cognitive processes of human decision-making in crisis response management.

One of the earlier works on crisis response plan making is described by Li and Miska (1991), in which a structure was designed for a decision-support system for fire management on naval vessels using a production system and case-based reasoning. Methods from agent technology have also been used in this field. Balducelli et al. (2000) designed an active decision–support system for automatic planning support in emergency management based on agent technology (a direct advisory agent, an automatic planning agent, and an info provider agent), for the purpose of providing the real-time data necessary for the emergency manager's decision making. A crisis response simulation model architecture combining a discrete-event simulation environment and a multiagent system is presented by Gonzalez (2009). Ontologies can provide systems with a formal specification of a shared conceptualization in a specific crisis field. A context ontology system is presented by Jihan and Segev (2013) to determine response recommendations for humanitarian needs; specifically, the work designed two ontologies (a crisis identification ontology and a crisis response ontology), used logic rules to unify them, and then generated recommendations automatically. To represent vague information in a crisis environment, some researchers use fuzzy logic (Zadeh, 1979). Such a conceptual

http://dx.doi.org/10.1016/j.engappai.2015.06.017  0952-1976/© 2015 Published by Elsevier Ltd.




framework of an intelligent decision–support system for typhoon disaster management is demonstrated by Chen et al. (2011). Alnahhas and Alkhatib (2012) described a combination of fuzzy set theory and temporal inference techniques to support decision making in crisis management. A more recent application of fuzzy logic in this area is reported by Kandel et al. (2014), in which a comprehensive assessment of fuzzy techniques for disaster mitigation is presented.

Management of crisis information in crisis response decision–support systems has been an active research area recently. The Federal Emergency Management Information System (Robert et al., 1999) is an integrated software product built on a client/server architecture; the server realizes crisis data management by means of a relational database. The development of Geographical Information Science (GIS) and spatial databases provides researchers with spatio-temporal location as a key index for crisis information management. A prototype based on the integration of GIS and remote sensing is introduced by Tan et al. (2011); it is based on a browser/server architecture and a spatial database. Robert et al. (2015) describe an emergency response intelligence tool which can automatically gather data regarding crisis events from the web and present it on an interactive map. Asghar et al. (2008) suggested a model integration approach for disaster management decision–support systems (DMDSS) and proposed a new classification scheme for the selection of modular subroutines collected from the traditional decomposition of DMDSS models.

In recent years, some literature has emphasized the cognitive processes of human decision-making in crisis management (David, 2007) rather than the analytical processes. By analyzing the cognitive process of improvisation in the art of jazz, a model of improvisation in emergency management is demonstrated by David and William (2007). They considered improvisational decision-making as problem solving and built a computational model using declarative and procedural knowledge, where the former is represented using a domain ontology, and the latter is implemented using decision logic. Naturalistic Decision Making (NDM) (Klein and Calderwood, 1991) is an active research trend investigating decision making in naturalistic environments, and Recognition-Primed Decision (RPD) (Klein, 1993) is a typical NDM model which attempts to describe explicitly how a human decision maker deals with complex tasks based on experience. A computational form of RPD, called C-RPD, is presented by Nowroozi et al. (2012), in which a Belief-Desire-Intention (BDI) agent is utilized to build a computational RPD model. Although a number of decision–support systems have been developed for crisis response, several specific issues continue to impede the implementation of such systems. Some challenges we identified are:

 Knowledge representation and reasoning capabilities: An intelligent crisis response decision–support system should be able to represent and reason on crisis knowledge, particularly under uncertainty. However, according to our survey, most systems in crisis response either lack an effective knowledge representation scheme, or have no reliable inference mechanism that can reason on information with different types and degrees of uncertainty.
 Learning capability: An intelligent crisis response decision–support system should have the adaptability to accommodate changes in the environment through learning. However, according to our survey, the learning ability of such systems is not well studied, and most systems lack the learning capability to automatically update their knowledge.
 Real-time response capability: An intelligent crisis response decision–support system should assist decision makers in making rational decisions in real time. However, little work has addressed the time-critical characteristics of crisis response decision-making in computer systems.
 Generality: Crisis response decision–support systems should provide a general model to handle disasters with different characteristics. No prior work achieves such generality. Researchers in our survey have designed various crisis response decision–support systems based on crisis types and their individual needs; most of them are tailored to a specific situation, and cannot easily be modified to handle other crises.

We believe that an intelligent decision–support system for crisis response management should be constructed under the requirements of openness to various kinds of uncertainty, and of reasoning and learning with real-time response. To fulfill these requirements, we propose a conceptual framework for crisis response decision–support systems with reasoning and learning capabilities. We apply Non-Axiomatic Logic (NAL) to represent and reason on uncertain knowledge, and demonstrate the reasoning and learning mechanisms of the framework through a case study in the field of urban firefighting. NAL can represent several types of uncertainty in a normative way and can apply multiple inference rules to these uncertainties. NAL is the logic part of the Non-Axiomatic Reasoning System (NARS)1 (Wang, 2013). What makes NARS different from conventional reasoning systems is that it is designed according to the theory that "intelligence" is the ability of a system to adapt to its environment while working with insufficient knowledge and resources. The unified knowledge representation, reasoning, learning and real-time mechanisms, as well as the generality of NARS, have many implications for designing and realizing decision–support systems in crisis response management. This paper is organized as follows. First, the basic assumptions behind constructing an intelligent decision–support system for crisis response management are proposed, and a novel framework under the assumptions of openness to various kinds of uncertainty, and of reasoning and learning with real-time response, is introduced. The following section then briefly introduces a new AI technique, the Non-Axiomatic Reasoning System, especially its advantages in representing and reasoning with different types of uncertainty in crisis knowledge management, as well as its reasoning and learning mechanisms.
Next, we describe a case of urban firefighting as an example of how this framework can be used. Finally, the conclusions and lessons learned are summarized.

2. A framework for crisis response decision–support system

We believe that a crisis response decision–support system should be designed under the following main assumptions:

 Openness to various kinds of uncertainty assumption: The crisis environment is characterized by uncertainty of various types, including randomness, fuzziness, ignorance and inconsistency; therefore the framework should provide the capability of uncertainty representation and processing. Randomness means that events happen with different probabilities. For example, in the urban firefighting domain, the system may assign different probabilities to different types of fire occurring.

https://sites.google.com/site/narswang/home/nars-introduction.







Fuzziness means that the system should have the ability to handle ill-defined categories that have graded membership. Inconsistency means that the system should be able to combine inconsistent evidence observed from different sources.
 Reasoning and learning assumption: The system should possess reasoning and learning capabilities, so as to semi-automatically or automatically create and update domain knowledge through communication with its environment. For example, in the urban firefighting domain, the system should be able to revise its judgments about a fire in light of new evidence, and to learn new beliefs, concepts and skills about firefighting.
 Real-time response assumption: The system should accept a task at any time. Since the utility of decision-making in a crisis is time-related, the system should produce a plan as soon as possible. For example, in the urban firefighting domain, the system should provide the decision maker with "anytime" responses regarding a fire event.

We have stipulated a conceptual framework under the above assumptions for a crisis response decision–support system, as depicted in Fig. 1. The framework includes three main components: a learning mechanism, a reasoning mechanism, and a knowledge base. Normally, the framework communicates with its environment and accepts evidence about a specific crisis event at any time. The learning mechanism automatically creates and updates a "domain schema" for a specific domain, based on interaction with the environment and the results of reasoning. The domain knowledge for a specific crisis type, such as urban firefighting, is not hard-coded into the knowledge base, but is gradually formed by learning and reasoning. In short, the system accumulates knowledge from the given cases and learns from its own experience. The domain schema is revised by various inference rules during the reasoning loop, which provides a reliable inference mechanism that can reason on information with different types and degrees of uncertainty. After some time, the framework's domain schema for a crisis field reaches a relatively stable state, and specific domain knowledge in that field is thereby formed. Besides this automatic learning capability, the system also accepts external knowledge provided by domain experts. When a crisis event occurs, the reasoning mechanism creates, in real time, a plan according to the domain knowledge that best fits the evidence. Within the time constraint, the system may get evidence from the environment at different moments, and go through multiple iterations to revise its plan.
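As a rough illustration, the interaction loop described above can be sketched in code. All names here (KnowledgeBase, CrisisDSS, infer_best_plan) are our own illustrative assumptions, not the authors' implementation; the stand-in methods would be replaced by NARS-style learning and inference in the real framework.

```python
import time

class KnowledgeBase:
    """Stand-in for the framework's knowledge base (domain schemas)."""
    def __init__(self):
        self.evidence = []

    def update(self, item):
        # Learning mechanism (stub): accumulate experience from the environment.
        self.evidence.append(item)

    def infer_best_plan(self):
        # Reasoning mechanism (stub): derive a plan from the current schema.
        return list(self.evidence)

class CrisisDSS:
    """Accepts evidence at any time and revises its plan until a deadline."""
    def __init__(self, kb):
        self.kb = kb
        self.plan = None

    def accept_evidence(self, evidence):
        self.kb.update(evidence)

    def respond(self, deadline):
        # "Anytime" behaviour: a plan is always available, and it is revised
        # through multiple iterations while time remains.
        while True:
            self.plan = self.kb.infer_best_plan()
            if time.monotonic() >= deadline:
                return self.plan

dss = CrisisDSS(KnowledgeBase())
dss.accept_evidence("fire reported at a mall")
plan = dss.respond(time.monotonic() + 0.01)
print(plan)
```

The key design point mirrored here is that the plan is revised iteratively under a deadline, rather than computed once, so a (possibly rough) answer exists whenever the decision maker asks for it.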


3. NARS overview

Artificial General Intelligence (AGI)2 is a new frontier of artificial intelligence research and has received significant attention in recent years. Research on AGI considers intelligence as a whole. NARS is an AGI project aimed at a general-purpose intelligent system. NARS is built in the framework of a reasoning system, with a logic part and a control part. Limited by the paper length, the following description only briefly covers the logic part, which consists of a categorical language (Wang and Hofstadter, 2001), an experience-grounded semantics (Wang, 2005) and a set of inference rules (Wang, 2000a) that also carry out learning (Wang, 2000b). We introduce the uncertainty measurement, the reasoning and learning mechanism, and the real-time response capabilities of NARS in the following subsections.

3.1. Uncertainty measurement in NARS

NARS can consistently represent several types of uncertainty, as well as carry out multiple operations on these uncertainties. The logic used in NARS is Non-Axiomatic Logic (NAL), which is defined on a formal language, Narsese. The experience-grounded semantics links the language to the environment. Accordingly, the meaning of any term and the truth-value of any statement are determined by the available evidence (Wang, 2009). Each piece of knowledge is represented in the form "S r P (f, c)" in Narsese, where S is the subject term of the judgment, P is the predicate term, and r is a copula that represents a conceptual relation. The most fundamental copula is "inheritance", symbolized as "→". For instance, "oil-fire → fire" corresponds to "Oil-fire is a type of fire"; in this statement "oil-fire" is the subject term and "fire" is the predicate term. The truth-value (f, c) is a pair of real numbers defined by the amounts of evidence. The frequency and confidence in all truth-values (of the premises and the conclusion) are treated as extended Boolean variables that take values in [0, 1].
Here f is called "frequency", defined as the ratio w+/w, and c is called "confidence", defined as w/(w + k). In these definitions, w+ and w are the amounts of positive and total evidence, respectively, and k is the constant amount of future evidence. To be specific, frequency indicates the direction of a belief: a value near 1 means the belief is affirmative (or positive), while a value near 0 means the belief is dissenting (or negative). Confidence indicates the stability of a belief: a value near 1 means the belief is already based on a large amount of evidence (so is insensitive to new evidence), while a value near 0 means the belief is based on little evidence (so is sensitive to new evidence). For instance, the sentence "oil-fire → fire (0.9, 0.9)" is mainly based on positive evidence, while the same sentence with truth-value (0.2, 0.9) is mainly based on negative evidence.

3.2. Reasoning and learning mechanism in NARS

The system provides various kinds of inference rules, such as deduction, induction and abduction (Wang, 2005, 2013), which derive new knowledge from existing knowledge. Each of these rules has a truth-value function that calculates the truth-value of the conclusion according to the evidential support provided by the premises. The truth-value functions of NAL are all derived according to the semantics of NAL, and are explained in detail in Wang (2013). In this paper, we only cite the functions, with some intuitive justification.
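The frequency and confidence definitions of Section 3.1 translate directly into code. The sketch below is our own, using the evidential horizon k = 1 (a common default in NARS implementations) to show how the truth-value behaves as evidence accumulates:

```python
K = 1  # constant amount of future evidence (the "evidential horizon")

def truth_value(w_plus, w):
    """Compute (frequency, confidence) from positive evidence w+ and total evidence w."""
    f = w_plus / w     # direction: near 1 is affirmative, near 0 is dissenting
    c = w / (w + K)    # stability: grows toward 1 as evidence accumulates
    return f, c

# Nine positive observations out of ten: a mostly affirmative, fairly stable belief.
f, c = truth_value(9, 10)
print(f, round(c, 2))   # 0.9 0.91
```

Note that confidence never reaches 1: no finite amount of evidence makes a belief immune to future revision, which is the sense in which the system is "non-axiomatic".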

Fig. 1. A framework for crisis response decision–support system.

2 http://www.cis.temple.edu/~pwang/Writing/AGI-Intro.html




Table 1 Revision rule with truth-value function.

Table 2 Conjunction rule with truth-value function.

{M (f1, c1), M (f2, c2)} ⊢ M (f, c)
f = [f1 c1 (1 - c2) + f2 c2 (1 - c1)] / [c1 (1 - c2) + c2 (1 - c1)]
c = [c1 (1 - c2) + c2 (1 - c1)] / [c1 (1 - c2) + c2 (1 - c1) + (1 - c1)(1 - c2)]

{S1 (f1, c1), S2 (f2, c2)} ⊢ (S1 ∧ S2) (f, c)

If at a given moment there are two judgments that have the same content but different truth-values, the revision rule is applied to merge their evidence. The truth-value of the conclusion of revision is calculated by the truth-value function in Table 1; the function is derived from the additivity of the amount of evidence and its relationship with the truth-value. For example, if judgments (1) and (2) come from different sources (the content means "Event-1 is a fifth-level fire")3

(1) {event-1} → ([fifth-level] ∩ fire) (1.0, 0.61)
(2) {event-1} → ([fifth-level] ∩ fire) (1.0, 0.81)

the revision rule can take them as premises and obtain sentence (3):

(3) {event-1} → ([fifth-level] ∩ fire) (1.0, 0.85)

If at a given moment the system comes to two conflicting conclusions that cannot be merged, the choice rule is called upon to select one judgment after comparing their expectation values, which are calculated by the truth-value function in Eq. (1) from the truth-values of the premises:

e = c (f - 0.5) + 0.5.    (1)
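The revision and choice calculations can be checked numerically. The helper functions below are our own sketch of the truth-value function in Table 1 and the expectation value of Eq. (1); they reproduce the merge of (1) and (2) into (3):

```python
def revision(f1, c1, f2, c2):
    """Truth-value function of the revision rule (Table 1)."""
    w1 = c1 * (1 - c2)   # relative weight of the first premise
    w2 = c2 * (1 - c1)   # relative weight of the second premise
    f = (f1 * w1 + f2 * w2) / (w1 + w2)
    c = (w1 + w2) / (w1 + w2 + (1 - c1) * (1 - c2))
    return f, c

def expectation(f, c):
    """Expectation value used by the choice rule, Eq. (1)."""
    return c * (f - 0.5) + 0.5

# Merging (1) and (2): (1.0, 0.61) and (1.0, 0.81) give roughly (1.0, 0.85), as in (3).
f, c = revision(1.0, 0.61, 1.0, 0.81)
print(round(f, 2), round(c, 2))   # 1.0 0.85
```

Since the expectation value grows with confidence at equal frequency, the choice rule prefers the more confident of two judgments that agree in frequency.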

f = f1 f2,  c = c1 c2

For example, if judgments (4) and (5) provide different evaluations of the level of a fire,

(4) {event-1} → ([fourth-level] ∩ fire) (1.0, 0.81)
(5) {event-1} → ([fifth-level] ∩ fire) (1.0, 0.81)

the system has no preference between the two, since they have the same truth-value. However, if the system takes (3) into account, it will be combined with (5) to give a more confident conclusion (6):

(6) {event-1} → ([fifth-level] ∩ fire) (1.0, 0.91)

The choice rule then compares the expectation values of (4) and (6), and takes the one with the higher expectation value, (6), as the conclusion.

We introduced the part of NAL used in the urban firefighting domain in our recent paper (Nady et al., 2014). In this paper we mainly focus on how to represent statements in this field using more complicated structures and higher-order copulas, and on how to carry out hypothetical inference on abstract symbols. NAL can express compound statements, as well as carry out inference on such statements. Consider, for example, sentences (7) and (8):

(7) {event-1} → ordinary-building-fire (1.0, 0.81)

"Event-1 is an ordinary building fire."

(8) {event-1} → ([fifth-level] ∩ fire) (1.0, 0.91)

"Event-1 is a fifth-level fire."

3 Limited by the paper length, the grammar of Narsese is only briefly explained here. For the details of the grammar, including the usage of the symbols, see Wang (2013).

These sentences can be rewritten at the object level as follows, using the conjunction copula between sentences according to Table 2. The conjunction of statements is a compound statement, and the truth-value of the conjunction is determined by those of the components, as explained in Wang (2013).

(9) ({event-1} → ordinary-building-fire (1.0, 0.81), {event-1} → ([fifth-level] ∩ fire) (1.0, 0.91)) ⊢ ({event-1} → ordinary-building-fire ∧ {event-1} → ([fifth-level] ∩ fire)) (1.0, 0.74)

"Event-1 is a fifth-level and ordinary building fire."

Two higher-order copulas, implication and equivalence, are used to express derivation relations among statements. For example, the following sentence corresponds to "Event-1 is a fifth-level fire if its burning volume is more than ten thousand cubic meters"; here "⇒" is the implication copula.

(10) {event-1} → [burning-volume-more-than-ten-thousand-cubic-meters] ⇒ {event-1} → ([fifth-level] ∩ fire) (1.0, 0.9)

In NAL, equivalence is defined as a symmetric implication. For instance, sentence (11) associates two features of the fire event-1 with each other, using the equivalence copula "⇔":

(11) ({event-1} → [burning-area-more-than-one-thousand-square-meters]) ⇔ ({event-1} → [burning-volume-more-than-ten-thousand-cubic-meters]) (1.0, 0.45)

Variable terms are used as symbols for other terms. In inference rules, variable terms can be introduced, unified, or eliminated. With variable terms, the system can carry out hypothetical inferences on abstract symbols, so as to support several advanced types of inference. An independent variable represents any unspecified term under a given restriction, and is named by a word preceded by '$'. For example, the following sentence corresponds to "If some fire event occurs in an ordinary building, then it is an ordinary building fire."

(12) (×, $event, ordinary-building) → located-at ⇒ $event → [ordinary-building-fire] (1.0, 0.9)

To represent a conceptual relation R between terms A and B that is not one of the copulas, the following three statements can be used in Narsese:

 (×, A, B) → R
 A → (/, R, ⋄, B)
 B → (/, R, A, ⋄)

Here '×' is the "product" operator, '/' is the "image" operator, and the symbol "⋄" is the place-holder in the image of a relation; therefore the statement above can also be written as

(12)′ ($event → (/, located-at, ⋄, ordinary-building)) ⇒ $event → [ordinary-building-fire] (1.0, 0.9)

Since a variable represents another term, one common operation in variable-related inference is "substitution". Most of the



inference rules defined in NAL can be extended to take statements with variables as premises or conclusions, by applying a proper substitution on the premises or on the conclusion. Consequently, most of the new inference rules with variables are obtained by adding unification and substitution to the variable-free rules. The independent-variable elimination deduction rule is given in Table 3, and it can be seen as carrying out a substitution followed by an inference defined without variables. For example, based on statements (12)′ and (13), the system can come to conclusion (14) using the independent-variable elimination deduction rule.

(13) {event-1} → (/, located-at, ⋄, ordinary-building) (1.0, 0.9)

"Event-1 is located at an ordinary building."
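The deduction rule's truth-value function (f = f1 f2, c = f1 f2 c1 c2, as cited from Table 3) can be checked against this example; the helper below is our own sketch:

```python
def deduction(f1, c1, f2, c2):
    """Truth-value function of the independent-variable elimination
    deduction rule (Table 3): f = f1*f2, c = f1*f2*c1*c2."""
    f = f1 * f2
    c = f1 * f2 * c1 * c2
    return f, c

# Rule (12)' with truth (1.0, 0.9) and report (13) with truth (1.0, 0.9)
# yield conclusion (14), "Event-1 is an ordinary building fire", with (1.0, 0.81).
f, c = deduction(1.0, 0.9, 1.0, 0.9)
print(f, round(c, 2))   # 1.0 0.81
```

Note that deduction never increases confidence: the conclusion is at most as reliable as its premises, which matches the drop from 0.9 to 0.81 in (14).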

Table 3. Independent-variable elimination deduction rule.
{$x → M ⇒ $x → P (f1, c1), S → M (f2, c2)} ⊢ S → P (f, c)
f = f1 f2,  c = f1 f2 c1 c2

(14) {event-1} → [ordinary-building-fire] (1.0, 0.81)

"Event-1 is an ordinary building fire."

Another feature of NAL is its unified treatment of reasoning and learning (Wang, 2000b, 2013). Most of the learning capability in the system is carried out by the inference rules, so there is no separate "reasoning module" and "learning module". The system's learning capability is displayed in several forms: the system can derive new beliefs from existing beliefs using non-deductive inference (induction, abduction, analogy, and so on), as well as revise existing beliefs according to new evidence. The system can also compose new compound terms to capture repeatedly appearing patterns in its experience. Furthermore, it can adjust the relative priority of beliefs and concepts according to their experienced usefulness and relevance. Since this paper is focused on the reasoning aspect of NAL, for the learning aspects please see Wang (2006, 2013). In short, NAL consistently represents several types of uncertainty in a normative model, and carries out multiple types of inference on uncertain beliefs. Its learning process is integrated with reasoning; that is, the system is able to revise or update its previous knowledge about a statement, and to generate new knowledge, at each step. Meanwhile, NARS can handle new knowledge and problems that may come to the system at any moment, and a problem usually has a time requirement for its solution; its real-time mechanism therefore differs from that of most current AI systems, which do not consider time constraints at run time. Furthermore, NARS provides a clear separation between the domain-independent design of the system and its domain-specific content, so new domains can be handled without changing the design.

4. A case study in urban firefighting

In this section we illustrate a representative fire case through the proposed framework, and we apply Non-Axiomatic Logic to demonstrate its reasoning mechanism. On the 15th of February 2004, a big fire occurred at the Zhong Bai Mall in the city of Jilin, China. The mall has four floors. The area of the building is 4328 sq. meters and its height is 20.65 meters, giving it a volume of 89,373 cu. meters. There were 190 people trapped inside. After receiving the initial call, the Fire Command Center of Jilin sent 67 firefighters, 18 tanker trucks, and 2 long extension ladder trucks to the mall. However, it was soon determined that more personnel and equipment would be needed to fight the blaze, so officials sent an additional 7 Public Security fire teams (with about 50 members in each team), 42 firefighting tankers, and 98 firefighters. All together, about 515 firefighters, 60 tanker trucks and 2 long extension ladder trucks were involved in the rescue activities. For a case like this, the process of making a response plan can be divided into two stages:

 Estimating the level and type of the fire, according to the detection evidence on the fire level and the fire type assessment rules.
 Suggesting a response plan according to the type and level of the fire, based on the rules for assigning personnel and equipment to the fire.

4.1. Domain knowledge in urban firefighting

In the framework, domain knowledge is used in inferring the solution for a responder facing a fire report. The domain knowledge is a standard operating procedure or a routine learned from experience (a domain schema) which forms the basis for a new course of action. In the test, the urban fire rescue rules are described in Narsese grammar (see the appendix). These rules are used to decide the fire level and fire type, and then to assign people and equipment to the fire, as shown in Fig. 2.

Fig. 2. Urban firefighting domain knowledge. (H: high-rise-building fire, O: oil fire, G: gas fire, T: toxic-gas fire, U: underground-space fire, R: ordinary-building fire. 1 to 5 represent the levels of fire, from low to high.)

4.2. Case processing

Case 1: In this case, we assume that the system communicates with the user and receives reports regarding the fire. We also assume that the system has domain knowledge about urban firefighting. The system creates its domain schema about this fire, and then updates it based on new evidence coming to the system from time to time. The following factual information is given to the system, where each Narsese sentence is followed by its English translation.

4.2.1. Determining the level of fire

When determining the level of fire, the system initially depends on the reports on the area and volume of the building where the fire occurred.

(1) {event-1} → [burning-area-more-than-one-thousand-square-meters] (1.0, 0.9)

"The burning area of event-1 is more than 1000 m²."




(2) {event-1} → [burning-volume-more-than-ten-thousand-cubic-meters] ⟨1.0, 0.9⟩

"The burning volume of event-1 is more than 10,000 m³."

Assuming the system also knows the firefighting procedural knowledge (1) and (2) in the appendix, then from report (1) and rule (1), and from report (2) and rule (2), it reaches conclusions (3) and (4) respectively, using the independent-variable elimination deduction rule:

(3) {event-1} → ([fifth-level] ∩ fire) ⟨1.00, 0.81⟩

"Event-1 is probably a fifth-level fire."

(4) {event-1} → ([fifth-level] ∩ fire) ⟨1.00, 0.81⟩

"Event-1 is probably a fifth-level fire."

Since (3) and (4) come from disjoint bodies of evidence, the revision rule merges them and obtains

(5) {event-1} → ([fifth-level] ∩ fire) ⟨1.0, 0.9⟩

"Event-1 is a fifth-level fire."

In this step, we provide the system with two pieces of positive evidence (the area and volume of the fire) relating to event-1. After inference, the system concludes that event-1 is a fifth-level fire, since the characteristics of the event match those of the category. Now, assume the system receives another report regarding the number of trapped people:

(6) {event-1} → [large-number-trapped-people] ⟨1.0, 0.9⟩

"There are a large number of trapped people in event-1."

Report (6) matches rules (3) and (4) in the appendix, so the system derives the following two conclusions:

(7) {event-1} → ([fourth-level] ∩ fire) ⟨1.0, 0.81⟩

"Event-1 is a fourth-level fire."

(8) {event-1} → ([fifth-level] ∩ fire) ⟨1.0, 0.81⟩

"Event-1 is a fifth-level fire."

The system now has two candidate answers, (7) and (8), for the level of the fire. To resolve this conflict, it first combines judgments (8) and (5) by the revision rule, which yields

(9) {event-1} → ([fifth-level] ∩ fire) ⟨1.0, 0.93⟩

"Event-1 is a fifth-level fire."

The system then makes a selection between (9) and (7) based on the choice rule. After calculating the expectation values, the choice rule selects candidate (9), since it has the higher expectation value. In this way, as additional evidence arrives, the system's confidence regarding the level of event-1 increases. In this step, the system applied the independent-variable elimination deduction rule to eliminate the variable in the domain knowledge, while the revision and choice rules were applied to handle inconsistent evidence in its domain schema. In addition to updating its domain schema as new evidence is considered, the system can give the user a conclusion at any time once a more reliable answer has been found.

4.2.2. Determining the type of fire

The recognition of the type of fire depends heavily on the location of the fire. Suppose the system receives the following report:

(10) {event-1} → (/ located-at ⋄ mall) ⟨1.0, 0.9⟩

"Event-1 is located at a mall."

This report is then used with the following domain knowledge:

(11) mall → ordinary-building ⟨1.0, 0.9⟩

"A mall is a kind of ordinary building."

The system comes to the following conclusion using the deduction rule:

(12) {event-1} → (/ located-at ⋄ ordinary-building) ⟨1.0, 0.81⟩

"Event-1 is located at an ordinary building."

Assuming the system also knows the firefighting domain knowledge (5) in the appendix, from (12) and rule (5) it draws the following conclusion using the independent-variable elimination deduction rule:

(13) {event-1} → ordinary-building-fire ⟨1.0, 0.73⟩

"Event-1 is an ordinary-building fire."

Based on the fire type, the system now needs to determine the spread speed of the fire. Assume it has received the following reports from different observers:

(14) ordinary-building-fire → (/ spread-speed ⋄ [rapid]) ⟨0.6, 0.7⟩

"The spread speed of an ordinary-building fire may be rapid."

(15) ordinary-building-fire → (/ spread-speed ⋄ [rapid]) ⟨0.8, 0.8⟩

"The spread speed of an ordinary-building fire is probably rapid."

From the available evidence relating to the spread speed of the fire, the system considers the spread speed of an ordinary-building fire to be probably rapid, since this corresponds to the most confident evidence. Applying the deduction rule to (13) and (15), the system then concludes

(16) {event-1} → (/ spread-speed ⋄ [rapid]) ⟨0.8, 0.47⟩

"The spread speed of event-1 is probably rapid."

The system now has domain schema (13) and (16), and according to the conjunction rule it arrives at the following conclusion:

(17) ({event-1} → ordinary-building-fire) ∧ ({event-1} → (/ spread-speed ⋄ [rapid])) ⟨0.7, 0.34⟩

"Event-1 is probably an ordinary-building fire, and its spread speed is probably rapid."

In this step, we provide the system with four reports on the type of event-1. Among them, the report regarding the location of the fire can be uncertain because of the randomness involved, while the two regarding the spread speed can be uncertain because of the fuzzy categories they use. Here the frequency value is closer to a "degree of membership" as discussed in fuzzy logic (Zadeh, 1979; Wang, 1996) than to a probability value in the usual sense. These two types of uncertainty are combined into truth-values in NAL. The system's domain schema is likewise updated by inference rules such as deduction, independent-variable elimination deduction, and conjunction.
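The revision and choice steps above can be sketched in a few lines of Python, assuming the standard NAL mapping between truth-values and amounts of evidence with evidential horizon k = 1 (the function names are ours):

```python
# NAL revision and choice, as used in steps (3)-(9) of the case study.
# Truth-values <f, c> map to evidence via w = k*c/(1-c) and w+ = f*w.

K = 1.0  # evidential horizon

def revision(f1, c1, f2, c2):
    """Merge two judgments based on disjoint bodies of evidence."""
    w1, w2 = K * c1 / (1 - c1), K * c2 / (1 - c2)
    w_pos = f1 * w1 + f2 * w2     # total positive evidence
    w = w1 + w2                   # total evidence
    return w_pos / w, w / (w + K)

def expectation(f, c):
    """Expectation value used by the choice rule to rank candidates."""
    return c * (f - 0.5) + 0.5

# (3) <1.0, 0.81> merged with (4) <1.0, 0.81> gives (5) <1.0, 0.9>:
f, c = revision(1.0, 0.81, 1.0, 0.81)
print(f, round(c, 2))             # 1.0 0.9

# (8) <1.0, 0.81> merged with (5) <1.0, 0.9> gives (9) <1.0, 0.93>:
f, c = revision(1.0, 0.81, 1.0, 0.9)
print(f, round(c, 2))             # 1.0 0.93

# The choice rule prefers (9) over (7) because its expectation is higher:
print(expectation(1.0, 0.93) > expectation(1.0, 0.81))  # True
```

Note how revision is the only rule in this chain whose conclusion is more confident than either premise: accumulating disjoint evidence is what allows the system's belief about the fire level to stabilize.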



4.2.3. Suggested solution

The system suggests a solution to event-1 according to the type and level concluded in Sections 4.2.1 and 4.2.2. It now has domain schema (9) and (17), and according to the conjunction rule it obtains the following conclusion:

(18) ({event-1} → ([fifth-level] ∩ fire)) ∧ ({event-1} → ordinary-building-fire) ∧ ({event-1} → (/ spread-speed ⋄ [rapid])) ⟨0.7, 0.32⟩

"Event-1 is probably a fifth-level ordinary-building fire, and the spread speed of the fire is probably rapid."

Then, assuming the system also knows rule (6) in the appendix, response plan (19) is generated:

(19) {event-1} → (solution → (/ send ⋄ (one-thousand-firemen ∩ eighteen-water-tankers ∩ three-long-extension-ladder-trucks))) ⟨0.7, 0.2⟩

"You probably should send at least one thousand firemen, eighteen water tankers and three long extension ladder trucks in response to event-1."

At the stage of suggesting a solution to the fire, the system finally creates a plan from its domain schema and domain knowledge by the conjunction and independent-variable elimination deduction rules. The response plan has a relatively high frequency value but a relatively low confidence value, which means the system does not place very high confidence in the solution. At the current stage, the goal of the case study is not to reach the same conclusions as the domain experts, since many factors are not yet included in the decision making. Instead, it is to confirm that the factors the system does consider are processed in a reasonable way.

Case 2: In this case we introduce the ability to generate fire-solution knowledge through learning. The system can revise its domain knowledge in light of new evidence (data, experience), and learn new beliefs, concepts, and skills. As a result, the agent gradually forms fire response rules. For example, from (1) and (2) in Case 1, the system applies the comparison rule to generate the following hypothesis:

(20) ({$event} → [burning-area-more-than-one-thousand-square-meters]) ⇔ ({$event} → [burning-volume-more-than-ten-thousand-cubic-meters]) ⟨1.0, 0.45⟩

This hypothesis, produced by the comparison rule, associates two features of an event that may never have been related in the system's beliefs before. Compared to the deductive rules, the non-deductive rules of NAL (induction, abduction, comparison, etc.) are "weak" in the sense that each step considers only one piece of evidence, so the confidence of their conclusions is relatively low. Only after much supporting evidence has been accumulated by the revision rule can a conclusion obtain a high confidence value and therefore become stable.

When a new low-confidence belief is created, the system usually cannot fully determine its usefulness. Instead, it gives each belief (whether given or derived) a priority value, which is then adjusted according to the belief's usefulness. Here "learning" and "reasoning" are carried out by the same underlying process.

4.3. Comparison and discussion

The framework for crisis decision-support systems with reasoning and learning capabilities proposed in this paper differs from others reported in the literature in the following aspects:

• Theoretical assumptions: The main contribution of this paper is a framework for crisis decision-support systems under the assumption of openness to various kinds of uncertainty, with reasoning and learning capabilities and real-time response. To examine the effect of the conceptual framework, a real case in the urban firefighting field was studied, and the results support our assumptions. A conceptual model for critical incident management systems is proposed in Kim et al. (2007); that work explains variations in the efficiency of decision support from the perspective of technologies, processes, and external factors. In comparison, we highlight the theoretical assumptions needed to construct an intelligent decision-support system for crisis management.

• New AI technology: In contrast to the other decision-support systems for crisis response in our literature review, we applied a new AI technology, NAL, to represent crisis knowledge under uncertainty. Its uncertainty representation differs significantly from the Bayesian approach (Russell and Norvig, 1995), fuzzy logic (Zadeh, 1979), and Dempster–Shafer theory (Shafer, 1976): NAL combines various measurements of different types of uncertainty into a unified treatment (Wang, 1995). Moreover, this technique provides the system with various kinds of reasoning rules, such as comparison, revision, and choice, rather than deduction only. The results of the above case study demonstrate that a framework assuming openness to various kinds of uncertainty, with reasoning, learning, and real-time response capabilities, can be realized using the uncertainty representation and reasoning rules of this technique.
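As a concrete example of one of these non-deductive rules, the comparison step that produced hypothesis (20) in Case 2 can be sketched as follows (assuming the standard NAL comparison truth function with evidential horizon k = 1; the function name is ours):

```python
# NAL comparison: from two premises sharing a term, derive a similarity
# (equivalence) hypothesis S <-> P with a deliberately low confidence.

def comparison(f1, c1, f2, c2, k=1.0):
    """Return the truth-value <f, c> of the comparative conclusion."""
    f0 = f1 + f2 - f1 * f2            # "or" of the two frequencies
    f = (f1 * f2 / f0) if f0 > 0 else 0.0
    w = f0 * c1 * c2                  # amount of evidence behind the analogy
    return f, w / (w + k)

# From rules (1) and (2) in Case 1, both <1.0, 0.9>, the system obtains
# hypothesis (20) with truth-value <1.0, 0.45>:
f, c = comparison(1.0, 0.9, 1.0, 0.9)
print(f, round(c, 2))   # 1.0 0.45
```

The confidence 0.45 is well below that of any deductive conclusion in Case 1, which is exactly the "weak rule" behavior described above: such a hypothesis only becomes stable after repeated revision with further supporting evidence.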

5. Conclusion

Because all kinds of crises, whether natural or technological, occur more and more frequently, most decision-support systems for crisis response have been designed according to theories and technologies from information science and artificial intelligence. In this paper, we presented a brief overview of decision-support systems for crisis response and identified the challenges that seriously limit the effectiveness of these systems. Specifically, traditional systems not only lack proper representation and reasoning abilities for handling uncertainty, but also cannot adapt to their environments through learning, as a crisis decision-making environment requires. Moreover, most of these systems ignore the requirements of real-time response and generality.

To address these issues, we presented the design assumptions of a crisis response decision-support system and proposed a corresponding framework with reasoning and learning capabilities; the real-time response and generality of the system are considered in this framework as well. Artificial general intelligence (AGI) technologies have advanced remarkably since the beginning of this century and can account for complex crisis situations. The Non-Axiomatic Reasoning System, an AGI system, and its logical part, Non-Axiomatic Logic, have been applied to knowledge representation and reasoning in the framework. We have shown the reasoning and learning capabilities of the framework in an urban firefighting domain using this AI technology. The preliminary results show that the framework provides a promising approach to intelligent decision-support systems in crisis management. Our future work is to implement a crisis response decision-support system with self-learning and real-time features based on this conceptual framework.




Acknowledgments

This work was supported by the Major Project of the National Social Science Fund (No. 14ZDB153) and the major research plan of the National Natural Science Foundation of China (Nos. 91224009 and 51438009).

Appendix A. Sample domain knowledge in urban firefighting

(1) {$event} → [burning-area-more-than-one-thousand-square-meters] ⇒ {$event} → ([fifth-level] ∩ fire) ⟨1.0, 0.9⟩

"If a fire incident's burning area is more than one thousand square meters, then it is a fifth-level fire."

(2) {$event} → [burning-volume-more-than-ten-thousand-cubic-meters] ⇒ {$event} → ([fifth-level] ∩ fire) ⟨1.0, 0.9⟩

"If a fire incident's burning volume is more than ten thousand cubic meters, then it is a fifth-level fire."

(3) {$event} → [large-number-trapped-people] ⇒ {$event} → ([fourth-level] ∩ fire) ⟨1.0, 0.9⟩

"If there are a large number of trapped people in a fire incident, then it is a fourth-level fire."

(4) {$event} → [large-number-trapped-people] ⇒ {$event} → ([fifth-level] ∩ fire) ⟨1.0, 0.9⟩

"If there are a large number of trapped people in a fire incident, then it is a fifth-level fire."

(5) {$event} → (/ located-at ⋄ ordinary-building) ⇒ {$event} → ordinary-building-fire ⟨1.00, 0.90⟩

"If a fire event occurs in an ordinary building, then it is an ordinary-building fire."

(6) ({$event} → ([fifth-level] ∩ fire)) ∧ ({$event} → ordinary-building-fire) ∧ ({$event} → (/ spread-speed ⋄ [rapid])) ⇒ (/ send ⋄ (one-thousand-firemen ∩ eighteen-water-tankers ∩ three-long-extension-ladder-trucks)) → solution ⟨1.0, 0.9⟩

"If a fire incident is a fifth-level fire in an ordinary building with a rapid spread speed, then the solution is to send at least one thousand firemen, eighteen water tankers and three long extension ladder trucks in response."

References

Alnahhas, A., Alkhatib, B., 2012. Decision support system for crisis management using temporal fuzzy logic. In: 6th International Conference on Application of Information and Communication Technologies (AICT), 17–19 October, Tbilisi, Georgia.
Asghar, S., Alahakoon, D., Churilov, L., 2008. Categorization of disaster decision support needs for the development of an integrated model for DMDSS. Int. J. Inf. Technol. Decis. Mak. 7, 115–145.

Balducelli, C., Costanzo, G.D., Gadomski, A., 2000. A prototype of an active decision support system for automatic planning support in emergency management. In: Seventh Annual Conference of the International Emergency Management Society (TIEMS 2000), Orlando, Florida. Chen, W.K., Sui, G.J., Tang, D.L., 2011. A fuzzy intelligent decision support system for typhoon disaster management. IEEE Int. Conf. Fuzzy Syst., June 27–30, Taipei, Taiwan. David, M., 2007. Decision support for improvisation in response to extreme events: learning from the response to the 2001 World Trade Center attack. J. Decis. Support Syst. 43 (3), 952–967. David, M., William, W., 2007. A cognitive model of improvisation in emergency management. IEEE Trans. Syst. Man Cybern. Part A: Syst. Hum. 37 (4), 547–561. Gonzalez, A., 2009. Crisis response simulation combining discrete-event and agent-based modeling. In: 6th International Conference on Information Systems for Crisis Response and Management: Boundary Spanning Initiatives and New Perspectives. Jihan, H., Segev, A., 2013. Context ontology for humanitarian assistance in crisis response. In: Proceedings of the 10th International ISCRAM Conference, May 12–15, Baden-Baden, Germany. Kandel, A., Tamir, D., Rishe, N.D., 2014. Fuzzy logic and data mining in disaster mitigation. In: Horia, N., Alan, K. (Eds.), Improving Disaster Resilience and Mitigation-IT Means and Tools. Springer, Netherlands, pp. 167–186. Klein, A., 1993. A recognition-primed decision (RPD) model of rapid decision making. Decis. Mak. Act.: Models Methods 5, 138–147. Klein, A., Calderwood, R., 1991. Decision models: some lessons from the field. IEEE Trans. Syst. Man Cybern. 21 (5), 1018–1026. Kim, J.K., Sharman, R., Rao, H.R., Upadhyaya, S., 2007. Efficiency of critical incident management systems: instrument development and validation. Decis. Support Syst. 44 (1), 235–250. Li, K., Miska, E., 1991. Fire-fighter: a decision support system for fire management. 
In: IEEE Pacific Rim Conference on Communications, Computers and Signal Processing, May 9–10, Victoria, BC, Canada. Nady, S., Wenjun, W., Wang, P., 2014. Improvisational decision-making agent based on non-axiomatic reasoning system. In: The IEEE/WIC/ACM International Conference on Intelligent Agent Technology, August 11–14, Warsaw, Poland. Nowroozi, A., Shiri, M.E., Aslanian, A., Lucas, C., 2012. A general computational recognition primed decision model with multi-agent rescue simulation benchmark. Inf. Sci. 187, 52–71. Patrick, L., Benjamin, T., 2012. How crises model the modern world. Int. J. Risk Anal. Crisis Response 2, 21–33. Robert, B., Richard, C., Timothy, D., 1999. Federal Emergency Management Information System Administration Guide Version 1.5.3. 〈http://www.osti.gov/scitech/ search.jsp〉. Robert, P., Bella, R., Catherina, W., David, R., Geoffrey, S., Michael, C., 2015. The emergency response intelligence capability tool. In: Proceedings of 11th IFIP WG 5.11 International Symposium, March 25–27, Australia. Rosenthal, U., Boin, A., Comfort, L., 2001. Managing Crises: Threats, Dilemmas, Opportunities. Charles C. Thomas, Springfield, MA. Rosenthal, U., Charles, M., Hart, P., 1989. Coping with Crises: the Management of Disasters, Riots and Terrorism. Charles C. Thomas, Springfield, MA. Russell, S., Norvig, P., 1995. Artificial Intelligence – A Modern Approach. Englewood Cliffs, Prentice-Hall. Stern, E., Sundelius, B., 2002. Crisis management Europe: an integrated regional research and training program. Int. Stud. Perspect. 3, 71–88. Shafer, G., 1976. A Mathematical Theory of Evidence. Princeton University Press, Princeton. Tan, Qing-quan, Liu, Q., Bo, T., et al. 2011. An earthquake emergency command system based on GIS and RS. In: Proceedings of International Conference on Information Systems for Crisis Response and Management (ISCRAM), 25–27 November, Lisbon, Portugal. Wang, P., 1995. Reference classes and multiple inheritances. Int. J. Uncertain. 
Fuzziness Knowl.-based Syst. 3, 79–91. Wang, P., 1996. The interpretation of fuzziness. IEEE Trans. Syst. Man Cybern. Part B: Cybern. 26 (4), 321–326. Wang, P., 2000a. Unified inference in extended syllogism. In: Flach, P., Kakas, A. (Eds.), Abduction and Induction. Kluwer Academic Publishers, Dordrecht, pp. 117–129. Wang, P., 2000b. A Logic of learning. The AAAI workshop on New Research Problems for Machine Learning, 37–40, Austin, Texas. Wang, P., Hofstadter, D., 2001. The logic of categorization. J. Exp. Theor. Artif. Intell. 18 (2), 193–213. Wang, P., 2005. Experience-grounded semantics: a theory for intelligent systems. Cognit. Syst. Res. 6, 282–302. Wang, P., 2006. Rigid Flexibility: The Logic of Intelligence. Springer, New York. Wang, P., 2009. Formalization of evidence: a comparative study. J. Artif. Gen. Intell. 1, 25–53. Wang, P., 2013. Non-Axiomatic Logic: A Model of Intelligent Reasoning. World Scientific, Singapore. Zadeh, L.A., 1979. A theory of approximate reasoning. Mach. Intell. 9, 149–194.
