International Journal of Approximate Reasoning xxx (2013) xxx–xxx
Multigranulation decision-theoretic rough sets

Yuhua Qian a, Hu Zhang b, Yanli Sang b, Jiye Liang a,∗

a Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, Shanxi University, Taiyuan, 030006 Shanxi, China
b School of Computer and Information Technology, Shanxi University, Taiyuan, 030006 Shanxi, China
ABSTRACT

The Bayesian decision-theoretic rough sets propose a framework for studying rough set approximations using probabilistic theory, which can interpret the parameters of existing forms of probabilistic approaches to rough sets. Exploring rough sets from the viewpoint of multigranulation is becoming one of the desirable directions in rough set theory, in which lower/upper approximations are approximated by granular structures induced by multiple binary relations. By combining these two ideas, the objective of this study is to develop a new multigranulation rough set model, called a multigranulation decision-theoretic rough set. Many existing multigranulation rough set models can be derived from the multigranulation decision-theoretic rough set framework.
© 2013 Elsevier Inc. All rights reserved.

Keywords: Decision-theoretic rough sets; Granular computing; Multigranulation; Bayesian decision theory
1. Introduction

Rough set theory, originated by Pawlak [24,25], has become a well-established theory for uncertainty management in a wide variety of applications related to pattern recognition, image processing, feature selection, neural computing, conflict analysis, decision support, data mining and knowledge discovery [3,5,10,11,15,16,28–31,34,36,41,55]. In the past ten years, several extensions of the rough set model have been proposed to meet various requirements, such as the decision-theoretic rough set model (see [51]), the variable precision rough set (VPRS) model (see [56,58]), the rough set model based on tolerance relations (see [12–14]), the Bayesian rough set model (see [37]), the dominance-based rough set model (see [4]), the game-theoretic rough set model (see [6,7]), and the fuzzy rough set and rough fuzzy set models (see [2]). Recently, probabilistic rough sets have received close attention [8,45,48,50,52]. A special issue on probabilistic rough sets, in which six related papers were published, was set up in the International Journal of Approximate Reasoning [48]. Yao presented a new decision making method based on the decision-theoretic rough set, which is constructed from the positive region, the boundary region and the negative region [52]. In the literature [50], the author further emphasized the superiority of three-way decisions in probabilistic rough set models. In fact, probabilistic rough sets are developed based on the Bayesian decision principle, in which the parameters can be learned from a given decision table. Three-way decisions are one of the main advantages of probabilistic rough set models. The decision-theoretic rough sets can derive various existing rough set models by setting the thresholds α and β. Since the decision-theoretic rough set was proposed by Yao [49], it has attracted more and more attention.
Azam and Yao [1] proposed a threshold configuration mechanism for reducing the overall uncertainty of probabilistic regions in probabilistic rough sets. Jia et al. [9] developed an optimization representation of the decision-theoretic rough set model, and gave a heuristic approach and a particle swarm optimization approach for searching for an attribute reduct with minimum cost. Liu et al. [23] combined logistic regression and the decision-theoretic rough set into a new classification approach, which can effectively reduce the misclassification rate. Yu et al. [53] applied the decision-theoretic rough set model to automatically determining the number of clusters with much smaller time cost.

∗ Corresponding author. Tel./fax: +86 0351 7018176.
E-mail addresses: [email protected] (Y. Qian), [email protected] (H. Zhang), [email protected] (Y. Sang), [email protected] (J. Liang).

0888-613X/$ - see front matter © 2013 Elsevier Inc. All rights reserved. http://dx.doi.org/10.1016/j.ijar.2013.03.004
Please cite this article in press as: Y. Qian et al., Multigranulation decision-theoretic rough sets, Int. J. Approx. Reason (2013), http://dx.doi.org/10.1016/j.ijar.2013.03.004
From the viewpoint of granular computing (proposed by Zadeh [54]), in existing rough set models a general concept described by a set is always characterized via the so-called upper and lower approximations under a single granulation, i.e., the concept is depicted by known knowledge induced from a single relation (such as an equivalence relation, a tolerance relation or a reflexive relation) on the universe [17,18,51]. For convenience, this kind of rough set model is called a single granulation rough set (SGRS). In many circumstances, we often need to describe a target concept concurrently through multiple binary relations, according to a user's requirements or the targets of problem solving. Based on this consideration, Qian et al. [26–28] introduced multigranulation rough set theory (MGRS) to apply rough set theory more widely in practical applications, in which lower/upper approximations are approximated by granular structures induced by multiple binary relations. From the viewpoint of rough set applications, multigranulation rough set theory is very desirable in many real applications, such as multi-source data analysis, knowledge discovery from data with high dimensions, and distributive information systems. Since the multigranulation rough set was proposed by Qian in 2006 [26], its theoretical framework has been largely enriched, and many extended multigranulation rough set models, together with related properties and applications, have also been proposed and studied [27–32]. Wu and Leung [39] proposed a formal approach to granular computing with multi-scale data measured at different levels of granulation, and studied the theory and applications of granular labelled partitions in multi-scale decision information systems. Tripathy et al. [38] developed an incomplete multigranulation rough set in the context of intuitionistic fuzzy rough sets and gave some important properties of the new rough set model.
Raghavan and Tripathy [33] first researched the topological properties of multigranulation rough sets. Based on the idea of multigranulation rough sets, Xu et al. [42–44] developed a variable multigranulation rough set model, a fuzzy multigranulation rough set model and an ordered multigranulation rough set model. Wu [40] extended classical multigranulation rough sets to a version based on a fuzzy relation, and proposed a new multigranulation fuzzy rough set (MGFRS). Zhang et al. [57] defined a variable precision multigranulation rough set, in which the optimistic multigranulation rough set and the pessimistic one can be regarded as two extreme cases. By introducing some membership parameters, this model becomes a multigranulation rough set with dynamic adaptation according to practical requirements. Yang et al. [46,47] examined fuzzy multigranulation rough set theory, and revealed the hierarchical structure properties of multigranulation rough sets. Liu and Miao [21,22] established a multigranulation rough set approach in covering contexts. Liang et al. [19] presented efficient feature selection algorithms for large-scale data based on a multigranulation strategy. She et al. [35] explored the topological structures and the properties of multigranulation rough sets. Lin et al. [20] gave a neighborhood multigranulation rough set model for multigranulation rough data analysis in the context of hybrid data. In multigranulation rough set theory, each binary relation determines a corresponding information granulation, which largely impacts the commonality between each of the granulations and the fusion among all granulations.
As one of the most important rough set models, the decision-theoretic rough set (DTRS) has still not been researched in the context of multigranulation, which limits its further application to many problems, such as multi-source data analysis, knowledge discovery from data with high dimensions, and distributive information systems. In what follows, besides the motivations mentioned in the first multigranulation rough set paper (see Cases 1–3 in the literature [29]), we further emphasize the specific interest of multigranulation rough sets, which can be illustrated from the following three aspects.
• Multigranulation rough set theory is a kind of information fusion strategy built on single granulation rough sets.
• The optimistic version and the pessimistic version are only two simple methods among these information fusion approaches, used to easily introduce multigranulation ideas into rough set theory. In fact, there are some other fusion strategies [20,39–41,43]. For instance, in the literature [39], Xu et al. introduced a supporting characteristic function and a variable precision parameter β, called an information level, to investigate a target concept under majority granulations.
• With regard to some special information systems, such as multi-source information systems, distributive information systems and groups of intelligent agents, classical rough sets cannot be used for data mining from these information systems, but multigranulation rough sets can.
In this study, our objective is to develop a new multigranulation rough decision theory by combining the multigranulation idea and Bayesian decision theory, called multigranulation decision-theoretic rough sets (MG-DTRS). We mainly give three common forms: the mean multigranulation decision-theoretic rough sets, the optimistic multigranulation decision-theoretic rough sets, and the pessimistic multigranulation decision-theoretic rough sets. The study is organized as follows. Some basic concepts in classical rough sets and multigranulation rough sets are briefly reviewed in Section 2. In Section 3, we first analyze the loss function and the entire decision risk in the context of multigranulation. Then, we propose the three multigranulation decision-theoretic rough set forms: the mean, the optimistic, and the pessimistic multigranulation decision-theoretic rough sets. When the thresholds satisfy a special constraint, the multigranulation decision-theoretic rough sets degenerate into one of various variants of multigranulation rough sets. In Section 4, we establish the relationships among multigranulation decision-theoretic rough sets (MG-DTRS), other MGRS models, single granulation decision-theoretic rough sets (SG-DTRS) and other SGRS models. Finally, Section 5 concludes this paper with some remarks and discussions.
2. Preliminary knowledge on rough sets

In this section, we review some basic concepts such as information systems, Pawlak's rough set, and the optimistic multigranulation rough set. Throughout this paper, we assume that the universe U is a finite non-empty set.

2.1. Pawlak's rough set

Formally, an information system can be considered as a pair I = ⟨U, AT⟩, where

• U is a non-empty finite set of objects, called the universe;
• AT is a non-empty finite set of attributes, such that ∀a ∈ AT, V_a is the domain of attribute a.

∀x ∈ U, we denote the value of x under the attribute a (a ∈ AT) by a(x). Given A ⊆ AT, an indiscernibility relation IND(A) can be defined as

IND(A) = {(x, y) ∈ U × U : a(x) = a(y), ∀a ∈ A}.  (1)

The relation IND(A) is reflexive, symmetric and transitive, so IND(A) is an equivalence relation. By the indiscernibility relation IND(A), one can derive the lower and upper approximations of an arbitrary subset X of U. They are defined as

\underline{A}(X) = {x ∈ U : [x]_A ⊆ X}  and  \overline{A}(X) = {x ∈ U : [x]_A ∩ X ≠ ∅},  (2)
respectively, where [x]_A = {y ∈ U : (x, y) ∈ IND(A)} is the A-equivalence class containing x. The pair [\underline{A}(X), \overline{A}(X)] is referred to as the Pawlak rough set of X with respect to the set of attributes A.

2.2. Multigranulation rough sets

The multigranulation rough set (MGRS) differs from Pawlak's rough set model in that the former is constructed on the basis of a family of indiscernibility relations instead of a single indiscernibility relation. In the optimistic multigranulation rough set approach, the word "optimistic" expresses the idea that, among multiple independent granular structures, we need only at least one granular structure to satisfy the inclusion condition between an equivalence class and the approximated target. The upper approximation of the optimistic multigranulation rough set is defined through the complement of the lower approximation.

Definition 1 [32]. Let I be an information system in which A_1, A_2, ..., A_m ⊆ AT, then ∀X ⊆ U, the optimistic multigranulation lower and upper approximations are denoted by \underline{∑_{i=1}^m A_i}^O(X) and \overline{∑_{i=1}^m A_i}^O(X), respectively, where

\underline{∑_{i=1}^m A_i}^O(X) = {x ∈ U : [x]_{A_1} ⊆ X ∨ [x]_{A_2} ⊆ X ∨ ⋯ ∨ [x]_{A_m} ⊆ X};  (3)

\overline{∑_{i=1}^m A_i}^O(X) = ∼(\underline{∑_{i=1}^m A_i}^O(∼X));  (4)

where [x]_{A_i} (1 ≤ i ≤ m) is the equivalence class of x in terms of the set of attributes A_i, and ∼X is the complement of X.
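To make the definitions above concrete, the following illustrative Python sketch (not part of the original paper; the universe U, the partitions induced by IND(A_1) and IND(A_2), and the target X are invented toy data) computes the Pawlak approximations of Section 2.1 and the optimistic multigranulation approximations of Definition 1.

```python
# Illustrative sketch (not from the paper): Pawlak approximations (Section 2.1)
# and optimistic multigranulation approximations (Definition 1) on invented
# toy data. Partitions play the role of the granular structures IND(A_i).

def eq_class(partition, x):
    """Return the block [x] of the partition containing x."""
    for block in partition:
        if x in block:
            return block
    raise ValueError(f"{x} is not covered by the partition")

def pawlak_lower(partition, X, U):
    return {x for x in U if eq_class(partition, x) <= X}       # [x] ⊆ X

def pawlak_upper(partition, X, U):
    return {x for x in U if eq_class(partition, x) & X}        # [x] ∩ X ≠ ∅

def optimistic_mg_lower(partitions, X, U):
    # Eq. (3): at least one granular structure satisfies [x]_{A_i} ⊆ X.
    return {x for x in U if any(eq_class(p, x) <= X for p in partitions)}

def optimistic_mg_upper(partitions, X, U):
    # Eq. (4): complement of the optimistic lower approximation of ~X.
    return U - optimistic_mg_lower(partitions, U - X, U)

U = {1, 2, 3, 4, 5, 6}
A1 = [{1, 2}, {3, 4}, {5, 6}]     # partition induced by IND(A_1) (invented)
A2 = [{1}, {2, 3}, {4, 5, 6}]     # partition induced by IND(A_2) (invented)
X = {1, 2, 3}

print(optimistic_mg_lower([A1, A2], X, U))   # {1, 2, 3}
print(optimistic_mg_upper([A1, A2], X, U))   # {1, 2, 3}
```

For this toy data the lower and upper optimistic approximations coincide; note how the upper approximation is obtained purely through the complement of the lower approximation of ∼X, as in Eq. (4).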
By the lower approximation \underline{∑_{i=1}^m A_i}^O(X) and the upper approximation \overline{∑_{i=1}^m A_i}^O(X), the optimistic multigranulation boundary region of X is

BN^O_{∑_{i=1}^m A_i}(X) = \overline{∑_{i=1}^m A_i}^O(X) − \underline{∑_{i=1}^m A_i}^O(X).  (5)

Proposition 1. Let I be an information system in which A_1, A_2, ..., A_m ⊆ AT, then ∀X ⊆ U, we have

\overline{∑_{i=1}^m A_i}^O(X) = {x ∈ U : [x]_{A_1} ∩ X ≠ ∅ ∧ [x]_{A_2} ∩ X ≠ ∅ ∧ ⋯ ∧ [x]_{A_m} ∩ X ≠ ∅}.  (6)
Proof. By Definition 1, we have

x ∈ \overline{∑_{i=1}^m A_i}^O(X) ⇔ x ∉ \underline{∑_{i=1}^m A_i}^O(∼X)
⇔ [x]_{A_1} ⊄ ∼X ∧ [x]_{A_2} ⊄ ∼X ∧ ⋯ ∧ [x]_{A_m} ⊄ ∼X
⇔ [x]_{A_1} ∩ X ≠ ∅ ∧ [x]_{A_2} ∩ X ≠ ∅ ∧ ⋯ ∧ [x]_{A_m} ∩ X ≠ ∅.  □
From Proposition 1, it can be seen that although the optimistic multigranulation upper approximation is defined through the complement of the optimistic multigranulation lower approximation, it can also be considered as the set of objects whose equivalence class has a non-empty intersection with the target in terms of every granular structure.

Based on the SCED strategy ("seeking common ground while eliminating differences", see Section 3.3.3), the following definition gives the formal representation of the lower/upper approximations in the context of multiple granular structures.

Definition 2. Let I be an information system in which A_1, A_2, ..., A_m ⊆ AT, then ∀X ⊆ U, the pessimistic multigranulation lower and upper approximations are denoted by \underline{∑_{i=1}^m A_i}^P(X) and \overline{∑_{i=1}^m A_i}^P(X), respectively, where

\underline{∑_{i=1}^m A_i}^P(X) = {x ∈ U : [x]_{A_1} ⊆ X ∧ [x]_{A_2} ⊆ X ∧ ⋯ ∧ [x]_{A_m} ⊆ X};  (7)

\overline{∑_{i=1}^m A_i}^P(X) = ∼(\underline{∑_{i=1}^m A_i}^P(∼X)).  (8)
By the lower approximation \underline{∑_{i=1}^m A_i}^P(X) and the upper approximation \overline{∑_{i=1}^m A_i}^P(X), the pessimistic multigranulation boundary region of X is

BN^P_{∑_{i=1}^m A_i}(X) = \overline{∑_{i=1}^m A_i}^P(X) − \underline{∑_{i=1}^m A_i}^P(X).  (9)
Proposition 2. Let I be an information system in which A_1, A_2, ..., A_m ⊆ AT, then ∀X ⊆ U, we have

\overline{∑_{i=1}^m A_i}^P(X) = {x ∈ U : [x]_{A_1} ∩ X ≠ ∅ ∨ [x]_{A_2} ∩ X ≠ ∅ ∨ ⋯ ∨ [x]_{A_m} ∩ X ≠ ∅}.  (10)
Proof. By Definition 2, we have

x ∈ \overline{∑_{i=1}^m A_i}^P(X) ⇔ x ∉ \underline{∑_{i=1}^m A_i}^P(∼X)
⇔ [x]_{A_1} ⊄ ∼X ∨ [x]_{A_2} ⊄ ∼X ∨ ⋯ ∨ [x]_{A_m} ⊄ ∼X
⇔ [x]_{A_1} ∩ X ≠ ∅ ∨ [x]_{A_2} ∩ X ≠ ∅ ∨ ⋯ ∨ [x]_{A_m} ∩ X ≠ ∅.  □
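As an illustration of Definition 2 and Proposition 2 (again an invented toy example, not from the paper), the sketch below computes the pessimistic approximations and checks that the complement-based upper approximation of Eq. (8) coincides with the direct characterization of Eq. (10).

```python
# Illustrative sketch (invented toy data): pessimistic multigranulation
# approximations (Definition 2) and a check of Proposition 2.

def eq_class(partition, x):
    for block in partition:
        if x in block:
            return block
    raise ValueError(f"{x} is not covered by the partition")

def pess_mg_lower(partitions, X, U):
    # Eq. (7): every granular structure satisfies [x]_{A_i} ⊆ X.
    return {x for x in U if all(eq_class(p, x) <= X for p in partitions)}

def pess_mg_upper(partitions, X, U):
    # Eq. (8): complement of the pessimistic lower approximation of ~X.
    return U - pess_mg_lower(partitions, U - X, U)

def pess_mg_upper_direct(partitions, X, U):
    # Eq. (10): at least one granular structure intersects X.
    return {x for x in U if any(eq_class(p, x) & X for p in partitions)}

U = {1, 2, 3, 4, 5, 6}
A1 = [{1, 2}, {3, 4}, {5, 6}]
A2 = [{1}, {2, 3}, {4, 5, 6}]
X = {1, 2, 3}

assert pess_mg_upper([A1, A2], X, U) == pess_mg_upper_direct([A1, A2], X, U)
print(pess_mg_lower([A1, A2], X, U))   # {1, 2}
print(pess_mg_upper([A1, A2], X, U))   # {1, 2, 3, 4}
```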
Different from the upper approximation of the optimistic multigranulation rough set, the upper approximation of the pessimistic multigranulation rough set is represented as the set of objects whose equivalence class has a non-empty intersection with the target in terms of at least one granular structure.

3. MG-DTRS: multigranulation decision-theoretic rough sets

Probabilistic approaches to rough sets have many forms, such as the decision-theoretic rough set model (DTRS) [49,52,50], the variable precision rough set model [58], the Bayesian rough set model [37], and other related studies. In particular, the decision-theoretic rough sets proposed by Yao [49,50,52] have a very strong theoretical basis and a sound semantic interpretation. By giving special thresholds, the decision-theoretic rough set model can degenerate into the classical Pawlak rough set, the variable precision rough set, the 0.5-probabilistic rough set, and so on. In many real applications, such as multi-source data analysis, knowledge discovery from data with high dimensions and distributive information systems, a multigranulation version of DTRS will be very desirable. In this section, we establish a multigranulation decision-theoretic rough set framework.
3.1. Decision-theoretic rough sets

In this subsection, we briefly review some basic concepts of decision-theoretic rough sets. In the Bayesian decision procedure, a finite set of states can be written as Ω = {ω_1, ω_2, ..., ω_s}, and a finite set of possible actions can be denoted by A = {a_1, a_2, ..., a_r}. Let P(ω_j|x) be the conditional probability of an object x being in state ω_j given that the object is described by x. Let λ(a_i|ω_j) denote the loss, or cost, for taking action a_i when the state is ω_j; the expected loss associated with taking action a_i is given by

R(a_i|x) = ∑_{j=1}^s λ(a_i|ω_j)P(ω_j|x).

In classical rough set theory, the approximation operators partition the universe into three disjoint classes POS(A), NEG(A), and BND(A). Using the conditional probability P(X|[x]), the Bayesian decision procedure can decide how to assign x into these three disjoint regions [50,52]. The expected losses R(a_i|[x]) associated with taking the individual actions can be expressed as

R(a_1|[x]) = λ_{11}P(X|[x]) + λ_{12}P(X^c|[x]),
R(a_2|[x]) = λ_{21}P(X|[x]) + λ_{22}P(X^c|[x]),
R(a_3|[x]) = λ_{31}P(X|[x]) + λ_{32}P(X^c|[x]),

where λ_{i1} = λ(a_i|X), λ_{i2} = λ(a_i|X^c), and i = 1, 2, 3. When λ_{11} ≤ λ_{31} < λ_{21} and λ_{22} ≤ λ_{32} < λ_{12}, the Bayesian decision procedure leads to the following minimum-risk decision rules:

(P) If P(X|[x]) ≥ γ and P(X|[x]) ≥ α, decide POS(X);
(N) If P(X|[x]) ≤ β and P(X|[x]) ≤ γ, decide NEG(X);
(B) If β ≤ P(X|[x]) ≤ α, decide BND(X);

where

α = (λ_{12} − λ_{32}) / ((λ_{31} − λ_{32}) − (λ_{11} − λ_{12})),
γ = (λ_{12} − λ_{22}) / ((λ_{21} − λ_{22}) − (λ_{11} − λ_{12})),
β = (λ_{32} − λ_{22}) / ((λ_{21} − λ_{22}) − (λ_{31} − λ_{32})).

If a loss function with λ_{11} ≤ λ_{31} < λ_{21} and λ_{22} ≤ λ_{32} < λ_{12} further satisfies the condition

(λ_{12} − λ_{32})(λ_{21} − λ_{31}) ≥ (λ_{31} − λ_{11})(λ_{32} − λ_{22}),

then α ≥ γ ≥ β. When α > β, we have α > γ > β. The decision-theoretic rough set has the decision rules:

(P1) If P(X|[x]) ≥ α, decide POS(X);
(N1) If P(X|[x]) ≤ β, decide NEG(X);
(B1) If β < P(X|[x]) < α, decide BND(X).

Using these three decision rules, we get the probabilistic approximations:

\underline{apr}_α(X) = {x : P(X|[x]) ≥ α, x ∈ U},
\overline{apr}_β(X) = {x : P(X|[x]) > β, x ∈ U}.

When α = β, we have α = γ = β. The decision-theoretic rough set has the following decision rules:

(P2) If P(X|[x]) > α, decide POS(X);
(N2) If P(X|[x]) < α, decide NEG(X);
(B2) If P(X|[x]) = α, decide BND(X).

Using the above three decision rules, we get the probabilistic approximations:

\underline{apr}_α(X) = {x : P(X|[x]) > α, x ∈ U},
\overline{apr}_α(X) = {x : P(X|[x]) ≥ α, x ∈ U}.
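The threshold computation above can be illustrated with a small Python sketch. The loss values below are invented toy numbers chosen to satisfy λ11 ≤ λ31 < λ21 and λ22 ≤ λ32 < λ12; they are not taken from the paper.

```python
# Illustrative sketch of the DTRS thresholds (Section 3.1). lam[i, j] = λ_ij
# with actions a_1 = POS, a_2 = NEG, a_3 = BND and states j = 1 (X), j = 2 (X^c).
# The loss values are invented toy numbers, not taken from the paper.

lam = {
    (1, 1): 0.0, (1, 2): 8.0,   # a_1: decide POS(X)
    (2, 1): 6.0, (2, 2): 0.0,   # a_2: decide NEG(X)
    (3, 1): 2.0, (3, 2): 2.0,   # a_3: decide BND(X)
}
# Ordering λ11 ≤ λ31 < λ21 and λ22 ≤ λ32 < λ12 holds: 0 ≤ 2 < 6 and 0 ≤ 2 < 8.

alpha = (lam[1, 2] - lam[3, 2]) / ((lam[3, 1] - lam[3, 2]) - (lam[1, 1] - lam[1, 2]))
gamma = (lam[1, 2] - lam[2, 2]) / ((lam[2, 1] - lam[2, 2]) - (lam[1, 1] - lam[1, 2]))
beta = (lam[3, 2] - lam[2, 2]) / ((lam[2, 1] - lam[2, 2]) - (lam[3, 1] - lam[3, 2]))

def three_way_decision(p):
    """Rules (P1)/(N1)/(B1) for a conditional probability p = P(X|[x])."""
    if p >= alpha:
        return "POS"
    if p <= beta:
        return "NEG"
    return "BND"

print(alpha, gamma, beta)   # 0.75, ~0.571, ~0.333 — so α ≥ γ ≥ β holds
print(three_way_decision(0.9), three_way_decision(0.5), three_way_decision(0.2))
```

For these costs, α = 0.75, γ = 8/14 ≈ 0.571 and β = 1/3, so α ≥ γ ≥ β, as guaranteed by the condition on the loss function.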
In the framework of decision-theoretic rough sets, the Pawlak rough set model, the variable precision rough set model, the Bayesian rough set model and the 0.5-probabilistic rough set model can be pooled together and studied based on the notions of conditional probability.
3.2. Theoretical foundation of multigranulation decision-theoretic rough sets

The multigranulation rough set (MGRS) differs from Pawlak's rough set model in that the former is constructed on the basis of a family of indiscernibility relations instead of a single indiscernibility relation. Given m granular structures R_1, R_2, ..., R_m ⊆ R and ∀X ⊆ U, the lower/upper approximations in a multigranulation rough set can be formally represented by two fusion functions, respectively,

\underline{∑_{i=1}^m R_i}(X) = f_l(R_1, R_2, ..., R_m),
\overline{∑_{i=1}^m R_i}(X) = f_u(R_1, R_2, ..., R_m),

where f_l is called a lower fusion function and f_u is called an upper fusion function. These two functions are used to compute the lower/upper approximation of a multigranulation rough set by fusing m granular structures. In practical applications of multigranulation rough sets, the fusion function has many forms according to various semantics and requirements.

For convenience, let λ_k(a_i|ω_j) denote the loss, or cost, for taking action a_i when the state is ω_j under the k-th granular structure. Let P(ω_j|x_k) be the conditional probability of an object x being in state ω_j given that the object is described by x_k under the k-th granular structure. The expected loss associated with taking action a_i is given by

R(a_i|x_1, x_2, ..., x_m) = ∑_{k=1}^m ∑_{j=1}^s λ_k(a_i|ω_j)P(ω_j|x_k).  (11)
The expected loss R(a_i|x_1, x_2, ..., x_m) is a conditional risk. A decision rule τ(x_1, x_2, ..., x_m) specifies which action to take, and its value is one of the actions a_1, a_2, ..., a_r. The overall risk R is the expected loss associated with the decision rule τ(x_1, x_2, ..., x_m); it is defined by

R = ∑_{x_1, x_2, ..., x_m} R(τ(x_1, x_2, ..., x_m)|x_1, x_2, ..., x_m)P(x_1, x_2, ..., x_m),  (12)

where P(x_1, x_2, ..., x_m) is a joint probability, calculated by fusing (P(x_1), P(x_2), ..., P(x_m)) induced by the m granular structures on the same universe. Given multiple granular structures R_1, R_2, ..., R_m ⊆ R, the multigranulation decision-theoretic rough sets aim to select a series of actions for which the overall risk is as small as possible, where the actions include deciding the positive region, deciding the negative region and deciding the boundary region.

In multigranulation decision-theoretic rough sets, there are two kinds of assumptions. One assumes that the values of λ_k(a_i|ω_j), k ≤ m, are all equal to each other; the other assumes that they are not equal, in which case each granular structure has its own independent loss (cost) functions. In order to introduce the idea of multigranulation decision-theoretic rough sets, this paper deals only with the first assumption. Hence, the procedure for determining the parameters α, β and γ is consistent with that of classical decision-theoretic rough sets, and the value of each parameter is the same in every granular structure. The multigranulation decision-theoretic rough sets under the second assumption will be established in future work.

3.3. Three cases of multigranulation decision-theoretic rough sets

Given m granular structures R_1, R_2, ..., R_m ⊆ R, when λ_k(a_i|ω_j) = λ_l(a_i|ω_j), k, l ∈ {1, 2, ..., m}, the expected loss associated with taking action a_i can be given by

R(a_i|x_1, x_2, ..., x_m) = ∑_{k=1}^m ∑_{j=1}^s λ(a_i|ω_j)P(ω_j|x_k).  (13)
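Under this shared-loss assumption, Eq. (13) reduces to summing, over the m granular structures, the single-granulation expected losses. The following hedged sketch (losses and conditional probabilities are invented toy values, not from the paper) computes the fused expected loss for each action and picks the minimum-risk one, with the two states taken as ω_1 = X and ω_2 = X^c.

```python
# Sketch of Eq. (13): expected loss of each action fused over m granular
# structures with a shared loss function. States: ω_1 = X, ω_2 = X^c.
# All numbers are invented toy values.

lam = {                                   # shared λ(a_i | ω_j), toy values
    ("POS", "X"): 0.0, ("POS", "Xc"): 8.0,
    ("NEG", "X"): 6.0, ("NEG", "Xc"): 0.0,
    ("BND", "X"): 2.0, ("BND", "Xc"): 2.0,
}

def expected_loss(action, probs):
    """Eq. (13): sum over granular structures k and the two states."""
    return sum(
        lam[action, "X"] * p + lam[action, "Xc"] * (1.0 - p)
        for p in probs                    # p = P(X | x_k) under structure k
    )

probs = [0.9, 0.7, 0.8]                   # P(X|[x]_{R_k}) for m = 3 structures
risks = {a: expected_loss(a, probs) for a in ("POS", "NEG", "BND")}
best = min(risks, key=risks.get)          # minimum-risk action
print(risks, best)                        # POS has the smallest fused risk here
```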
In this case, the information fusion in multigranulation decision-theoretic rough sets can be simplified to the fusion of a set of probabilities over the same universe. In this subsection, we give three multigranulation decision-theoretic rough set models: the mean multigranulation decision-theoretic rough set (MMG-DTRS), the optimistic multigranulation decision-theoretic rough set (OMG-DTRS) and the pessimistic multigranulation decision-theoretic rough set (PMG-DTRS).
3.3.1. Mean multigranulation decision-theoretic rough sets

In multigranulation decision-theoretic rough sets, when the loss function is fixed, the conditional probability of an object x belonging to a target concept under m granular structures can be estimated by its mathematical expectation. That is,

E(P(X|x)) = (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m.  (14)
The joint probability is estimated by the mean value of the m conditional probabilities. Based on this idea, we give a kind of multigranulation decision-theoretic rough set, called the mean multigranulation decision-theoretic rough set. Its formal definition is as follows.

Definition 3. Given m granular structures R_1, R_2, ..., R_m ⊆ R and ∀X ⊆ U, the mean multigranulation lower and upper approximations are denoted by \underline{∑_{i=1}^m R_i}^{M,α}(X) and \overline{∑_{i=1}^m R_i}^{M,β}(X), respectively, where

\underline{∑_{i=1}^m R_i}^{M,α}(X) = {x : (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m ≥ α, x ∈ U};  (15)

\overline{∑_{i=1}^m R_i}^{M,β}(X) = U − {x : (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m ≤ β, x ∈ U};  (16)
where [x]_{R_i} (1 ≤ i ≤ m) is the equivalence class of x induced by R_i, P(X|[x]_{R_i}) is the conditional probability of the equivalence class [x]_{R_i} with respect to X, and α, β are two probability constraints.

By the lower approximation \underline{∑_{i=1}^m R_i}^{M,α}(X) and the upper approximation \overline{∑_{i=1}^m R_i}^{M,β}(X), the mean multigranulation boundary region of X is

BN^M_{∑_{i=1}^m R_i}(X) = \overline{∑_{i=1}^m R_i}^{M,β}(X) − \underline{∑_{i=1}^m R_i}^{M,α}(X).  (17)
Similar to the classical decision-theoretic rough sets, when the thresholds satisfy α > β, we can obtain the decision rules:

(MP1) If (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m ≥ α, decide POS(X);
(MN1) If (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m ≤ β, decide NEG(X);
(MB1) If β < (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m < α, decide BND(X).

When α = β, we have α = γ = β. The mean multigranulation decision-theoretic rough set has the following decision rules:

(MP2) If (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m > α, decide POS(X);
(MN2) If (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m < α, decide NEG(X);
(MB2) If (P(X|[x]_{R_1}) + P(X|[x]_{R_2}) + ⋯ + P(X|[x]_{R_m}))/m = α, decide BND(X).
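The rules (MP1)–(MB1) can be sketched as follows; the per-structure conditional probabilities and the thresholds α, β are invented illustrative values, not from the paper.

```python
# Sketch of the MMG-DTRS rules (MP1)-(MB1); the per-structure conditional
# probabilities P(X|[x]_{R_i}) and the thresholds are invented toy values.

def mean_mg_decision(probs, alpha, beta):
    """probs = [P(X|[x]_{R_1}), ..., P(X|[x]_{R_m})] for one object x."""
    mean_p = sum(probs) / len(probs)      # Eq. (14): mean conditional probability
    if mean_p >= alpha:                   # (MP1)
        return "POS"
    if mean_p <= beta:                    # (MN1)
        return "NEG"
    return "BND"                          # (MB1)

print(mean_mg_decision([0.9, 0.8, 0.7], alpha=0.75, beta=1/3))   # POS (mean 0.8)
print(mean_mg_decision([0.6, 0.5, 0.4], alpha=0.75, beta=1/3))   # BND (mean 0.5)
print(mean_mg_decision([0.2, 0.1, 0.3], alpha=0.75, beta=1/3))   # NEG (mean 0.2)
```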
3.3.2. Optimistic multigranulation decision-theoretic rough sets

In existing optimistic multigranulation rough set approaches, the word "optimistic" expresses the idea that, among multiple independent granular structures, the multigranulation lower approximation needs only at least one granular structure to satisfy the inclusion condition between an equivalence class and the approximated target, while the upper approximation is defined through the complement of the lower approximation. Based on this idea, we develop an optimistic multigranulation decision-theoretic rough set. Its lower approximation collects those objects for which at least one granular structure satisfies the probability constraint (≥ α) between the equivalence class and the approximated target, while its upper approximation excludes those objects for which all granular structures satisfy the probability constraint (≤ β) between the equivalence class and the approximated target.

Definition 4. Given m granular structures R_1, R_2, ..., R_m ⊆ R and ∀X ⊆ U, the optimistic multigranulation lower and upper approximations are denoted by \underline{∑_{i=1}^m R_i}^{O,α}(X) and \overline{∑_{i=1}^m R_i}^{O,β}(X), respectively, where
\underline{∑_{i=1}^m R_i}^{O,α}(X) = {x : P(X|[x]_{R_1}) ≥ α ∨ P(X|[x]_{R_2}) ≥ α ∨ ⋯ ∨ P(X|[x]_{R_m}) ≥ α, x ∈ U};  (18)

\overline{∑_{i=1}^m R_i}^{O,β}(X) = U − {x : P(X|[x]_{R_1}) ≤ β ∧ P(X|[x]_{R_2}) ≤ β ∧ ⋯ ∧ P(X|[x]_{R_m}) ≤ β, x ∈ U};  (19)
where [x]_{R_i} (1 ≤ i ≤ m) is the equivalence class of x induced by R_i, P(X|[x]_{R_i}) is the conditional probability of the equivalence class [x]_{R_i} with respect to X, and α, β are two probability constraints.

By the lower approximation \underline{∑_{i=1}^m R_i}^{O,α}(X) and the upper approximation \overline{∑_{i=1}^m R_i}^{O,β}(X), the optimistic multigranulation boundary region of X is

BN^O_{∑_{i=1}^m R_i}(X) = \overline{∑_{i=1}^m R_i}^{O,β}(X) − \underline{∑_{i=1}^m R_i}^{O,α}(X).  (20)
From the definition of optimistic multigranulation decision-theoretic rough sets, one can obtain the following three propositions.

Proposition 3. Given m granular structures R_1, R_2, ..., R_m ⊆ R and ∀X ⊆ U, the following properties hold:

(1) \underline{∑_{i=1}^m R_i}^{O,α}(X) ⊇ \underline{R_i}^α(X), i ≤ m;
(2) \overline{∑_{i=1}^m R_i}^{O,β}(X) ⊆ \overline{R_i}^β(X), i ≤ m;

where \underline{R_i}^α(X) = {x : P(X|[x]_{R_i}) ≥ α, x ∈ U}, and \overline{R_i}^β(X) = {x : P(X|[x]_{R_i}) ≥ β, x ∈ U}.
Proposition 4. Given m granular structures R_1, R_2, ..., R_m ⊆ R and ∀X ⊆ U, the following properties hold:

(1) \underline{∑_{i=1}^m R_i}^{O,α}(X) = ⋃_{i=1}^m \underline{R_i}^α(X);
(2) \overline{∑_{i=1}^m R_i}^{O,β}(X) = ⋃_{i=1}^m \overline{R_i}^β(X);

where \underline{R_i}^α(X) = {x : P(X|[x]_{R_i}) ≥ α, x ∈ U}, and \overline{R_i}^β(X) = {x : P(X|[x]_{R_i}) ≥ β, x ∈ U}.
Proposition 5. Given m granular structures R_1, R_2, ..., R_m ⊆ R and ∀X_1 ⊆ X_2 ⊆ U, the following properties hold:

(1) \underline{∑_{i=1}^m R_i}^{O,α}(X_1) ⊆ \underline{∑_{i=1}^m R_i}^{O,α}(X_2);
(2) \overline{∑_{i=1}^m R_i}^{O,β}(X_1) ⊆ \overline{∑_{i=1}^m R_i}^{O,β}(X_2).
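As an illustration of the optimistic combination of Definition 4 and the decision rules given below, the following sketch (probabilities and thresholds are invented toy values, not from the paper) decides POS when some granular structure reaches α and NEG when every granular structure falls to β or below.

```python
# Sketch of the OMG-DTRS three-way decision; probabilities P(X|[x]_{R_i})
# and thresholds are invented toy values.

def optimistic_mg_decision(probs, alpha, beta):
    if any(p >= alpha for p in probs):    # some granular structure supports X
        return "POS"
    if all(p <= beta for p in probs):     # every granular structure rejects X
        return "NEG"
    return "BND"                          # otherwise, boundary

print(optimistic_mg_decision([0.9, 0.2, 0.1], alpha=0.75, beta=1/3))   # POS
print(optimistic_mg_decision([0.3, 0.2, 0.1], alpha=0.75, beta=1/3))   # NEG
print(optimistic_mg_decision([0.5, 0.2, 0.1], alpha=0.75, beta=1/3))   # BND
```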
Similar to the classical decision-theoretic rough sets, when the thresholds satisfy α > β, we can obtain the decision rules:

(OP1) If ∃i ∈ {1, 2, ..., m} such that P(X|[x]_{R_i}) ≥ α, decide POS(X);
(ON1) If ∀i ∈ {1, 2, ..., m}, P(X|[x]_{R_i}) ≤ β, decide NEG(X);
(OB1) Otherwise, decide BND(X).

When α = β, we have α = γ = β. The optimistic multigranulation decision-theoretic rough set has the following decision rules:

(OP2) If ∃i ∈ {1, 2, ..., m} such that P(X|[x]_{R_i}) > α, decide POS(X);
(ON2) If ∀i ∈ {1, 2, ..., m}, P(X|[x]_{R_i}) < α, decide NEG(X);
(OB2) Otherwise, decide BND(X).

3.3.3. Pessimistic multigranulation decision-theoretic rough sets

In decision making analysis, "seeking common ground while eliminating differences" (SCED) is one of the usual decision strategies. This strategy argues that one reserves common decisions while deleting inconsistent decisions, which can be seen as a conservative decision strategy. Based on this consideration, Qian et al. [31] proposed a so-called pessimistic
multigranulation rough set. In this subsection, we combine the pessimistic multigranulation rough set and the decision-theoretic rough set into a single decision framework. In the pessimistic multigranulation decision-theoretic rough sets, the lower approximation collects those objects for which every granular structure satisfies the probability constraint (≥ α) between the equivalence class and the approximated target, while the upper approximation excludes those objects for which at least one granular structure satisfies the probability constraint (≤ β) between the equivalence class and the approximated target.

Definition 5. Given m granular structures R_1, R_2, ..., R_m ⊆ R and ∀X ⊆ U, the pessimistic multigranulation lower and upper approximations are denoted by \underline{∑_{i=1}^m R_i}^{P,α}(X) and \overline{∑_{i=1}^m R_i}^{P,β}(X), respectively, where

\underline{∑_{i=1}^m R_i}^{P,α}(X) = {x : P(X|[x]_{R_1}) ≥ α ∧ P(X|[x]_{R_2}) ≥ α ∧ ⋯ ∧ P(X|[x]_{R_m}) ≥ α, x ∈ U};  (21)

\overline{∑_{i=1}^m R_i}^{P,β}(X) = U − {x : P(X|[x]_{R_1}) ≤ β ∨ P(X|[x]_{R_2}) ≤ β ∨ ⋯ ∨ P(X|[x]_{R_m}) ≤ β, x ∈ U};  (22)
where [x]_{Ri} (1 ≤ i ≤ m) is the equivalence class of x induced by Ri, P(X|[x]_{Ri}) is the conditional probability of the equivalence class [x]_{Ri} with respect to X, and α, β are two probability constraints. From the lower approximation ∑_{i=1}^{m} R_i^{P,α}(X) and the upper approximation ∑_{i=1}^{m} R_i^{P,β}(X), the pessimistic multigranulation boundary region of X is

BN^{P}_{∑_{i=1}^{m} R_i}(X) = ∑_{i=1}^{m} R_i^{P,β}(X) − ∑_{i=1}^{m} R_i^{P,α}(X).   (23)
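To make Definition 5 and Eq. (23) concrete, the following is a minimal Python sketch (not part of the paper); the universe U, the target concept X, the two partitions and the thresholds below are hypothetical illustration data.

```python
from fractions import Fraction

def eq_class(partition, x):
    # [x]_R: the block of the partition containing x
    return next(b for b in partition if x in b)

def cond_prob(block, X):
    # P(X | [x]_R) = |[x]_R ∩ X| / |[x]_R|
    return Fraction(len(block & X), len(block))

def pess_lower(U, partitions, X, alpha):
    # Eq. (21): P(X|[x]_{R_i}) >= alpha must hold for every granular structure
    return {x for x in U
            if all(cond_prob(eq_class(p, x), X) >= alpha for p in partitions)}

def pess_upper(U, partitions, X, beta):
    # Eq. (22): drop x as soon as one granular structure gives P(X|[x]_{R_i}) <= beta
    return {x for x in U
            if not any(cond_prob(eq_class(p, x), X) <= beta for p in partitions)}

def pess_boundary(U, partitions, X, alpha, beta):
    # Eq. (23): boundary region = upper approximation minus lower approximation
    return pess_upper(U, partitions, X, beta) - pess_lower(U, partitions, X, alpha)

# hypothetical example: two equivalence relations over six objects
U = {1, 2, 3, 4, 5, 6}
X = {1, 2, 3}
parts = [[{1, 2}, {3, 4}, {5, 6}], [{1, 3}, {2, 4}, {5, 6}]]
alpha, beta = Fraction(3, 5), Fraction(3, 10)

print(pess_lower(U, parts, X, alpha))           # {1}
print(pess_upper(U, parts, X, beta))            # {1, 2, 3, 4}
print(pess_boundary(U, parts, X, alpha, beta))  # {2, 3, 4}
```

Exact rational arithmetic (`Fraction`) avoids floating-point comparisons against the thresholds α and β.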
From the definition of the pessimistic multigranulation decision-theoretic rough set, the following three propositions can be easily induced.

Proposition 6. Given m granular structures R1, R2, . . . , Rm ⊆ R and ∀X ⊆ U, the following properties hold:

(1) ∑_{i=1}^{m} R_i^{P,α}(X) ⊆ R_i^{α}(X), i ≤ m;
(2) ∑_{i=1}^{m} R_i^{P,β}(X) ⊇ R_i^{β}(X), i ≤ m;

where R_i^{α}(X) = {x : P(X|[x]_{Ri}) ≥ α, x ∈ U} and R_i^{β}(X) = {x : P(X|[x]_{Ri}) ≥ β, x ∈ U}.

Proposition 7. Given m granular structures R1, R2, . . . , Rm ⊆ R and ∀X ⊆ U, the following properties hold:

(1) ∑_{i=1}^{m} R_i^{P,α}(X) = ∩_{i=1}^{m} R_i^{α}(X);
(2) ∑_{i=1}^{m} R_i^{P,β}(X) = ∪_{i=1}^{m} R_i^{β}(X);

where R_i^{α}(X) = {x : P(X|[x]_{Ri}) ≥ α, x ∈ U} and R_i^{β}(X) = {x : P(X|[x]_{Ri}) ≥ β, x ∈ U}.

Proposition 8. Given m granular structures R1, R2, . . . , Rm ⊆ R and ∀X1 ⊆ X2 ⊆ U, the following properties hold:

(1) ∑_{i=1}^{m} R_i^{P,α}(X1) ⊆ ∑_{i=1}^{m} R_i^{P,α}(X2);
(2) ∑_{i=1}^{m} R_i^{P,β}(X1) ⊆ ∑_{i=1}^{m} R_i^{P,β}(X2).
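Part (1) of Proposition 7 can be checked numerically: the pessimistic lower approximation (a universal quantifier over granular structures) coincides with the intersection of the single-granulation lower approximations R_i^{α}(X). The data below are a hypothetical example, not from the paper.

```python
from fractions import Fraction

def P(part, x, X):
    # conditional probability P(X | [x]_R) for the block of part containing x
    b = next(blk for blk in part if x in blk)
    return Fraction(len(b & X), len(b))

# hypothetical universe, target concept and two granular structures
U = {1, 2, 3, 4, 5, 6}
X = {1, 2, 3}
parts = [[{1, 2}, {3, 4}, {5, 6}], [{1, 3}, {2, 4}, {5, 6}]]
alpha = Fraction(3, 5)

# left side of Proposition 7(1): the pessimistic lower approximation
pess_low = {x for x in U if all(P(p, x, X) >= alpha for p in parts)}
# right side: intersection of the single-granulation lower approximations R_i^alpha(X)
singles = [{x for x in U if P(p, x, X) >= alpha} for p in parts]
assert pess_low == set.intersection(*singles)
print(pess_low)  # {1}
```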
Similar to the classical decision-theoretic rough sets, when the thresholds satisfy α > β, we can obtain the decision rules:

(PP1) If ∀i ∈ {1, 2, . . . , m}, P(X|[x]_{Ri}) ≥ α, decide POS(X);
(PN1) If ∃i ∈ {1, 2, . . . , m} such that P(X|[x]_{Ri}) ≤ β, decide NEG(X);
(PB1) Otherwise, decide BND(X).
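The three pessimistic rules can be sketched as a small three-way decision function (a hypothetical illustration, not code from the paper); `probs` stands for the vector of conditional probabilities P(X|[x]_{Ri}), i = 1, ..., m, for one object x.

```python
def pessimistic_decision(probs, alpha, beta):
    """Three-way decision for one object; probs holds P(X|[x]_{R_i}), i = 1..m."""
    assert alpha > beta
    if all(p >= alpha for p in probs):   # (PP1): every granular structure accepts
        return "POS"
    if any(p <= beta for p in probs):    # (PN1): some granular structure rejects
        return "NEG"
    return "BND"                         # (PB1): defer the decision

print(pessimistic_decision([0.9, 0.8], 0.7, 0.3))  # POS
print(pessimistic_decision([0.9, 0.2], 0.7, 0.3))  # NEG
print(pessimistic_decision([0.9, 0.5], 0.7, 0.3))  # BND
```

Note the contrast with the optimistic rules (OP1)/(ON1): there the quantifiers ∃ and ∀ are swapped between the positive and negative rules.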
When α = β, we have α = γ = β. The pessimistic multigranulation decision-theoretic rough set has the following decision rules:

(PP2) If ∀i ∈ {1, 2, . . . , m}, P(X|[x]_{Ri}) > α, decide POS(X);
(PN2) If ∃i ∈ {1, 2, . . . , m} such that P(X|[x]_{Ri}) < α, decide NEG(X);
(PB2) Otherwise, decide BND(X).

The following proposition establishes the relationships among the mean multigranulation decision-theoretic rough sets, the optimistic multigranulation decision-theoretic rough sets, and the pessimistic multigranulation decision-theoretic rough sets.

Proposition 9. Given m granular structures R1, R2, . . . , Rm ⊆ R and ∀X ⊆ U, the following properties hold:

(1) ∑_{i=1}^{m} R_i^{P,α}(X) ⊆ ∑_{i=1}^{m} R_i^{M,α}(X) ⊆ ∑_{i=1}^{m} R_i^{O,α}(X);
(2) ∑_{i=1}^{m} R_i^{P,β}(X) ⊇ ∑_{i=1}^{m} R_i^{M,β}(X) ⊇ ∑_{i=1}^{m} R_i^{O,β}(X).
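The α-part of this ordering follows from the quantifiers alone: the pessimistic lower approximation requires every granular structure to accept, the optimistic one only some. A quick numerical check on hypothetical data (not from the paper):

```python
from fractions import Fraction

def P(part, x, X):
    b = next(blk for blk in part if x in blk)   # [x]_{R_i}
    return Fraction(len(b & X), len(b))

U = {1, 2, 3, 4, 5, 6}
X = {1, 2, 3}
parts = [[{1, 2}, {3, 4}, {5, 6}], [{1, 3}, {2, 4}, {5, 6}]]
alpha = Fraction(3, 5)

# pessimistic lower ("for all" quantifier) vs. optimistic lower ("exists" quantifier)
pess_low = {x for x in U if all(P(p, x, X) >= alpha for p in parts)}
opt_low = {x for x in U if any(P(p, x, X) >= alpha for p in parts)}
assert pess_low <= opt_low      # lower-approximation part of Proposition 9(1)
print(pess_low, opt_low)        # {1} {1, 2, 3}
```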
4. Relationships between MG-DTRS and other MGRS models

4.1. Classical multigranulation rough sets

In the decision-theoretic rough sets, the conditional probability value and the thresholds α and β determine the detailed form of the rough sets. From Yao's work [50,52], it follows that when α = 1 and β = 0, the decision-theoretic rough sets degenerate into the standard rough sets. In this case, we have that

P(X|[x]_R) = |[x]_R ∩ X| / |[x]_R| = 1 ⇔ [x]_R ⊆ X,
P(X|[x]_R) = |[x]_R ∩ X| / |[x]_R| = 0 ⇔ [x]_R ∩ X = Ø.
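The two boundary cases of the conditional probability can be confirmed directly (a minimal sketch with a hypothetical equivalence class, not taken from the paper):

```python
from fractions import Fraction

def cond_prob(block, X):
    # P(X | [x]_R) = |[x]_R ∩ X| / |[x]_R|
    return Fraction(len(block & X), len(block))

block = {1, 2}                            # a hypothetical equivalence class [x]_R
assert cond_prob(block, {1, 2, 3}) == 1   # [x]_R ⊆ X
assert cond_prob(block, {5, 6}) == 0      # [x]_R ∩ X = Ø
assert 0 < cond_prob(block, {2}) < 1      # partial overlap: neither extreme
```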
Hence,

∑_{i=1}^{m} R_i^{O,α}(X) = {x : P(X|[x]_{R1}) ≥ 1 ∨ P(X|[x]_{R2}) ≥ 1 ∨ ··· ∨ P(X|[x]_{Rm}) ≥ 1, x ∈ U}
⇒ ∑_{i=1}^{m} R_i^{O,α}(X) = {x : [x]_{R1} ⊆ X ∨ [x]_{R2} ⊆ X ∨ ··· ∨ [x]_{Rm} ⊆ X, x ∈ U},

and

∑_{i=1}^{m} R_i^{O,β}(X) = U − {x : P(X|[x]_{R1}) ≤ 0 ∧ P(X|[x]_{R2}) ≤ 0 ∧ ··· ∧ P(X|[x]_{Rm}) ≤ 0, x ∈ U}
⇒ ∑_{i=1}^{m} R_i^{O,β}(X) = {x : [x]_{R1} ∩ X ≠ Ø ∧ [x]_{R2} ∩ X ≠ Ø ∧ ··· ∧ [x]_{Rm} ∩ X ≠ Ø, x ∈ U}.
The multigranulation lower approximation and multigranulation upper approximation above are consistent with those in the classical optimistic multigranulation rough sets (OMGRS) [32]. Hence, when α = 1 and β = 0, the optimistic multigranulation decision-theoretic rough sets (OMG-DTRS) degenerate into the optimistic multigranulation rough sets (OMGRS). Similarly, one has that

∑_{i=1}^{m} R_i^{P,α}(X) = {x : P(X|[x]_{R1}) ≥ 1 ∧ P(X|[x]_{R2}) ≥ 1 ∧ ··· ∧ P(X|[x]_{Rm}) ≥ 1, x ∈ U}
⇒ ∑_{i=1}^{m} R_i^{P,α}(X) = {x : [x]_{R1} ⊆ X ∧ [x]_{R2} ⊆ X ∧ ··· ∧ [x]_{Rm} ⊆ X, x ∈ U},
and

∑_{i=1}^{m} R_i^{P,β}(X) = U − {x : P(X|[x]_{R1}) ≤ 0 ∨ P(X|[x]_{R2}) ≤ 0 ∨ ··· ∨ P(X|[x]_{Rm}) ≤ 0, x ∈ U}
⇒ ∑_{i=1}^{m} R_i^{P,β}(X) = {x : [x]_{R1} ∩ X ≠ Ø ∨ [x]_{R2} ∩ X ≠ Ø ∨ ··· ∨ [x]_{Rm} ∩ X ≠ Ø, x ∈ U}.
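The degeneration of the lower approximations at α = 1 can be verified numerically: P(X|[x]) = 1 exactly when [x] ⊆ X, so the probabilistic and set-inclusion forms coincide. The data below are a hypothetical example, not from the paper.

```python
from fractions import Fraction

def P(part, x, X):
    # conditional probability P(X | [x]_R)
    b = next(blk for blk in part if x in blk)
    return Fraction(len(b & X), len(b))

def blk(part, x):
    # the equivalence class [x]_R
    return next(b for b in part if x in b)

U = {1, 2, 3, 4, 5, 6}
X = {1, 2, 3}
parts = [[{1, 2}, {3, 4}, {5, 6}], [{1, 3}, {2, 4}, {5, 6}]]

# probabilistic lower approximations with alpha = 1
opt_prob = {x for x in U if any(P(p, x, X) >= 1 for p in parts)}
pess_prob = {x for x in U if all(P(p, x, X) >= 1 for p in parts)}
# classical set-inclusion forms of the OMGRS / PMGRS lower approximations
opt_set = {x for x in U if any(blk(p, x) <= X for p in parts)}
pess_set = {x for x in U if all(blk(p, x) <= X for p in parts)}

assert opt_prob == opt_set and pess_prob == pess_set
print(opt_prob, pess_prob)   # {1, 2, 3} {1}
```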
The multigranulation lower approximation and multigranulation upper approximation are equivalent to those in the classical pessimistic multigranulation rough sets (PMGRS) [31]. Thus, the pessimistic multigranulation decision-theoretic rough sets (PMG-DTRS) degenerate into the pessimistic multigranulation rough sets (PMGRS).

4.2. Variable multigranulation rough sets

When α + β = 1 and 0 ≤ β ≤ 0.5 < α ≤ 1, the decision-theoretic rough sets become the variable precision rough sets. The condition 0 ≤ β ≤ 0.5 < α ≤ 1 ensures that the lower approximation is a subset of the upper approximation. Hence,

∑_{i=1}^{m} R_i^{O,α}(X) = {x : P(X|[x]_{R1}) ≥ α ∨ P(X|[x]_{R2}) ≥ α ∨ ··· ∨ P(X|[x]_{Rm}) ≥ α, x ∈ U},
∑_{i=1}^{m} R_i^{O,β}(X) = U − {x : P(X|[x]_{R1}) ≤ β ∧ P(X|[x]_{R2}) ≤ β ∧ ··· ∧ P(X|[x]_{Rm}) ≤ β, x ∈ U},
⇒
∑_{i=1}^{m} R_i^{O,α}(X) = {x : P(X|[x]_{R1}) ≥ α ∨ P(X|[x]_{R2}) ≥ α ∨ ··· ∨ P(X|[x]_{Rm}) ≥ α, x ∈ U},
∑_{i=1}^{m} R_i^{O,β}(X) = {x : P(X|[x]_{R1}) > 1 − α ∧ P(X|[x]_{R2}) > 1 − α ∧ ··· ∧ P(X|[x]_{Rm}) > 1 − α, x ∈ U}.
The multigranulation lower approximation and multigranulation upper approximation are consistent with those in the optimistic variable precision multigranulation rough sets (OVMGRS) proposed in [57]. Hence, when α + β = 1 and 0 ≤ β ≤ 0.5 < α ≤ 1, OMG-DTRS degenerates into the optimistic variable precision multigranulation rough sets (OVMGRS). Similarly, the pessimistic multigranulation decision-theoretic rough sets have the following properties:

∑_{i=1}^{m} R_i^{P,α}(X) = {x : P(X|[x]_{R1}) ≥ α ∧ P(X|[x]_{R2}) ≥ α ∧ ··· ∧ P(X|[x]_{Rm}) ≥ α, x ∈ U},
∑_{i=1}^{m} R_i^{P,β}(X) = U − {x : P(X|[x]_{R1}) ≤ β ∨ P(X|[x]_{R2}) ≤ β ∨ ··· ∨ P(X|[x]_{Rm}) ≤ β, x ∈ U},
⇒
∑_{i=1}^{m} R_i^{P,α}(X) = {x : P(X|[x]_{R1}) ≥ α ∧ P(X|[x]_{R2}) ≥ α ∧ ··· ∧ P(X|[x]_{Rm}) ≥ α, x ∈ U},
∑_{i=1}^{m} R_i^{P,β}(X) = {x : P(X|[x]_{R1}) > 1 − α ∨ P(X|[x]_{R2}) > 1 − α ∨ ··· ∨ P(X|[x]_{Rm}) > 1 − α, x ∈ U}.
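Under the variable-precision constraint α + β = 1 with α > 0.5, the stated containment of the lower approximation in the upper approximation can be checked on a small example (the data below are hypothetical, not from the paper): P ≥ α implies P > 1 − α because α > 1 − α.

```python
from fractions import Fraction

def P(part, x, X):
    # conditional probability P(X | [x]_R)
    b = next(blk for blk in part if x in blk)
    return Fraction(len(b & X), len(b))

U = {1, 2, 3, 4, 5, 6}
X = {1, 2, 3}
parts = [[{1, 2}, {3, 4}, {5, 6}], [{1, 3}, {2, 4}, {5, 6}]]
alpha = Fraction(7, 10)          # so beta = 1 - alpha = 0.3 <= 0.5 < alpha

# pessimistic variable-precision lower and upper approximations
low = {x for x in U if all(P(p, x, X) >= alpha for p in parts)}
upp = {x for x in U if any(P(p, x, X) > 1 - alpha for p in parts)}
assert low <= upp                # lower approximation inside the upper one
print(low, upp)                  # {1} {1, 2, 3, 4}
```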
The multigranulation lower approximation and multigranulation upper approximation are equivalent to those in the pessimistic variable precision multigranulation rough sets (PVMGRS) developed by Zhang et al. [57]. Thus, PMG-DTRS degenerates into the pessimistic variable precision multigranulation rough sets (PVMGRS). When the thresholds α and β satisfy other constraint relationships, the multigranulation decision-theoretic rough sets will produce various variants of multigranulation rough sets, which can be applied in many practical applications. Based on the above discussions, we can obtain the relationships among MG-DTRS, other MGRS models, SG-DTRS and other SGRS models, as shown in Fig. 1. In this figure, MG-PRS means a multigranulation probabilistic rough set, MG-VRS is a multigranulation variable precision rough set, and MG-0.5PRS means a multigranulation 0.5-probabilistic rough set.
(a) Relationship between MG-DTRS and other MGRS models
(b) Relationship between MG-DTRS and SGDTRS
Fig. 1. Relationships among MG-DTRS, other MGRS models, SG-DTRS and other SGRS models.
Correspondingly, SG-PRS means a single granulation probabilistic rough set, SG-VRS is a single granulation variable precision rough set, and SG-0.5PRS means a single granulation 0.5-probabilistic rough set.

5. Conclusions

Multigranulation rough set theory (MGRS) is one of the desirable directions in rough set theory, in which lower/upper approximations are approximated by granular structures induced by multiple binary relations. It provides a new perspective for decision making analysis based on rough set theory. In this paper, we have proposed a new multigranulation rough set model, called a multigranulation decision-theoretic rough set model, by combining MGRS and the decision-theoretic rough sets. In this framework, we have given three forms of MG-DTRS: mean multigranulation decision-theoretic rough sets, optimistic multigranulation decision-theoretic rough sets, and pessimistic multigranulation decision-theoretic rough sets. These forms of MG-DTRS can derive many existing multigranulation rough set models when the parameters satisfy special constraints. Finally, we have also established the relationships among multigranulation decision-theoretic rough sets, multigranulation rough sets and single granulation rough sets.

This study only develops a framework of multigranulation decision-theoretic rough sets, in which there are still many interesting issues to be explored. Future work has four aspects: (1) model extension of multigranulation decision-theoretic rough sets to other types of data sets; (2) information fusion based on multiple granular structures; (3) information granule selection and granulation selection; and (4) applications of multigranulation decision-theoretic rough sets.
It is worth pointing out that the multigranulation decision-theoretic rough set and the standard decision-theoretic rough set can be applied to data mining and decision making in real applications, such as multi-source information systems, high-dimensional data, and distributed information systems.

Acknowledgment

This work was partially supported by SRG: 7002681 of City University of Hong Kong, the National Natural Science Fund of China (Nos. 71031006, 61202018), the National Key Basic Research and Development Program of China (973) (No. 2013CB329404), the Program for New Century Excellent Talents in University, the Research Fund for the Doctoral Program of Higher Education (20121401110013), the Research Project Supported by Shanxi Scholarship Council of China (No. 201008), and the Program for the Innovative Talents of Higher Learning Institutions of Shanxi, China (20120301).

References

[1] N. Azam, J.T. Yao, Analyzing uncertainties of probabilistic rough set regions with game-theoretic rough sets, International Journal of Approximate Reasoning, in press.
[2] D. Dubois, H. Prade, Rough fuzzy sets and fuzzy rough sets, International Journal of General Systems 17 (1990) 191–209.
[3] I. Düntsch, G. Gediga, Uncertainty measures of rough set prediction, Artificial Intelligence 106 (1998) 109–137.
[4] T.F. Fan, C.J. Liau, D.R. Liu, Dominance-based fuzzy rough set analysis of uncertain and possibilistic data tables, International Journal of Approximate Reasoning 52 (9) (2011) 1283–1297.
[5] T. Feng, S.P. Zhang, J.S. Mi, The reduction and fusion of fuzzy covering systems based on the evidence theory, International Journal of Approximate Reasoning 53 (1) (2012) 87–103.
[6] J.P. Herbert, J.T. Yao, Game-theoretic rough sets, Fundamenta Informaticae 108 (3-4) (2011) 267–286.
[7] J.P. Herbert, J.T.
Yao, Analysis of data-driven parameters in game-theoretic rough sets, in: Proceedings of International Conference on Rough Sets and Knowledge Technology, Banff, Canada, LNCS 6954, 2011, pp. 447–456.
[8] X.Y. Jia, W.W. Li, L. Shang, J.J. Chen, An optimization viewpoint of decision-theoretic rough set model, in: Proceedings of International Conference on Rough Sets and Knowledge Technology, Banff, Canada, LNCS 6954, 2011, pp. 457–465.
[9] X.Y. Jia, Z.M. Tang, W.H. Liao, L. Shang, On an optimization representation of decision-theoretic rough set model, International Journal of Approximate Reasoning (2013), in press.
[10] R. Jensen, Q. Shen, Fuzzy-rough sets assisted attribute selection, IEEE Transactions on Fuzzy Systems 15 (1) (2007) 73–89.
[11] G. Jeon, D. Kim, J. Jeong, Rough sets attributes reduction based expert system in interlaced video sequences, IEEE Transactions on Consumer Electronics 52 (4) (2006) 1348–1355.
[12] M. Kryszkiewicz, Rough set approach to incomplete information systems, Information Sciences 112 (1998) 39–49.
[13] M. Kryszkiewicz, Rules in incomplete information systems, Information Sciences 113 (1999) 271–292.
[14] Y. Leung, D.Y. Li, Maximal consistent block technique for rule acquisition in incomplete information systems, Information Sciences 153 (2003) 85–106.
[15] H.X. Li, M.H. Wang, X.Z. Zhou, J.B. Zhao, An interval set model for learning rules from incomplete information table, International Journal of Approximate Reasoning 53 (1) (2012) 24–37.
[16] J.Y. Liang, C.Y. Dang, K.S. Chin, C.M. Yam Richard, A new method for measuring uncertainty and fuzziness in rough set theory, International Journal of General Systems 31 (4) (2002) 331–342.
[17] J.Y. Liang, D.Y. Li, Uncertainty and Knowledge Acquisition in Information Systems, Science Press, Beijing, China, 2005.
[18] J.Y. Liang, Y.H. Qian, Information granules and entropy theory in information systems, Science in China-Series F: Information Sciences 51 (10) (2008) 1427–1444.
[19] J.Y. Liang, F. Wang, C.Y. Dang, Y.H. Qian, An efficient rough feature selection algorithm with a multi-granulation view, International Journal of Approximate Reasoning 53 (2012) 912–926.
[20] G.P. Lin, Y.H. Qian, J.J. Li, NMGRS: neighborhood-based multigranulation rough set, International Journal of Approximate Reasoning 53 (2012) 1080–1093.
[21] C.H. Liu, M.Z. Wang, Covering fuzzy rough set based on multi-granulations, in: International Conference on Uncertainty Reasoning and Knowledge Engineering, 2011, pp. 146–149.
[22] C.H. Liu, D.Q. Miao, On covering multigranulation fuzzy rough sets, Information Technology Journal (2012), doi: 10.3923/itj.2012.
[23] D. Liu, T.R. Li, D.C. Liang, Incorporating logistic regression to decision-theoretic rough sets for classification, International Journal of Approximate Reasoning (2013), in press.
[24] Z. Pawlak, Rough Sets: Theoretical Aspects of Reasoning about Data, System Theory, Kluwer, Dordrecht, 1991.
[25] Z. Pawlak, A. Skowron, Rudiments of rough sets, Information Sciences 177 (2007) 3–27.
[26] Y.H. Qian, J.Y. Liang, Rough set method based on multi-granulations, in: Proceedings of 5th IEEE Conference on Cognitive Informatics, vol. I, 2006, pp. 297–304.
[27] Y.H. Qian, C.Y. Dang, J.Y.
Liang, MGRS in incomplete information systems, in: Proceedings of 2007 IEEE Conference on Granular Computing, 2007, pp. 163–168.
[28] Y.H. Qian, J.Y. Liang, C.Y. Dang, Incomplete multigranulation rough set, IEEE Transactions on Systems, Man and Cybernetics – Part A 20 (2010) 420–431.
[29] Y.H. Qian, J.Y. Liang, D.Y. Li, F. Wang, N.N. Ma, Approximation reduction in inconsistent incomplete decision tables, Knowledge-Based Systems 23 (2010) 427–433.
[30] Y.H. Qian, J.Y. Liang, W. Pedrycz, C.Y. Dang, Positive approximation: an accelerator for attribute reduction in rough set theory, Artificial Intelligence 174 (2010) 597–618.
[31] Y.H. Qian, J.Y. Liang, W. Wei, Pessimistic rough decision, in: Second International Workshop on Rough Sets Theory, 19–21 October 2010, Zhoushan, PR China, pp. 440–449.
[32] Y.H. Qian, J.Y. Liang, Y.Y. Yao, C.Y. Dang, MGRS: a multigranulation rough set, Information Sciences 180 (2010) 949–970.
[33] R. Raghavan, B.K. Tripathy, On some topological properties of multigranular rough sets, Advances in Applied Science Research 2 (3) (2011) 536–543.
[34] M. Rebolledo, Rough intervals-enhancing intervals for qualitative modeling of technical systems, Artificial Intelligence 170 (2006) 667–685.
[35] Y.H. She, X.L. He, On the structure of the multigranulation rough set model, Knowledge-Based Systems 36 (2012) 81–92.
[36] Q. Shen, A. Chouchoulas, A rough-fuzzy approach for generating classification rules, Pattern Recognition 35 (2002) 2425–2438.
[37] D. Ślęzak, W. Ziarko, The investigation of the Bayesian rough set model, International Journal of Approximate Reasoning 40 (2005) 81–91.
[38] B.K. Tripathy, G.K. Panda, A. Mitra, Incomplete multigranulation based on rough intuitionistic fuzzy sets, UNIASCIT 2 (1) (2012) 118–124.
[39] W.Z. Wu, Y. Leung, Theory and applications of granular labeled partitions in multi-scale decision tables, Information Sciences 181 (2011) 3878–3897.
[40] M.F.
Wu, Fuzzy rough set model based on multi-granulations, in: 2010 International Conference on Computer Engineering and Technology, 2010, pp. 271–275.
[41] Z.B. Xu, J.Y. Liang, C.Y. Dang, K.S. Chin, Inclusion degree: a perspective on measures for rough set data analysis, Information Sciences 141 (2002) 227–236.
[42] W.H. Xu, X.T. Zhang, Q.R. Wang, A generalized multi-granulation rough set approach, Lecture Notes in Computer Science 6840 (2012) 681–689.
[43] W.H. Xu, Q.R. Wang, X.T. Zhang, Multi-granulation fuzzy rough sets in a fuzzy tolerance approximation space, International Journal of Fuzzy Systems 13 (4) (2011) 246–259.
[44] W.H. Xu, W.X. Sun, X.Y. Zhang, W.X. Zhang, Multiple granulation rough set approach to ordered information systems, International Journal of General Systems 41 (5) (2012) 475–501.
[45] X.P. Yang, H.G. Song, T.J. Li, Decision making in incomplete information systems based on decision-theoretic rough sets, in: Proceedings of International Conference on Rough Sets and Knowledge Technology, Banff, Canada, LNCS 6954, 2011, pp. 495–503.
[46] X.B. Yang, X.N. Song, H.L. Dou, J.Y. Yang, Multi-granulation rough set: from crisp to fuzzy case, Annals of Fuzzy Mathematics and Informatics 1 (1) (2011) 55–70.
[47] X.B. Yang, Y.H. Qian, J.Y. Yang, Hierarchical structures on multigranulation spaces, Journal of Computer Science and Technology (2012), in press.
[48] J.T. Yao, J.T. Yao, Y.Y. Yao, W. Ziarko, Probabilistic rough sets: approximations, decision-makings, and applications, International Journal of Approximate Reasoning 49 (2008) 253–254.
[49] Y.Y. Yao, S.K.M. Wong, P. Lingras, A decision-theoretic rough set model, in: Z.W. Ras, M. Zemankova, M.L. Emrich (Eds.), Methodologies for Intelligent Systems 5, North-Holland, New York, 1990, pp. 17–24.
[50] Y.Y. Yao, The superiority of three-way decisions in probabilistic rough set models, Information Sciences 181 (2011) 1080–1096.
[51] Y.Y.
Yao, Information granulation and rough set approximation, International Journal of Intelligent Systems 16 (2001) 87–104.
[52] Y.Y. Yao, Three-way decisions with probabilistic rough sets, Information Sciences 180 (2010) 341–353.
[53] H. Yu, Z.G. Liu, G.Y. Wang, An automatic method to determine the number of clusters using decision-theoretic rough set, International Journal of Approximate Reasoning (2013), in press.
[54] L.A. Zadeh, Toward a theory of fuzzy information granulation and its centrality in human reasoning and fuzzy logic, Fuzzy Sets and Systems 90 (1997) 111–127.
[55] A. Zeng, D. Pan, Q.L. Zheng, H. Peng, Knowledge acquisition based on rough set theory and principal component analysis, IEEE Intelligent Systems (2006) 78–85.
[56] X.Y. Zhang, Z.W. Mo, F. Xiong, W. Cheng, Comparative study of variable precision rough set model and graded rough set model, International Journal of Approximate Reasoning 53 (1) (2012) 104–116.
[57] M. Zhang, Z.M. Tang, W.Y. Xu, X.B. Yang, A variable multigranulation rough sets approach, in: Proceedings of the 7th International Conference on Intelligent Computing: Bio-Inspired Computing and Applications, 2011, pp. 315–322.
[58] W. Ziarko, Variable precision rough sets model, Journal of Computer System Science 46 (1) (1993) 39–59.