Information Sciences 178 (2008) 2389–2395 www.elsevier.com/locate/ins
New measures of weighted fuzzy entropy and their applications for the study of maximum weighted fuzzy entropy principle

Om Parkash a, P.K. Sharma b,*, Renuka Mahajan c

a Department of Mathematics, Guru Nanak Dev University, Amritsar 143 005, India
b Department of Mathematics, Hindu College, Amritsar 143 001, India
c Department of Mathematics, Rama Chopra College, Pathankot, India
Received 26 June 2007; received in revised form 31 October 2007; accepted 3 December 2007
Abstract

Keeping in view the non-probabilistic nature of experiments, two new measures of weighted fuzzy entropy have been introduced, and to check their authenticity, the essential properties of these measures have been studied. Since measures of entropy can be used for the study of optimization principles when certain partial information is available, we have applied the existing as well as the newly introduced weighted measures of fuzzy entropy to study the maximum entropy principle.

© 2007 Elsevier Inc. All rights reserved.

Keywords: Fuzzy set; Fuzzy entropy; Weighted fuzzy entropy; Membership function; Maximum fuzzy entropy principle
1. Introduction

When proposing fuzzy sets, Zadeh's [18] concerns were explicitly centered on their potential contribution to pattern classification, the processing and communication of information, abstraction and summarization. Although the claims that fuzzy sets were relevant in these areas appeared unsustained when they were first uttered, namely in the early sixties, the subsequent development of information sciences and engineering proved these intuitions right, beyond all expectations. The specificity of fuzzy sets is to capture the idea of partial membership. The characteristic function of a fuzzy set, often called the membership function, plays a role that has been well explained by Singpurwalla and Booker [16] in connection with probability measures of fuzzy sets. A generalized theory of uncertainty has been well explained by Zadeh [20], who remarked that uncertainty is an attribute of information, and the path-breaking work of Shannon [15] has led to a universal
Corresponding author. Tel.: +91 0183 2425349. E-mail addresses: omparkash@rediffmail.com (O. Parkash), pk_sharma7@rediffmail.com (P.K. Sharma).
0020-0255/$ - see front matter © 2007 Elsevier Inc. All rights reserved. doi:10.1016/j.ins.2007.12.003
acceptance of the theory that information is statistical in nature. A perception-based theory of probabilistic reasoning with imprecise probabilities has also been explained by Zadeh [19]. Some work related to uncertainty management for intelligence analysis is reported by Yager [17], whereas generalized information theory, its aims, results and some open problems are discussed by Klir [10]. Chen [2] has remarked that Shannon's [15] mathematical theory of information entropy was introduced to analyze the information-carrying capacity of communication channels, serving as a measure of the degree of uncertainty or the extent of ignorance. Some work related to probabilistic differential entropy has been done by Garbaczewski [4], whereas Nanda and Paul [11] have redefined Khinchin's version of entropy and discussed the properties of aging classes based on their generalized entropy. Taking into consideration the concept of fuzzy sets, De Luca and Termini [3] suggested that, corresponding to Shannon's [15] probabilistic entropy, the measure of fuzzy entropy should be

    H(A) = -Σ_{i=1}^n [μ_A(x_i) log μ_A(x_i) + (1 - μ_A(x_i)) log(1 - μ_A(x_i))]   (1.1)
where μ_A(x_i) are the fuzzy membership values. Parkash [12] introduced a new generalized measure of fuzzy entropy involving two real parameters and studied many of its interesting properties. Parkash and Sharma [13] developed some measures of fuzzy entropy and obtained relationships among these measures, whereas applications of these measures to coding theory have been provided by Parkash and Sharma [14]. Guo and Xin [5] have extended Zadeh's [18] idea to study some new generalized entropy formulas for fuzzy sets. In 1957, Jaynes [7] proposed a rule to assign numerical values to probabilities in circumstances where certain partial information is available. Today this rule, known as the maximum entropy principle, is used in many fields, ranging from the physical, biological and social sciences to stock market analysis. Kapur and Kesavan [9] have studied the use of MaxEnt in all these applications. They have also studied the use of measures of entropy of a proportions distribution as measures of uncertainty, equality, spread, diversity, interactivity, flexibility, system complexity, disorder, dependence and similarity. Generalized measures of entropy and cross-entropy can be used for the study of optimization principles. Reliability theory, marketing, measurement of risk in portfolio analysis and quality control are some of the areas where these generalized optimization principles can be successfully applied. Herniter [6] used Shannon's [15] measure in studying market behavior and found an anomalous result, which was later overcome by using a parametric measure of entropy. Similarly, various other methods can be explained if we use generalized measures of entropy and directed divergence. The study of the maximum entropy principle undertaken by Herniter [6] and Kapur and Kesavan [9] is totally probabilistic in nature.
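For concreteness, Eq. (1.1) is easy to evaluate directly. The sketch below is plain Python (the function name and list input are our own choices, not from the paper) and adopts the usual convention 0·log 0 = 0.

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of Eq. (1.1):
    H(A) = -sum_i [mu_i*log(mu_i) + (1 - mu_i)*log(1 - mu_i)]."""
    h = 0.0
    for m in mu:
        for p in (m, 1.0 - m):
            if p > 0.0:  # 0*log(0) is taken as 0 by convention
                h -= p * math.log(p)
    return h
```

The value is maximal when every μ_A(x_i) = 1/2 (maximal fuzziness) and vanishes for crisp sets, as the definition intends.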
But there are situations where probabilistic measures of entropy do not work, and thus we explore the possibility of fuzzy measures to extend their scope of applications. Keeping this idea in mind, together with the concept of weighted entropy introduced by Belis and Guiasu [1], we have developed two new measures of weighted fuzzy entropy and have applied the findings to study the principle of maximum weighted fuzzy entropy. In the next section, we propose these two new weighted measures of fuzzy entropy.

2. Two new weighted measures of fuzzy entropy

The following weighted measures of fuzzy entropy have been introduced:

    H^1(A; W) = Σ_{i=1}^n w_i [sin(πμ_A(x_i)/2) + sin(π(1 - μ_A(x_i))/2) - 1]   (2.1)

    H^2(A; W) = Σ_{i=1}^n w_i [cos(πμ_A(x_i)/2) + cos(π(1 - μ_A(x_i))/2) - 1]   (2.2)
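A minimal numerical sketch of (2.1) and (2.2) in plain Python (mu and w are assumed to be equal-length lists of membership values and positive weights; the function names are ours):

```python
import math

def h1(mu, w):
    """Weighted fuzzy entropy of Eq. (2.1)."""
    return sum(wi * (math.sin(math.pi * m / 2)
                     + math.sin(math.pi * (1 - m) / 2) - 1)
               for m, wi in zip(mu, w))

def h2(mu, w):
    """Weighted fuzzy entropy of Eq. (2.2)."""
    return sum(wi * (math.cos(math.pi * m / 2)
                     + math.cos(math.pi * (1 - m) / 2) - 1)
               for m, wi in zip(mu, w))
```

Both measures vanish on crisp sets and peak at μ_A(x_i) = 1/2, where each summand equals w_i(√2 − 1).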
First of all, we check the validity of the weighted measures proposed in (2.1) and (2.2). For this purpose, we note the following properties of H^1(A; W):
(1) H^1(A; W) ≥ 0
(2) ∂²H^1(A; W)/∂μ_A(x_i)² = -(π/2)² w_i [sin(πμ_A(x_i)/2) + sin(π(1 - μ_A(x_i))/2)] < 0; thus H^1(A; W) is a concave function of μ_A(x_i) for all i
(3) H^1(A; W) does not change when μ_A(x_i) is replaced by 1 - μ_A(x_i)
(4) H^1(A; W) is an increasing function of μ_A(x_i) for 0 ≤ μ_A(x_i) ≤ 1/2
(5) H^1(A; W) is a decreasing function of μ_A(x_i) for 1/2 ≤ μ_A(x_i) ≤ 1
(6) H^1(A; W) = 0 for μ_A(x_i) = 0 or 1

Under these conditions, the measure H^1(A; W) proposed in (2.1) is a valid measure of weighted fuzzy entropy (refer to Kapur [8]). Proceeding along similar lines, it can easily be proved that the weighted measure H^2(A; W) proposed in (2.2) is also a valid measure of weighted fuzzy entropy. In the next section, we undertake the study of the maximum weighted fuzzy entropy principle.

3. Principle of maximum weighted fuzzy entropy

Here, we provide applications of the different existing as well as newly introduced weighted measures of fuzzy entropy to the study of maximum weighted fuzziness. For this study, we consider the following problems:

Problem I. Maximize

    H^3(A; W) = -Σ_{i=1}^n w_i [μ_A(x_i) log μ_A(x_i) + (1 - μ_A(x_i)) log(1 - μ_A(x_i))]   (3.1)
subject to the following fuzzy constraints

    Σ_{i=1}^n μ_A(x_i) = a_0   (3.2)

and

    Σ_{i=1}^n μ_A(x_i) g_r(x_i) = K   (3.3)
Consider the following Lagrangian:

    L = -Σ_{i=1}^n w_i [μ_A(x_i) log μ_A(x_i) + (1 - μ_A(x_i)) log(1 - μ_A(x_i))]
        - λ_1 {Σ_{i=1}^n μ_A(x_i) - a_0} - λ_2 {Σ_{i=1}^n μ_A(x_i) g_r(x_i) - K}   (3.4)
Thus ∂L/∂μ_A(x_i) = 0 gives

    μ_A(x_i) = 1 / (1 + e^{(λ_1 + λ_2 g_r(x_i))/w_i})
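This solution can be spot-checked numerically: at the logistic value the gradient of the Lagrangian must vanish. In the sketch below, the λ, g and w values are hypothetical illustrations only, not values fitted to any particular a_0 and K.

```python
import math

def mu_opt(lam1, lam2, gi, wi):
    """Stationary membership value for Problem I (logistic form above)."""
    return 1.0 / (1.0 + math.exp((lam1 + lam2 * gi) / wi))

def lagrangian_grad(lam1, lam2, gi, wi, mu):
    """dL/dmu_i for the Lagrangian (3.4):
    -w_i*log(mu/(1-mu)) - lam1 - lam2*g_r(x_i)."""
    return -wi * math.log(mu / (1.0 - mu)) - lam1 - lam2 * gi
```

Evaluating `lagrangian_grad` at `mu_opt` returns zero up to floating-point error for any admissible inputs.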
From Eqs. (3.2) and (3.3), we get

    Σ_{i=1}^n 1 / (1 + e^{(λ_1 + λ_2 g_r(x_i))/w_i}) = a_0   (3.5)

and

    Σ_{i=1}^n g_r(x_i) / (1 + e^{(λ_1 + λ_2 g_r(x_i))/w_i}) = K   (3.6)

where λ_1, λ_2 can be determined from (3.5) and (3.6). It is seen from (3.6) that to every value of λ_2 there corresponds a unique value of K, and vice versa. Further, let g_1 < g_2 < ⋯ < g_n. When λ_2 → -∞, K = Σ_{i=1}^n g_r(x_i) = n·ḡ and a_0 = n.
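In practice, Eqs. (3.5) and (3.6) have no closed-form solution for λ_1, λ_2. As an illustrative sketch (our own, not from the paper), one can fix λ_2 and bisect on λ_1, since the left-hand side of (3.5) is strictly decreasing in λ_1:

```python
import math

def mu_values(lam1, lam2, g, w):
    """Membership values 1/(1 + exp((lam1 + lam2*g_i)/w_i)) for all i."""
    return [1.0 / (1.0 + math.exp((lam1 + lam2 * gi) / wi))
            for gi, wi in zip(g, w)]

def solve_lam1(a0, lam2, g, w, lo=-50.0, hi=50.0):
    """Bisect on lam1 so that sum(mu) = a0, Eq. (3.5), with lam2 held fixed.
    Assumes a0 is attainable within the bracketing interval [lo, hi]."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if sum(mu_values(mid, lam2, g, w)) > a0:
            lo = mid  # sum too large -> increase lam1 to shrink the mu's
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The same idea, applied in an outer loop over λ_2 against Eq. (3.6), recovers both multipliers.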
When λ_2 → 0, K = Σ_{i=1}^n g_r(x_i) / (1 + e^{λ_1/w_i}) and a_0 = Σ_{i=1}^n 1 / (1 + e^{λ_1/w_i}). Thus when λ_2 > 0, g_1 < K < ḡ, and hence

    H^3_max(A; W) = -Σ_{i=1}^n w_i [ (1/(1 + e^{t_i})) log(1/(1 + e^{t_i}))
                    + (e^{t_i}/(1 + e^{t_i})) log(e^{t_i}/(1 + e^{t_i})) ]

where t_i = (λ_1 + λ_2 g_r(x_i))/w_i.
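That the stationary point is indeed a maximum can also be spot-checked termwise, since the Lagrangian (3.4) is separable and strictly concave in each μ_A(x_i). The sketch below uses hypothetical λ values:

```python
import math

def term(mu, gi, wi, lam1, lam2):
    """One summand of the Lagrangian (3.4) as a function of mu alone."""
    return (-wi * (mu * math.log(mu) + (1 - mu) * math.log(1 - mu))
            - lam1 * mu - lam2 * mu * gi)

def stationary_mu(gi, wi, lam1, lam2):
    """Logistic stationary point derived above."""
    return 1.0 / (1.0 + math.exp((lam1 + lam2 * gi) / wi))
```

Perturbing any single μ_A(x_i) away from its stationary value strictly decreases the corresponding term, confirming the maximum.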
Since -x log x is a concave function and a sum of concave functions is again concave, H^3(A; W) given in (3.1) is a concave function.

Problem II. In this problem, we apply another weighted measure to study the maximum weighted fuzzy entropy principle. For this, we consider the following problem: Maximize

    F(A; W) = -(1/a) Σ_{i=1}^n w_i [(1 + aμ_A(x_i)) log(1 + aμ_A(x_i))
              + (1 + a(1 - μ_A(x_i))) log(1 + a(1 - μ_A(x_i))) - (1 + a) log(1 + a)],  a > 0   (3.7)
subject to the set of fuzzy constraints given in (3.2) and (3.3). Consider the following Lagrangian:

    L = -(1/a) Σ_{i=1}^n w_i [(1 + aμ_A(x_i)) log(1 + aμ_A(x_i))
        + (1 + a(1 - μ_A(x_i))) log(1 + a(1 - μ_A(x_i))) - (1 + a) log(1 + a)]
        + λ_1 {Σ_{i=1}^n μ_A(x_i) - a_0} + λ_2 {Σ_{i=1}^n μ_A(x_i) g_r(x_i) - K}

Thus ∂L/∂μ_A(x_i) = 0 gives

    μ_A(x_i) = [(1 + a) e^{t_i} - 1] / [a (e^{t_i} + 1)],  where t_i = (λ_1 + λ_2 g_r(x_i))/w_i.

Applying the fuzzy constraints given in (3.2) and (3.3), we get

    Σ_{i=1}^n [(1 + a) e^{t_i} - 1] / [a (e^{t_i} + 1)] = a_0   (3.8)

and

    Σ_{i=1}^n [(1 + a) e^{t_i} - 1] / [a (e^{t_i} + 1)] g_r(x_i) = K   (3.9)
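As with Problem I, the closed form can be spot-checked against its first-order condition, log((1 + aμ)/(1 + a(1 − μ))) = (λ_1 + λ_2 g_r(x_i))/w_i. The values below are hypothetical; note that μ_A(x_i) remains in (0, 1) only for moderate multipliers.

```python
import math

def mu_problem2(a, lam1, lam2, gi, wi):
    """Stationary membership value for Problem II, in the form of Eq. (3.8)."""
    t = (lam1 + lam2 * gi) / wi
    return ((1 + a) * math.exp(t) - 1) / (a * (math.exp(t) + 1))
```

Substituting the returned value back into log((1 + aμ)/(1 + a(1 − μ))) reproduces (λ_1 + λ_2 g_r(x_i))/w_i exactly.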
Now when λ_2 → ∞, we have K = (1/a) Σ_{i=1}^n g_r(x_i) and a_0 = n/a.

Also, when λ_2 → 0, a_0 = Σ_{i=1}^n [(1 + a) e^{λ_1/w_i} - 1] / [a (e^{λ_1/w_i} + 1)] and K = Σ_{i=1}^n [(1 + a) e^{λ_1/w_i} - 1] / [a (e^{λ_1/w_i} + 1)] g_r(x_i).

Thus for λ_2 > 0, we have

    F_max(A; W) = -(1/a) Σ_{i=1}^n w_i [ ((a + 2)/(1 + e^{-t_i})) log((a + 2)/(1 + e^{-t_i}))
                  + ((a + 2) e^{-t_i}/(1 + e^{-t_i})) log((a + 2) e^{-t_i}/(1 + e^{-t_i}))
                  - (1 + a) log(1 + a) ]   (3.10)

where t_i = (λ_1 + λ_2 g_r(x_i))/w_i.
Since x log x is a convex function, each bracketed term in (3.7) is convex, and therefore F(A; W) is a concave function of the μ_A(x_i).

Problem III. In this problem, we maximize the weighted fuzzy entropy introduced in (2.1) under the set of fuzzy constraints given in (3.2) and (3.3).
Consider the following Lagrangian:

    L = Σ_{i=1}^n w_i [sin(πμ_A(x_i)/2) + sin(π(1 - μ_A(x_i))/2) - 1]
        + λ_1 {Σ_{i=1}^n μ_A(x_i) - a_0} + λ_2 {Σ_{i=1}^n μ_A(x_i) g_r(x_i) - K}   (3.11)

Differentiating Eq. (3.11) with respect to μ_A(x_i) and equating to zero, we get

    μ_A(x_i) = (1/2) [ (4/π) sin^{-1}{ √2 (λ_1 + λ_2 g_r(x_i)) / (π w_i) } + 1 ]
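The arcsine form can be verified against the stationarity condition w_i(π/2)[cos(πμ_A(x_i)/2) − sin(πμ_A(x_i)/2)] + λ_1 + λ_2 g_r(x_i) = 0. The sketch below uses hypothetical values small enough that the arcsine argument stays within [−1, 1]:

```python
import math

def mu_problem3(lam1, lam2, gi, wi):
    """Stationary membership value for Problem III (arcsine form above).
    Requires |sqrt(2)*(lam1 + lam2*g)/(pi*w)| <= 1."""
    c = lam1 + lam2 * gi
    return 0.5 * ((4 / math.pi)
                  * math.asin(math.sqrt(2) * c / (math.pi * wi)) + 1)
```

Substituting the returned μ into the first-order condition gives zero up to floating-point error.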
Applying the constraints (3.2) and (3.3), we get

    (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}{ √2 (λ_1 + λ_2 g_r(x_i)) / (π w_i) } + 1 ] = a_0

and

    (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}{ √2 (λ_1 + λ_2 g_r(x_i)) / (π w_i) } + 1 ] g_r(x_i) = K

When λ_2 → 0, we have

    a_0 = (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}( √2 λ_1 / (π w_i) ) + 1 ]

and

    K = (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}( √2 λ_1 / (π w_i) ) + 1 ] g_r(x_i)
Thus when λ_2 > 0, we have

    H^1_max(A; W) = Σ_{i=1}^n w_i [sin θ_i + cos θ_i - 1]   (3.12)

where θ_i = (π/4) [ (4/π) sin^{-1}{ √2 (λ_1 + λ_2 g_r(x_i)) / (π w_i) } + 1 ] = π μ_A(x_i)/2.

Thus H^1_max(A; W) = Σ_{i=1}^n f(θ_i, w_i), where

    f(θ, w_i) = w_i (sin θ + cos θ - 1),
    f'(θ, w_i) = w_i (cos θ - sin θ), and
    f''(θ, w_i) = -w_i (sin θ + cos θ) < 0   (3.13)

Thus (3.13) shows that H^1_max(A; W) is concave.
Problem IV. Here, we maximize the other weighted fuzzy entropy, introduced in (2.2), under the set of fuzzy constraints (3.2) and (3.3). The corresponding Lagrangian is given by

    L = Σ_{i=1}^n w_i [cos(πμ_A(x_i)/2) + cos(π(1 - μ_A(x_i))/2) - 1]
        + λ_1 {Σ_{i=1}^n μ_A(x_i) - a_0} + λ_2 {Σ_{i=1}^n μ_A(x_i) g_r(x_i) - K}   (3.14)

Differentiating Eq. (3.14) with respect to μ_A(x_i) and equating to zero, we get

    μ_A(x_i) = (1/2) [ (4/π) sin^{-1}{ √2 (λ_1 + λ_2 g_r(x_i)) / (π w_i) } + 1 ]
Applying the constraints (3.2) and (3.3), we get

    (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}{ √2 (λ_1 + λ_2 g_r(x_i)) / (π w_i) } + 1 ] = a_0

and

    (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}{ √2 (λ_1 + λ_2 g_r(x_i)) / (π w_i) } + 1 ] g_r(x_i) = K

Now when λ_2 → 0,

    a_0 = (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}( √2 λ_1 / (π w_i) ) + 1 ]

and

    K = (1/2) Σ_{i=1}^n [ (4/π) sin^{-1}( √2 λ_1 / (π w_i) ) + 1 ] g_r(x_i)
Thus when λ_2 > 0, we have

    H^2_max(A; W) = Σ_{i=1}^n w_i [cos θ_i + sin θ_i - 1]   (3.15)

Thus H^2_max(A; W) = Σ_{i=1}^n f(θ_i, w_i), with θ_i as in Problem III, where
    f(θ, w_i) = w_i (cos θ + sin θ - 1),
    f'(θ, w_i) = w_i (cos θ - sin θ), and
    f''(θ, w_i) = -w_i (sin θ + cos θ) < 0

Thus f''(θ, w_i) < 0 shows that H^2_max(A; W) is concave.

4. Conclusion

It has been observed that there is redundancy and overlap in similar situations which, if removed, can increase the efficiency of the process. The development of new parametric and non-parametric fuzzy measures of information will reduce uncertainty and thereby help to increase the efficiency of the process. It is therefore concluded that, although many information measures have been developed, there is still scope for better measures, which will find applications in a variety of fields. Keeping this in mind, we have developed two new weighted measures of fuzzy entropy and applied the results to optimization principles.

References

[1] M. Belis, S. Guiasu, A quantitative–qualitative measure of information in cybernetic systems, IEEE Trans. Inform. Theory 14 (1968) 593–594.
[2] Y. Chen, Properties of quasi-entropy and their applications, J. Southeast Univ. Nat. Sci. 36 (2) (2006) 222–225.
[3] A. De Luca, S. Termini, A definition of non-probabilistic entropy in the setting of fuzzy set theory, Inform. Contr. 20 (1971) 301–312.
[4] P. Garbaczewski, Differential entropy and dynamics of uncertainty, J. Stat. Phys. 123 (2006) 315–355.
[5] X.Z. Guo, X.L. Xin, Some new generalized entropy formulas of fuzzy sets, J. Northwest Univ. 36 (4) (2006) 529–532.
[6] J.D. Herniter, An entropy model of brand purchase behavior, J. Market. Res. 11 (1973) 20–29.
[7] E.T. Jaynes, Information theory and statistical mechanics, Phys. Rev. 106 (1957) 620–630.
[8] J.N. Kapur, Measures of Fuzzy Information, Mathematical Sciences Trust Society, New Delhi, 1997.
[9] J.N. Kapur, H.K. Kesavan, The Generalized Maximum Entropy Principle (with Applications), Standard Educational Press, 1987.
[10] G.J. Klir, Generalized information theory: aims, results and open problems, Reliab. Eng. Syst. Safety 85 (1–3) (2004) 21–38.
[11] A.K. Nanda, P. Paul, Some results on generalized residual entropy, Inform. Sci. 176 (1) (2006) 27–47.
[12] O. Parkash, A new parametric measure of fuzzy entropy, Inform. Process. Manage. Uncertain. 2 (1998) 1732–1737.
[13] O. Parkash, P.K. Sharma, Measures of fuzzy entropy and their relations, Int. J. Manage. Syst. 20 (1) (2004) 65–72.
[14] O. Parkash, P.K. Sharma, Noiseless coding theorems corresponding to fuzzy entropies, Southeast Asian Bull. Math. 27 (2004) 1073–1080.
[15] C.E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J. 27 (1948) 379–423, 623–659.
[16] N.D. Singpurwalla, J.M. Booker, Membership functions and probability measures of fuzzy sets, J. Amer. Stat. Assoc. 99 (467) (2004) 867–889.
[17] R.R. Yager, Uncertainty representation using fuzzy measures, IEEE Trans. Syst. Man Cybernet., Part B 32 (2002) 13–20.
[18] L.A. Zadeh, Fuzzy sets, Inform. Contr. 8 (1965) 338–353.
[19] L.A. Zadeh, Towards a perception-based theory of probabilistic reasoning with imprecise probabilities, J. Stat. Plann. Infer. 105 (2002) 233–264.
[20] L.A. Zadeh, Towards a generalized theory of uncertainty (GTU) – an outline, Inform. Sci. 172 (2005) 1–40.