INFORMATION SCIENCES 21, 187-194 (1980)

An Axiomatic Characterization of Nonadditive Information Improvement

D. S. HOODA
Department of Mathematics and Statistics, Haryana Agricultural University, Hissar 125004, India

Communicated by J. M. Richardson

ABSTRACT

A generalized nonadditive information improvement satisfying nonadditivity and containing the parameters $\alpha$, $\beta$, $\gamma$ has been characterized axiomatically by a general method. The particular cases of the new measure have also been studied.

1. INTRODUCTION

Let $\Delta_n$ denote the set of all finite discrete generalized probability distributions, defined as follows:

$$\Delta_n = \Big\{ P = (p_1, p_2, \ldots, p_n) : p_k \geq 0,\ \sum_{k=1}^{n} p_k \leq 1 \Big\}, \qquad n = 1, 2, \ldots.$$
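For concreteness, membership in $\Delta_n$ can be sketched in a few lines of Python (an illustration added here, not part of the paper; the function name is ours):

```python
def in_generalized_simplex(p, tol=1e-12):
    """Membership test for the set of generalized probability distributions:
    every component is nonnegative and the components sum to AT MOST 1
    (a generalized distribution, unlike an ordinary one, may be incomplete)."""
    return all(pk >= -tol for pk in p) and sum(p) <= 1 + tol

# An incomplete distribution (sum 0.6) is admitted, an overfull one is not.
print(in_generalized_simplex([0.2, 0.3, 0.1]))  # True
print(in_generalized_simplex([0.7, 0.7]))       # False
```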

If $P$, $Q$, and $R$ are three probability distributions from $\Delta_n$, then Theil's [13] measure of information improvement is given by

$$I(P|Q|R) = \sum_{i=1}^{n} p_i \log \frac{r_i}{q_i},$$  (1.1)

where $Q$ is the prediction probability distribution and $R$ is the revised probability distribution on the basis of the probability distribution $P$. The measure (1.1) has found many applications in economic analysis. This measure has also been characterized by Sharma and Autar [8]. A generalization of the measure (1.1), studied by Kannappan and Rathie [4] and Sharma and Autar [9], is given by

$$I_\alpha(P|Q|R) = (2^{\alpha-1}-1)^{-1}\bigg[\sum_{i=1}^{n} p_i \Big(\frac{r_i}{q_i}\Big)^{\alpha-1} - 1\bigg], \qquad \alpha \neq 1,$$  (1.2)

© Elsevier North Holland, Inc., 1980
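The passage from (1.2) back to (1.1) as $\alpha \to 1$ can be checked numerically. The sketch below (ours, not the paper's; it assumes base-2 logarithms, consistent with the normalizer $2^{\alpha-1}-1$) evaluates both measures on an arbitrary example:

```python
import math

def theil_improvement(p, q, r):
    # Theil's measure (1.1): sum of p_i * log2(r_i / q_i), where q_i is the
    # prediction and r_i the revised prediction of the outcome with probability p_i.
    return sum(pi * math.log2(ri / qi) for pi, qi, ri in zip(p, q, r))

def improvement_alpha(p, q, r, alpha):
    # Generalized measure (1.2), defined for alpha != 1.
    s = sum(pi * (ri / qi) ** (alpha - 1) for pi, qi, ri in zip(p, q, r))
    return (s - 1) / (2 ** (alpha - 1) - 1)

p = [0.5, 0.3, 0.2]    # observed distribution
q = [0.4, 0.4, 0.2]    # original prediction
r = [0.45, 0.35, 0.2]  # revised prediction

print(theil_improvement(p, q, r))
print(improvement_alpha(p, q, r, 1.001))  # close to the value above
```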


which reduces to (1.1) when $\alpha \to 1$. Some generalized measures of information improvement have been investigated by Taneja [11] and subadditive measures of information improvement by Gupta and Hooda [2]. The measures satisfying nonadditivity,

$$I(P*P'\,|\,Q*Q'\,|\,R*R') = I(P|Q|R) + I(P'|Q'|R') + k\, I(P|Q|R)\, I(P'|Q'|R'), \qquad k \neq 0,$$  (1.3)

where $P = (p_1, p_2, \ldots, p_n)$, $P' = (p'_1, p'_2, \ldots, p'_m)$, and $P*P' = (p_1 p'_1, \ldots, p_1 p'_m; \ldots; p_n p'_1, \ldots, p_n p'_m)$, have been studied by Sharma and Soni [10] and Taneja [12] using the sum property. Mittal [5] studied information improvement measures satisfying (1.3) and having a mean value representation. In this paper a generalized nonadditive information improvement satisfying (1.3) and containing the parameters $\alpha$, $\beta$, $\gamma$ has been characterized axiomatically by a general method. The particular cases of the new measure have also been studied.
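The nonadditivity law (1.3) can be verified numerically for the measure (1.2), whose nonadditivity constant is $k = 2^{\alpha-1}-1$. A minimal check (ours), using the product construction $P*P'$ defined above:

```python
def improvement_alpha(p, q, r, alpha):
    # Generalized information improvement (1.2), alpha != 1.
    s = sum(pi * (ri / qi) ** (alpha - 1) for pi, qi, ri in zip(p, q, r))
    return (s - 1) / (2 ** (alpha - 1) - 1)

def star(d1, d2):
    # P * P' = (p_1 p'_1, ..., p_1 p'_m; ...; p_n p'_1, ..., p_n p'_m).
    return [a * b for a in d1 for b in d2]

alpha = 2.0
k = 2 ** (alpha - 1) - 1  # the nonadditivity constant in (1.3)

P,  Q,  R  = [0.6, 0.4], [0.5, 0.5], [0.55, 0.45]
P2, Q2, R2 = [0.3, 0.7], [0.4, 0.6], [0.35, 0.65]

lhs = improvement_alpha(star(P, P2), star(Q, Q2), star(R, R2), alpha)
i1  = improvement_alpha(P, Q, R, alpha)
i2  = improvement_alpha(P2, Q2, R2, alpha)
print(abs(lhs - (i1 + i2 + k * i1 * i2)) < 1e-12)  # True: (1.3) holds
```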

2. CHARACTERIZATION OF NONADDITIVE INFORMATION IMPROVEMENT

Let $I(P|Q|R)$ be an information improvement measure such that $P, Q, R \in \Delta_n$, $n = 1, 2, 3, \ldots$. We assume that $I(P|Q|R)$ satisfies the following postulates:

POSTULATE (i). $I(\{p\}|\{1\}|\{1\})$, $I(\{1\}|\{q\}|\{1\})$, and $I(\{1\}|\{1\}|\{r\})$ are continuous functions of $p$, $q$, and $r$ respectively for $p, q, r \in (0,1]$.

POSTULATE (ii). For $p, q, r, x, y, z \in (0,1]$,

$$I(\{px\}|\{qy\}|\{rz\}) = I(\{p\}|\{q\}|\{r\}) + I(\{x\}|\{y\}|\{z\}) + k\, I(\{p\}|\{q\}|\{r\})\, I(\{x\}|\{y\}|\{z\}),$$

where $k \neq 0$ is some constant.

We prove the following theorem:

THEOREM 1. The measure $I(\{p\}|\{q\}|\{r\})$ of information improvement for $p, q, r \in (0,1]$ satisfying Postulates (i) and (ii) is given by

$$I(\{p\}|\{q\}|\{r\}) = \frac{p^{\delta} q^{\lambda} r^{\mu} - 1}{2^{1-\alpha} - 1}, \qquad \alpha \neq 1,$$  (2.1)

where $\delta$, $\lambda$, $\mu$ are arbitrary constants.

Proof. Let

$$I(\{p\}|\{q\}|\{r\}) = f(p,q,r).$$  (2.2)

Postulate (ii) for $n = 1$ gives

$$f(px, qy, rz) = f(p,q,r) + f(x,y,z) + k\, f(p,q,r)\, f(x,y,z)$$  (2.3)

and

$$f(p,q,r) = f(p,q,1) + f(1,1,r) + k\, f(p,q,1)\, f(1,1,r).$$  (2.4)

Further,

$$f(p,q,1) = f(p,1,1) + f(1,q,1) + k\, f(p,1,1)\, f(1,q,1).$$  (2.5)

Setting $p = q = 1$, $x = y = 1$ in (2.3), we have

$$f(1,1,rz) = f(1,1,r) + f(1,1,z) + k\, f(1,1,r)\, f(1,1,z),$$  (2.6)

which can be written as

$$1 + k f(1,1,rz) = [1 + k f(1,1,r)][1 + k f(1,1,z)].$$  (2.7)

Setting $1 + k f(1,1,r) = H(r)$ in (2.7), we get

$$H(rz) = H(r)\, H(z).$$  (2.8)

By Postulate (i), $f(1,1,r)$ is a continuous function of $r$; hence $H(r)$ is also a continuous function of $r$, and thus the only nonzero continuous solutions of (2.8) (see Aczél [1, p. 41]) are of the form $H(r) = r^{\mu}$, $\mu$ being some arbitrary constant. Therefore

$$f(1,1,r) = \frac{r^{\mu} - 1}{k},$$  (2.9)

where $\mu$ and $k \neq 0$ are arbitrary constants.

Similarly

$$f(p,1,1) = \frac{p^{\delta} - 1}{k},$$  (2.10)

$$f(1,q,1) = \frac{q^{\lambda} - 1}{k},$$  (2.11)

where $\lambda$, $\delta$, and $k \neq 0$ are arbitrary constants. Putting (2.10) and (2.11) in (2.5), we get

$$f(p,q,1) = \frac{p^{\delta} q^{\lambda} - 1}{k}.$$  (2.12)

Finally, putting (2.12) and (2.9) in (2.4), we have

$$f(p,q,r) = \frac{p^{\delta} q^{\lambda} r^{\mu} - 1}{k},$$  (2.13)

where $\delta$, $\lambda$, $\mu$, and $k \neq 0$ are arbitrary constants. In particular, taking $k = 2^{1-\alpha} - 1$, $\alpha \neq 1$, (2.13) gives (2.1).

Now we characterize the nonadditive measure of information improvement in the next theorem.

THEOREM 2. The nonadditive measure of information improvement $I(P|Q|R)$ satisfying Postulates (i) and (ii) and

POSTULATE (iii). $I(\{\tfrac{1}{2}\}|\{\tfrac{1}{2}\}|\{1\}) = 1$,

POSTULATE (iv). $I(\{1\}|\{\tfrac{1}{2}\}|\{\tfrac{1}{2}\}) = 0$,

POSTULATE (v). $I(P|Q|R) = \dfrac{\sum_{i=1}^{n} p_i^{\beta}\, I(\{p_i\}|\{q_i\}|\{r_i\})}{W_\beta(P)}$,

where $\beta > 0$ is such that

$$W_\beta(P) = \sum_{i=1}^{n} w_\beta(p_i) = p_1^{\beta} + p_2^{\beta} + \cdots + p_n^{\beta} \leq 1,$$

is given by

$$I_{\alpha}^{(\beta,\gamma)}(P|Q|R) = \bigg[\frac{\sum_{i=1}^{n} p_i^{\beta+\gamma-1} q_i^{\alpha-\gamma} r_i^{\gamma-\alpha}}{\sum_{i=1}^{n} p_i^{\beta}} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,\ \alpha \neq \gamma.$$  (2.14)

Proof. By Postulate (iii), (2.13) gives

$$\alpha = 1 + \delta + \lambda = \gamma + \lambda \quad \text{(say)}.$$  (2.15)

By Postulate (iv), (2.13) gives

$$\mu + \lambda = 0.$$  (2.16)

Now (2.13) in view of (2.15) and (2.16) becomes

$$I(\{p\}|\{q\}|\{r\}) = \frac{p^{\gamma-1} q^{\alpha-\gamma} r^{\gamma-\alpha} - 1}{2^{1-\alpha} - 1}, \qquad \alpha \neq 1,\ \alpha \neq \gamma.$$  (2.17)

By using (2.17) in Postulate (v) we get

$$I(\{p_1, \ldots, p_n\}|\{q_1, \ldots, q_n\}|\{r_1, \ldots, r_n\}) = \bigg[\frac{\sum_{i=1}^{n} p_i^{\beta+\gamma-1} q_i^{\alpha-\gamma} r_i^{\gamma-\alpha}}{\sum_{i=1}^{n} p_i^{\beta}} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,\ \alpha \neq \gamma;$$  (2.18)

hence the theorem is proved. The measure (2.14) may be called the information improvement of order $\alpha$ and type $(\beta, \gamma)$ and will be denoted by $I_{\alpha}^{(\beta,\gamma)}(P|Q|R)$.
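A direct numerical check (ours; the function name and parameter values are illustrative) confirms that the measure (2.14) satisfies Postulates (iii)-(v):

```python
def hooda_improvement(p, q, r, alpha, beta, gamma):
    # The information improvement of order alpha and type (beta, gamma), eq. (2.14).
    num = sum(pi ** (beta + gamma - 1) * qi ** (alpha - gamma) * ri ** (gamma - alpha)
              for pi, qi, ri in zip(p, q, r))
    den = sum(pi ** beta for pi in p)
    return (num / den - 1) / (2 ** (1 - alpha) - 1)

alpha, beta, gamma = 2.0, 1.5, 0.8

# Postulate (iii): I({1/2}|{1/2}|{1}) = 1.
assert abs(hooda_improvement([0.5], [0.5], [1.0], alpha, beta, gamma) - 1) < 1e-12

# Postulate (iv): I({1}|{1/2}|{1/2}) = 0.
assert abs(hooda_improvement([1.0], [0.5], [0.5], alpha, beta, gamma)) < 1e-12

# Postulate (v): the full measure is the p_i^beta-weighted mean of the
# single-element values (2.17).
P, Q, R = [0.4, 0.3, 0.2], [0.3, 0.3, 0.3], [0.35, 0.3, 0.25]
w = [pi ** beta for pi in P]
mean = sum(wi * hooda_improvement([pi], [qi], [ri], alpha, beta, gamma)
           for wi, pi, qi, ri in zip(w, P, Q, R)) / sum(w)
assert abs(mean - hooda_improvement(P, Q, R, alpha, beta, gamma)) < 1e-9
```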

3. PARTICULAR CASES

(a) Taking $\beta = 1$, $\gamma = 1$ in (2.14), we get

$$I_{\alpha}^{(1,1)}(P|Q|R) = \bigg[\frac{\sum_{i=1}^{n} p_i (q_i/r_i)^{\alpha-1}}{\sum_{i=1}^{n} p_i} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1.$$  (3.1)

Further

$$\lim_{\alpha \to 1} I_{\alpha}^{(1,1)}(P|Q|R) = \frac{\sum_{i=1}^{n} p_i \log (r_i/q_i)}{\sum_{i=1}^{n} p_i},$$  (3.2)

which is Theil's [13] measure of information improvement for generalized probability distributions.

(b) Taking $R = Q$ in (2.14), we get

$$I_{\alpha}^{(\beta,\gamma)}(P|Q|Q) = \bigg[\frac{\sum_{i=1}^{n} p_i^{\beta+\gamma-1}}{\sum_{i=1}^{n} p_i^{\beta}} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,$$  (3.3)

which is the nonadditive measure of entropy of order $\alpha$ and type $(\beta, \gamma)$ for a generalized probability distribution. When $\alpha = \gamma$, (3.3) reduces to

$$I_{\alpha}^{(\beta,\alpha)}(P|Q|Q) = \bigg[\frac{\sum_{i=1}^{n} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} p_i^{\beta}} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,$$  (3.4)

which is a nonadditive measure of entropy studied by Patni and Jain [6]. Further taking $\beta = 1$, we get

$$I_{\alpha}^{(1,\alpha)}(P|Q|Q) = \bigg[\frac{\sum_{i=1}^{n} p_i^{\alpha}}{\sum_{i=1}^{n} p_i} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,$$  (3.5)

which is Havrda and Charvát's [3] $\alpha$-entropy for the generalized probability distribution, studied by Vajda [14].
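That (2.14) with $R = Q$, $\beta = 1$, $\gamma = \alpha$ collapses to the $\alpha$-entropy (3.5), independently of the prediction $Q$, is easy to confirm numerically (an illustrative sketch, not from the paper):

```python
def hooda_improvement(p, q, r, alpha, beta, gamma):
    # The general measure (2.14).
    num = sum(pi ** (beta + gamma - 1) * qi ** (alpha - gamma) * ri ** (gamma - alpha)
              for pi, qi, ri in zip(p, q, r))
    den = sum(pi ** beta for pi in p)
    return (num / den - 1) / (2 ** (1 - alpha) - 1)

def havrda_charvat(p, alpha):
    # Havrda-Charvat alpha-entropy for a generalized distribution, eq. (3.5).
    return (sum(pi ** alpha for pi in p) / sum(p) - 1) / (2 ** (1 - alpha) - 1)

alpha = 2.0
P = [0.5, 0.3, 0.2]
Q = [0.25, 0.45, 0.3]  # arbitrary: it cancels out of (3.5)
assert abs(hooda_improvement(P, Q, Q, alpha, 1.0, alpha) - havrda_charvat(P, alpha)) < 1e-12
```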

(c) Taking $R = P$ in (2.14), we get

$$I_{\alpha}^{(\beta,\gamma)}(P|Q|P) = \bigg[\frac{\sum_{i=1}^{n} p_i^{2\gamma+\beta-\alpha-1} q_i^{\alpha-\gamma}}{\sum_{i=1}^{n} p_i^{\beta}} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,$$  (3.6)

which is the nonadditive relative information of order $\alpha$ and type $(\beta, \gamma)$. Further taking $\gamma = 1$, (3.6) becomes

$$I_{\alpha}^{(\beta,1)}(P|Q|P) = \bigg[\frac{\sum_{i=1}^{n} p_i^{\beta+1-\alpha} q_i^{\alpha-1}}{\sum_{i=1}^{n} p_i^{\beta}} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,$$  (3.7)

which is the nonadditive relative information of order $\alpha$ and type $\beta$ studied by Patni and Jain [6]. By taking $\beta = 1$, (3.7) reduces to

$$I_{\alpha}^{(1,1)}(P|Q|P) = \bigg[\frac{\sum_{i=1}^{n} p_i^{2-\alpha} q_i^{\alpha-1}}{\sum_{i=1}^{n} p_i} - 1\bigg] (2^{1-\alpha}-1)^{-1}, \qquad \alpha \neq 1,$$  (3.8)

which is the relative information of order $\alpha$ studied by Rathie and Kannappan [7].
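By analogy with (3.2), one may check numerically that (3.8) tends, as $\alpha \to 1$, to the directed divergence $\sum p_i \log(p_i/q_i)$; this limit is our observation, sketched under the base-2 convention used above:

```python
import math

def relative_information(p, q, alpha):
    # Eq. (3.8): relative information of order alpha (here for complete
    # distributions, sum p_i = 1).
    num = sum(pi ** (2 - alpha) * qi ** (alpha - 1) for pi, qi in zip(p, q))
    return (num / sum(p) - 1) / (2 ** (1 - alpha) - 1)

P = [0.5, 0.3, 0.2]
Q = [0.25, 0.45, 0.3]
kl = sum(pi * math.log2(pi / qi) for pi, qi in zip(P, Q))  # directed divergence
print(relative_information(P, Q, 1.0001), kl)  # nearly equal
```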

The author wishes to thank Dr. H. C. Gupta, Department of Mathematics, Rajdhani College, New Delhi, for discussion and guidance in the preparation of this paper. He is also thankful to Dr. R. K. Tuteja, Department of Mathematics, Kurukshetra University, Kurukshetra, for his help and encouragement. Thanks are also due to Professor C. Mohan, K. U. Kurukshetra, and Professor O. P. Srivastava, H. A. U., Hissar, for providing facilities in their departments.

REFERENCES

1. J. Aczél, Lectures on Functional Equations and Their Applications, Academic, New York, 1966.
2. H. C. Gupta and D. S. Hooda, Sub-additive measures of information improvement, to appear.
3. J. Havrda and F. Charvát, Quantification method of classification processes: the concept of structural α-entropy, Kybernetika 3:30-35 (1967).
4. Pl. Kannappan and P. N. Rathie, Application of a functional equation to information theory, Ann. Polon. Math. 26:95-101 (1972).
5. D. P. Mittal, On non-additive information improvement, Elektron. Informationsverarbeit. Kybernetik 12:187-192 (1976).
6. G. C. Patni and K. C. Jain, On axiomatic characterization of some non-additive measures of information, Metrika 24:23-34 (1977).
7. P. N. Rathie and Pl. Kannappan, A directed divergence function of type β, Information and Control 20:38-45 (1972).

8. B. D. Sharma and R. Autar, Information improvement functions, Econometrica 42:103-112 (1974).
9. B. D. Sharma and R. Autar, Information improvement, Metrika 22:27-34 (1975).
10. B. D. Sharma and R. S. Soni, Additive and non-additive measures of comparison and I.I., Elektron. Informationsverarbeit. Kybernetik 11:489-504 (1975).
11. I. J. Taneja, On axiomatic characterization of information improvement and its generalization, J. Comb. Inf. and Systems Sci. 1(2):69-79 (1976).
12. I. J. Taneja, A generalized functional equation of three variables in information theory, Funkcial. Ekvac. 19(2):139-147 (1976).
13. H. Theil, Economics and Information Theory, North-Holland, Amsterdam, 1967.
14. I. Vajda, Axioms for α-entropy of a generalized probability scheme, Kybernetika 4:105-112 (1968).

Received April 1979