On local dependence and stochastic inequalities with applications to contingency tables


Applied Mathematics and Computation 151 (2004) 801–813 www.elsevier.com/locate/amc

Broderick O. Oluyede
Department of Mathematics and Computer Science, Georgia Southern University, P.O. Box 8093, Statesboro, GA 30460-8093, USA

Abstract

Inequalities, local dependence, and orderings of random variables and their distribution functions are presented. The notions of local dependence and stochastic ordering are studied and compared. Inequalities and association in ordinal contingency tables, as well as applications of families of distribution functions satisfying the notions discussed, are presented.
© 2003 Elsevier Inc. All rights reserved.

1. Introduction

In a two-way contingency table in which the row variable, the column variable, or both are ordinal, association between the variables can be defined in various ways. The problems arising in the analysis of contingency tables with ordered categories usually go beyond a test for independence; generally, a specific analysis of the pattern of association is required. In this note, concepts of association for random variables, including ordinal variables, are defined and compared with several other notions of dependence in the literature. See Karlin [5], Jogdeo [4], Esary and Proschan [2], Shaked and Shanthikumar [7] and references therein. It is natural to investigate whether bivariate distributions are more locally positive (negative) dependent according

E-mail address: [email protected] (B.O. Oluyede).
0096-3003/$ - see front matter © 2003 Elsevier Inc. All rights reserved.
doi:10.1016/S0096-3003(03)00537-X


to some positive (negative) dependence ordering as the parameter increases or decreases. In light of this, we discuss the dependence structure of families of fixed-marginal bivariate distributions indexed by a real parameter. Examples and applications of families of distributions satisfying the dependence properties are given. Notions of association in ordinal contingency tables are also considered and studied. In Section 2, some basic notions of dependence are given and compared. Section 3 deals with association in ordinal contingency tables. In Section 4, examples and applications of families of distribution functions satisfying the notions of dependence are presented.

2. Some basic notions of dependence

In this section we give some basic definitions and discuss relationships among some notions of dependence presented in this paper and in the literature.

Definition 2.1. Absolutely continuous random variables $X$ and $Y$ having a joint density function $f(x,y)$ are locally positive (negative) dependent if and only if

$$F(x,y)\,f(x,y) \ge (\le) \int_{-\infty}^{x} f(u,y)\,\mathrm{d}u \int_{-\infty}^{y} f(x,v)\,\mathrm{d}v, \qquad (2.1)$$

where $F(x,y) = P(X \le x,\, Y \le y)$ is the distribution function of the random vector $(X,Y)$. If (2.1) holds we say $(X,Y)$ is locally positive (negative) dependent, denoted LPD$(X,Y)$ (LND$(X,Y)$), or simply LPD (LND).

Definition 2.2 (Yanagimoto [8]). A random vector $(X,Y)$ or its distribution function $F(x,y)$ is said to be in

(i) $P(3,3)$ if
$$P(a_1 < X \le a_2,\, b_2 < Y \le b_3)\, P(a_2 < X \le a_3,\, b_1 < Y \le b_2) \le P(a_2 < X \le a_3,\, b_2 < Y \le b_3)\, P(a_1 < X \le a_2,\, b_1 < Y \le b_2)$$
for all $a_1 < a_2 < a_3$ and $b_1 < b_2 < b_3$.

(ii) $P(2',2)$ if
$$P(X > a_2,\, b_1 < Y \le b_2)\, P(a_1 < X \le a_2,\, Y > b_2) \le P(X > a_2,\, Y > b_2)\, P(a_1 < X \le a_2,\, b_1 < Y \le b_2)$$
for all $a_1 < a_2$ and $b_1 < b_2$.


(iii) $P(2',3)$ if
$$P(a_1 < X \le a_2,\, b_2 < Y \le b_3)\, P(X > a_2,\, b_1 < Y \le b_2) \le P(X > a_2,\, b_2 < Y \le b_3)\, P(a_1 < X \le a_2,\, b_1 < Y \le b_2)$$
for all $a_1 < a_2$ and $b_1 < b_2 < b_3$.

(iv) $P(2,2')$ if
$$P(a_1 < X \le a_2,\, Y > b_2)\, P(X > a_2,\, b_1 < Y \le b_2) \le P(X > a_2,\, Y > b_2)\, P(a_1 < X \le a_2,\, b_1 < Y \le b_2)$$
for all $a_1 < a_2$ and $b_1 < b_2$.

(v) $P(3,2')$ if
$$P(a_2 < X \le a_3,\, b_1 < Y \le b_2)\, P(a_1 < X \le a_2,\, Y > b_2) \le P(a_2 < X \le a_3,\, Y > b_2)\, P(a_1 < X \le a_2,\, b_1 < Y \le b_2)$$
for all $a_1 < a_2 < a_3$ and $b_1 < b_2$.

(vi) $P(3,2'')$ if
$$P(a_1 < X \le a_2,\, b_1 < Y \le b_2)\, P(a_2 < X \le a_3,\, Y < b_1) \le P(a_2 < X \le a_3,\, b_1 < Y \le b_2)\, P(a_1 < X \le a_2,\, Y < b_1)$$
for all $a_1 < a_2 < a_3$ and $b_1 < b_2$.

(vii) $P(2'',3)$ if
$$P(a_1 < X \le a_2,\, b_1 < Y \le b_2)\, P(X \le a_1,\, b_2 < Y \le b_3) \le P(a_1 < X \le a_2,\, b_2 < Y \le b_3)\, P(X \le a_1,\, b_1 < Y \le b_2)$$
for all $a_1 < a_2$ and $b_1 < b_2 < b_3$.

(viii) $P(2'',2'')$ if
$$P(X \le a_1,\, b_1 < Y \le b_2)\, P(a_1 < X \le a_2,\, Y \le b_1) \le P(X \le a_1,\, Y \le b_1)\, P(a_1 < X \le a_2,\, b_1 < Y \le b_2)$$
for all $a_1 < a_2$ and $b_1 < b_2$.

Theorem 2.1. Let $(X,Y)$ be an absolutely continuous random vector with distribution function $F(x,y)$.

(i) If $(X,Y)$ is in any one of $P(3,3)$, $P(3,2'')$, $P(2'',3)$ or $P(2'',2'')$, then $(X,Y)$ is LPD.
(ii) If $(X,Y)$ is in any one of $P(2',2)$, $P(2,2')$, $P(2',3)$ or $P(3,2')$, then $(X,Y)$ is HPD, where HPD (HND) is defined by

$$\bar F(x,y)\,f(x,y) \ge (\le) \int_{x}^{\infty} f(u,y)\,\mathrm{d}u \int_{y}^{\infty} f(x,v)\,\mathrm{d}v, \qquad (2.2)$$


and HPD (HND) stands for hazard positive (negative) dependent; here $\bar F(x,y) = P(X \ge x,\, Y \ge y)$ is the survival function of the random vector $(X,Y)$. The converses of the assertions in Theorem 2.1 do not hold, as can be readily verified by constructing appropriate examples.

Theorem 2.2. Let $(X,Y)$ be an absolutely continuous random vector. Then $(X,Y)$ is LPD if and only if $(aX + b,\, cY + d)$ is LPD for all $a > 0$, $c > 0$ and $-\infty < b, d < \infty$.

Proof. Let $F$ be the distribution function of $(X,Y)$ and let $X_1 = aX + b$, $Y_1 = cY + d$. Applying the LPD inequality (2.1) for $(X,Y)$ at the point $((x-b)/a,\, (y-d)/c)$ gives

$$\frac{1}{ac} \int_{-\infty}^{(x-b)/a} f\Big(u, \frac{y-d}{c}\Big)\,\mathrm{d}u \int_{-\infty}^{(y-d)/c} f\Big(\frac{x-b}{a}, v\Big)\,\mathrm{d}v \le F\Big(\frac{x-b}{a}, \frac{y-d}{c}\Big)\, \frac{1}{ac}\, f\Big(\frac{x-b}{a}, \frac{y-d}{c}\Big).$$

A change of variables shows that the two sides are precisely the product of partial integrals and $F_1(x,y)f_1(x,y)$ for the pair $(X_1, Y_1)$, so $(X_1, Y_1)$ is LPD; the argument is reversible. □
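As a numerical illustration of the LPD inequality (2.1) (not part of the original paper), the following sketch checks it on a grid for the Farlie–Gumbel–Morgenstern family with uniform marginals; the uniform-marginal choice is an assumption made so that the distribution function and the partial integrals have closed forms.

```python
import numpy as np

# Hypothetical check of the LPD inequality (2.1) for the FGM family
# with Uniform(0,1) marginals (an illustrative assumption):
# H(x,y) = xy[1 + a(1-x)(1-y)],  h(x,y) = 1 + a(1-2x)(1-2y).
def H(x, y, a):
    return x * y * (1 + a * (1 - x) * (1 - y))

def h(x, y, a):
    return 1 + a * (1 - 2 * x) * (1 - 2 * y)

def Ix(x, y, a):
    # closed form of int_0^x h(u, y) du
    return x * (1 + a * (1 - x) * (1 - 2 * y))

def Iy(x, y, a):
    # closed form of int_0^y h(x, v) dv
    return y * (1 + a * (1 - 2 * x) * (1 - y))

a = 0.5  # 0 <= a <= 1 should yield LPD
grid = np.linspace(0.01, 0.99, 50)
lpd_holds = all(H(x, y, a) * h(x, y, a) >= Ix(x, y, a) * Iy(x, y, a) - 1e-12
                for x in grid for y in grid)
print(lpd_holds)
```

For these marginals the difference $H \cdot h - I_x \cdot I_y$ simplifies to $a x^2 y^2$, so the grid check should succeed exactly when $a \ge 0$.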

Theorem 2.3. Let $F(x,y)$ be a strictly positive bivariate distribution function which is twice differentiable. Then $F$ is LPD if and only if

$$\frac{\partial^2 \log F(x,y)}{\partial x\, \partial y} \ge 0.$$

Definition 2.3
(i) (Shaked [6]): An interchangeable random vector $(X,Y)$ (that is, a random vector with a permutation-invariant distribution) or its distribution function $F$ is diagonal square dependent (DSD) if

$$P(X \in I,\, Y \in I) \ge P(X \in I)\,P(Y \in I) \qquad (2.3)$$

for every interval $I$.
(ii) $(X,Y)$ or its distribution function $F$ is generalized diagonal square dependent (GDSD) if

$$P(X \in B,\, Y \in B) \ge P(X \in B)\,P(Y \in B) \qquad (2.4)$$

for every Borel measurable set $B$ in $\mathbb{R}$.
(iii) An exchangeable random vector $(X,Y)$ or its distribution function $F$ is positively dependent by expansion (PDE) if $F$ admits the expansion

$$\mathrm{d}F(x,y) = \mathrm{d}G(x)\,\mathrm{d}G(y)\Big[1 + \sum_{i=1}^{\infty} a_i\, \phi_i(x)\phi_i(y)\Big] \quad \text{a.e.}, \qquad (2.5)$$

where $G$ is the univariate marginal of $F$ and $\{\phi_i\}$ is a set of functions satisfying

$$\int_{-\infty}^{\infty} \phi_i(x)\,\mathrm{d}G(x) = 0, \quad i = 1, 2, \ldots,$$
$$\int_{-\infty}^{\infty} \phi_i(x)\phi_j(x)\,\mathrm{d}G(x) = \delta_{ij}, \quad i, j = 1, 2, \ldots$$
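As an illustration (not from the paper), condition (2.3) can be checked mechanically for a distribution on a finite ordered support, since the intervals $I$ reduce to contiguous blocks of support points; the numerical tolerance and the diagonal example table are assumptions.

```python
# Sketch: diagonal square dependence (2.3) for a finitely supported
# joint distribution p (rows = X values, columns = Y values, both
# listed in increasing order). Intervals become contiguous index blocks.
def dsd(p):
    n = len(p)
    row_marg = [sum(row) for row in p]
    col_marg = [sum(p[i][j] for i in range(n)) for j in range(n)]
    return all(
        sum(p[i][j] for i in range(lo, hi) for j in range(lo, hi))
        >= sum(row_marg[lo:hi]) * sum(col_marg[lo:hi]) - 1e-12
        for lo in range(n) for hi in range(lo + 1, n + 1)
    )

# A diagonal table is trivially DSD:
print(dsd([[0.5, 0.0], [0.0, 0.5]]))
```

The same predicate applied to the 3 x 3 table in the proof of Proposition 2.1(iii) below returns False, matching the claim there that LPD does not imply DSD.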

Proposition 2.1. Let $(X,Y)$ be an interchangeable random vector. Then

(i) $(X,Y)$ is PDE does not imply $(X,Y)$ is LPD.
(ii) $(X,Y)$ is GDSD does not imply $(X,Y)$ is LPD.
(iii) $(X,Y)$ is LPD does not imply $(X,Y)$ is DSD.
(iv) $(X,Y)$ is PDM does not imply $(X,Y)$ is LPD.

Proof
(i) See Shaked [6].
(ii) Let $(X_n, Y_n)$, $n = 1, 2, \ldots$, be discrete random vectors with the following joint distribution:

              Y_n = -1                 Y_n = 0       Y_n = 1
  X_n = -1    2/(n+2)^2                0             1/(n+2) - 2/(n+2)^2
  X_n =  0    0                        n/(n+2)       0
  X_n =  1    1/(n+2) - 2/(n+2)^2      0             2/(n+2)^2

Then the vectors $(X_n, Y_n)$, $n \ge 1$, are GDSD but not LPD.
(iii) Let $(X,Y)$ be a discrete random vector with the following joint probabilities:

              Y = b_1    Y = b_2    Y = b_3
  X = b_1     0.2        0          0
  X = b_2     0.4        0.1        0
  X = b_3     0          0.2        0.1


Then $(X,Y)$ is LPD but not DSD, since $P(X = b_2,\, Y = b_2) < P(X = b_2)\,P(Y = b_2)$.
(iv) Since PDM $\Rightarrow$ PDD $\Rightarrow$ GDSD, and by (ii) GDSD does not imply LPD, it follows that PDM does not imply LPD. □

Proposition 2.2. For the class of $2 \times 2$ tables with given marginals,
$$F \le_{\mathrm{LPD}} G \iff F \le_{T} G \iff F \le_{\mathrm{PQD}} G.$$

Here we consider the dependence structure of families of fixed-marginal bivariate distributions indexed by a real parameter. It is therefore natural to investigate whether bivariate distributions become more locally positive dependent according to some positive dependence ordering as the parameter increases or decreases.

Theorem 2.4. For the Ali–Mikhail–Haq [1] and the Farlie–Gumbel–Morgenstern [3] families of distributions, $0 \le a_1 < a_2 \le 1$ implies
$$H(x, y; a_1) \le_{\mathrm{LPD}} H(x, y; a_2).$$

Proof. The Ali–Mikhail–Haq [1] family of distributions is given by
$$H(x, y; a) = \frac{H_1(x)\, H_2(y)}{1 - a\, \bar H_1(x) \bar H_2(y)}, \quad -1 \le a \le 1,$$
so that
$$\frac{\partial^2 \ln H(x, y; a_i)}{\partial x\, \partial y} = \frac{a_i\, h_1(x) h_2(y)}{[1 - a_i\, \bar H_1(x) \bar H_2(y)]^2}, \quad i = 1, 2,$$
and, with $t = \bar H_1(x) \bar H_2(y)$,
$$\frac{\partial^2 \ln H(x, y; a_2)}{\partial x\, \partial y} \ge \frac{\partial^2 \ln H(x, y; a_1)}{\partial x\, \partial y} \iff \frac{a_2}{[1 - a_2 t]^2} \ge \frac{a_1}{[1 - a_1 t]^2}$$
$$\iff a_2 [1 - a_1 t]^2 \ge a_1 [1 - a_2 t]^2 \iff a_2 - a_1 + a_1^2 a_2 t^2 - a_1 a_2^2 t^2 \ge 0 \iff (a_2 - a_1)(1 - a_1 a_2 t^2) \ge 0.$$

But $0 \le t^2 = [\bar H_1(x) \bar H_2(y)]^2 \le 1$, $0 \le a_i \le 1$ and $0 \le \bar H_1(x), \bar H_2(y) \le 1$, so that $a_1 a_2 [\bar H_1(x) \bar H_2(y)]^2 \le 1$. Hence if $a_2 > a_1$, then $H(x, y; a_1) \le_{\mathrm{LPD}} H(x, y; a_2)$. The proof for the Farlie–Gumbel–Morgenstern family of distributions is similar and is omitted. □
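The key step in the proof of Theorem 2.4 can be exercised numerically (an illustrative sketch, not part of the paper): for $0 \le a_1 < a_2 \le 1$ the factor $a/(1 - at)^2$ is nondecreasing in $a$ for each fixed $t \in [0,1)$; the particular values $a_1 = 0.3$ and $a_2 = 0.8$ are assumptions.

```python
import numpy as np

# Sketch: verify a2/(1 - a2*t)**2 >= a1/(1 - a1*t)**2 for 0 <= a1 < a2 <= 1,
# which is equivalent to (a2 - a1)*(1 - a1*a2*t**2) >= 0.
def mixed_partial_factor(a, t):
    return a / (1 - a * t) ** 2

a1, a2 = 0.3, 0.8  # illustrative parameter values
ts = np.linspace(0.0, 0.999, 200)
ordered = all(mixed_partial_factor(a2, t) >= mixed_partial_factor(a1, t)
              for t in ts)
print(ordered)
```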


3. Inequalities and association for discrete variables

In this section we examine association for discrete random variables and obtain useful inequalities and connections between the various notions in ordinal contingency tables. First consider the following set of constraints on a $2 \times c$ contingency table:

$$\frac{p_{1j}}{p_{1,j+1}} \ge \frac{p_{2j}}{p_{2,j+1}}, \quad j = 1, 2, \ldots, c-1, \qquad (3.1)$$

$$\frac{p_{1j}}{p_{+j}} \ge \frac{p_{1,j+1}}{p_{+,j+1}}, \quad j = 1, 2, \ldots, c-1, \qquad (3.2)$$

$$\frac{p_{1,k}}{\sum_{j=k}^{c} p_{1,j}} \ge \frac{p_{2,k}}{\sum_{j=k}^{c} p_{2,j}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.3)$$

$$\frac{\sum_{j=1}^{k} p_{1,j}}{\sum_{j=k+1}^{c} p_{1,j}} \ge \frac{\sum_{j=1}^{k} p_{2,j}}{\sum_{j=k+1}^{c} p_{2,j}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.4)$$

$$\frac{\sum_{j=1}^{k} p_{1,j}}{p_{1,k+1}} \ge \frac{\sum_{j=1}^{k} p_{2,j}}{p_{2,k+1}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.5)$$

$$\frac{\sum_{j=1}^{k+1} p_{1,j}}{p_{1,k+1}} \ge \frac{\sum_{j=1}^{k+1} p_{2,j}}{p_{2,k+1}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.6)$$

$$\frac{\sum_{j=1}^{k} p_{1,j}}{\sum_{j=1}^{k+1} p_{1,j}} \ge \frac{\sum_{j=1}^{k} p_{2,j}}{\sum_{j=1}^{k+1} p_{2,j}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.7)$$

$$\frac{p_{1,k}}{\sum_{j=k+1}^{c} p_{1,j}} \ge \frac{p_{2,k}}{\sum_{j=k+1}^{c} p_{2,j}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.8)$$

$$\frac{\sum_{j=k}^{c} p_{1,j}}{\sum_{j=k+1}^{c} p_{1,j}} \ge \frac{\sum_{j=k}^{c} p_{2,j}}{\sum_{j=k+1}^{c} p_{2,j}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.9)$$

$$\frac{\sum_{j=1}^{k} p_{1,j}}{p_{1+}} \ge \frac{\sum_{j=1}^{k} p_{2,j}}{p_{2+}}, \quad k = 1, 2, \ldots, c-1, \qquad (3.10)$$

$$\frac{\sum_{j=1}^{k} p_{1,j}}{\sum_{j=k+1}^{c} p_{1,j}} \ge \frac{\sum_{j=1}^{k} p_{2,j}}{\sum_{j=k+1}^{c} p_{2,j}}, \quad k = 1, 2, \ldots, c-1. \qquad (3.11)$$
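For concreteness (a sketch that is not part of the paper), constraints (3.1), (3.3) and (3.5) can be written as predicates on a 2-row table of cell probabilities; the sample table is a made-up example on which all three hold.

```python
# Sketch: constraints (3.1), (3.3) and (3.5) on a 2 x c table p,
# written with cross-multiplication to avoid division by zero.
def c_3_1(p):
    c = len(p[0])
    return all(p[0][j] * p[1][j + 1] >= p[1][j] * p[0][j + 1]
               for j in range(c - 1))

def c_3_3(p):  # HPD for a 2 x c table
    c = len(p[0])
    return all(p[0][k] * sum(p[1][k:]) >= p[1][k] * sum(p[0][k:])
               for k in range(c - 1))

def c_3_5(p):  # LPD for a 2 x c table
    c = len(p[0])
    return all(sum(p[0][:k + 1]) * p[1][k + 1] >= sum(p[1][:k + 1]) * p[0][k + 1]
               for k in range(c - 1))

p = [[0.3, 0.2, 0.1], [0.1, 0.1, 0.2]]  # hypothetical 2 x 3 table
print(c_3_1(p), c_3_3(p), c_3_5(p))
```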

Remark 3.1
(i) (3.1) is equivalent to (3.2).
(ii) (3.3) defines HPD for a $2 \times c$ contingency table and is equivalent to (3.8), which is equivalent to (3.9).
(iii) (3.5) defines LPD for a $2 \times c$ contingency table and is equivalent to (3.6), which is equivalent to (3.7).


(iv) (3.10) is equivalent to (3.11).
(v) (3.1) implies (3.8), which implies (3.11).

For an $r \times c$ contingency table, (3.1) becomes

$$\frac{p_{i,j}}{p_{i,j+1}} \ge \frac{p_{i+1,j}}{p_{i+1,j+1}} \qquad (3.12)$$

for $i = 1, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$. For an $r \times c$ contingency table, we have the following set of definitions:

$$\frac{p_{h,k}}{\sum_{j=k+1}^{c} p_{h,j}} \ge \frac{p_{h+1,k}}{\sum_{j=k+1}^{c} p_{h+1,j}} \qquad (3.13)$$

for $h = 1, 2, \ldots, r-1$ and $k = 1, 2, \ldots, c-1$;

$$\frac{\sum_{j=1}^{k} p_{h,j}}{\sum_{j=k+1}^{c} p_{h,j}} \ge \frac{\sum_{j=1}^{k} p_{h+1,j}}{\sum_{j=k+1}^{c} p_{h+1,j}} \qquad (3.14)$$

for $h = 1, 2, \ldots, r-1$ and $k = 1, 2, \ldots, c-1$. The inequalities in (3.14) define regression dependence, which is equivalent to stochastic ordering of the conditional distributions in the rows.

$$p_{h,k} \sum_{i=h+1}^{r} \sum_{j=k+1}^{c} p_{i,j} \ge \sum_{i=h+1}^{r} p_{i,k} \sum_{j=k+1}^{c} p_{h,j} \qquad (3.15)$$

for $h = 1, 2, \ldots, r-1$ and $k = 1, 2, \ldots, c-1$ expresses HPD;

$$p_{h+1,k+1} \sum_{i=1}^{h} \sum_{j=1}^{k} p_{i,j} \ge \sum_{i=1}^{h} p_{i,k+1} \sum_{j=1}^{k} p_{h+1,j} \qquad (3.16)$$

for $h = 1, 2, \ldots, r-1$ and $k = 1, 2, \ldots, c-1$ expresses LPD;

$$\sum_{i=1}^{h} p_{i,k} \sum_{i=h+1}^{r} \sum_{j=k+1}^{c} p_{i,j} \ge \sum_{i=h+1}^{r} p_{i,k} \sum_{i=1}^{h} \sum_{j=k+1}^{c} p_{i,j} \qquad (3.17)$$

for $h = 1, 2, \ldots, r-1$ and $k = 1, 2, \ldots, c-1$;

$$\sum_{i=1}^{h} \sum_{j=1}^{k} p_{i,j} \sum_{i=h+1}^{r} \sum_{j=k+1}^{c} p_{i,j} \ge \sum_{i=h+1}^{r} \sum_{j=1}^{k} p_{i,j} \sum_{i=1}^{h} \sum_{j=k+1}^{c} p_{i,j} \qquad (3.18)$$

for $h = 1, 2, \ldots, r-1$ and $k = 1, 2, \ldots, c-1$. Inequality (3.18) expresses positive quadrant dependence (PQD). Finally,

$$p_{i,j}\, p_{i+1,j+1} \ge p_{i+1,j}\, p_{i,j+1} \qquad (3.19)$$


for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$ defines likelihood ratio dependence, where $p_{ij}$ is the probability of an observation falling in the $(i,j)$ cell of the contingency table.

Theorem 3.1. Likelihood ratio dependence implies LPD, and LPD implies PQD.

Proof. By likelihood ratio dependence,
$$p_{i,j}\, p_{i+1,j+1} \ge p_{i,j+1}\, p_{i+1,j}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$. This implies that
$$\sum_{h=1}^{i} \sum_{k=1}^{j} p_{h,k} \cdot p_{i+1,j+1} \ge \sum_{h=1}^{i-1} \sum_{k=1}^{j-1} p_{h,k} \cdot p_{i+1,j+1} + p_{i,j+1}\, p_{i+1,j} \ge \sum_{h=1}^{i} p_{h,j+1} \sum_{k=1}^{j} p_{i+1,k}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$.

To prove that LPD implies PQD, we make use of the following ratio property: if
$$\frac{a}{b} \le \frac{c}{d}, \quad \text{then} \quad \frac{a}{b} \le \frac{a+c}{b+d} \le \frac{c}{d},$$
for $a > 0$, $b > 0$, $c > 0$, $d > 0$. By LPD,
$$\sum_{h=1}^{i} \sum_{k=1}^{j} p_{h,k} \cdot p_{i+1,j+1} \ge \sum_{h=1}^{i} p_{h,j+1} \sum_{k=1}^{j} p_{i+1,k} \qquad (3.20)$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$. Let $C_k = \sum_{h=1}^{i} p_{h,k}$. Then (3.20) becomes
$$\sum_{k=1}^{j} C_k \cdot p_{i+1,j+1} \ge C_{j+1} \sum_{k=1}^{j} p_{i+1,k}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$;
$$\iff \frac{\sum_{k=1}^{j} C_k}{\sum_{k=1}^{j} p_{i+1,k}} \ge \frac{C_{j+1}}{p_{i+1,j+1}}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$,
$$\Rightarrow \frac{\sum_{k=1}^{j} C_k}{\sum_{k=1}^{j} p_{i+1,k}} \ge \frac{\sum_{k=1}^{j+1} C_k}{\sum_{k=1}^{j+1} p_{i+1,k}}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$,
$$\Rightarrow \frac{\sum_{k=1}^{j} C_k}{\sum_{k=1}^{j} p_{i+1,k}} \ge \frac{\sum_{h=1}^{i} p_{h+}}{p_{(i+1)+}}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$. Let
$$r_1 = \sum_{k=1}^{j} C_k \Big/ \sum_{k=1}^{j} p_{i+1,k} \quad \text{and} \quad r_2 = \sum_{h=1}^{i} p_{h+} \Big/ p_{(i+1)+}.$$
Then
$$\sum_{k=1}^{j} C_k = r_1 \sum_{k=1}^{j} p_{i+1,k}, \qquad \sum_{h=1}^{i} p_{h+} = r_2\, p_{(i+1)+},$$
and
$$\sum_{h=1}^{i} p_{h+} - \sum_{k=1}^{j} C_k = \sum_{k=j+1}^{c} C_k = \sum_{h=1}^{i} \sum_{k=j+1}^{c} p_{h,k} = r_2\, p_{(i+1)+} - r_1 \sum_{k=1}^{j} p_{i+1,k} \le r_2 \Big( p_{(i+1)+} - \sum_{k=1}^{j} p_{i+1,k} \Big) = r_2 \sum_{k=j+1}^{c} p_{i+1,k}.$$
Hence
$$\frac{\sum_{h=1}^{i} \sum_{k=j+1}^{c} p_{h,k}}{\sum_{k=j+1}^{c} p_{i+1,k}} \le r_2 \le r_1 = \frac{\sum_{h=1}^{i} \sum_{k=1}^{j} p_{h,k}}{\sum_{k=1}^{j} p_{i+1,k}}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$, and
$$\frac{\sum_{k=1}^{j} p_{+k}}{\sum_{k=j+1}^{c} p_{+k}} \le \cdots \le \frac{\sum_{h=1}^{i+1} \sum_{k=1}^{j} p_{h,k}}{\sum_{h=1}^{i+1} \sum_{k=j+1}^{c} p_{h,k}} \le \frac{\sum_{h=1}^{i} \sum_{k=1}^{j} p_{h,k}}{\sum_{h=1}^{i} \sum_{k=j+1}^{c} p_{h,k}}.$$
So that
$$\frac{\sum_{h=1}^{i} \sum_{k=1}^{j} p_{h,k}}{\sum_{k=1}^{j} p_{+k}} \ge \frac{\sum_{h=1}^{i} \sum_{k=j+1}^{c} p_{h,k}}{\sum_{k=j+1}^{c} p_{+k}}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$. Hence
$$\frac{\sum_{h=1}^{i} \sum_{k=1}^{j} p_{h,k}}{\sum_{k=1}^{j} p_{+k}} \ge \frac{\sum_{h=1}^{i} \sum_{k=1}^{c} p_{h,k}}{\sum_{k=1}^{c} p_{+k}} = \sum_{h=1}^{i} p_{h+} \ge \frac{\sum_{h=1}^{i} \sum_{k=j+1}^{c} p_{h,k}}{\sum_{k=j+1}^{c} p_{+k}}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$. Consequently,
$$\sum_{h=1}^{i} \sum_{k=1}^{j} p_{h,k} \sum_{h=i+1}^{r} \sum_{k=j+1}^{c} p_{h,k} \ge \sum_{h=1}^{i} \sum_{k=j+1}^{c} p_{h,k} \sum_{h=i+1}^{r} \sum_{k=1}^{j} p_{h,k}$$
for $i = 1, 2, \ldots, r-1$ and $j = 1, 2, \ldots, c-1$. □

Corollary 3.1. Likelihood ratio dependence implies HPD, which implies PQD.

Proof. The proof is similar to that of Theorem 3.1. □

Remark 3.2. The converses of Theorem 3.1 and Corollary 3.1 do not hold. Summarizing, we have the following implications:

(A) (3.1) ⇔ (3.2); (3.1) ⇒ (3.3); (3.3) ⇔ (3.8) ⇔ (3.9) ⇔ (3.11) ⇔ (3.10).
(B) (3.5) ⇔ (3.6) ⇔ (3.7).
(C) (3.19) ⇒ (3.13) ⇒ (3.14) ⇒ (3.18); (3.19) ⇒ (3.15) ⇒ (3.17) ⇒ (3.18); (3.13) ⇒ (3.15) ⇒ (3.17) ⇒ (3.18); (3.19) ⇒ (3.16) ⇒ (3.18).
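The chain in Theorem 3.1 can be exercised on a concrete table (an illustrative sketch, not part of the paper); the table below is the one from the proof of Proposition 2.1(iii), which is likelihood ratio dependent, and the predicates encode (3.19), (3.16) and (3.18).

```python
# Sketch: likelihood ratio dependence => LPD => PQD, checked on the
# 3 x 3 table from the proof of Proposition 2.1(iii).
def lr_dep(p):  # inequality (3.19)
    r, c = len(p), len(p[0])
    return all(p[i][j] * p[i + 1][j + 1] >= p[i + 1][j] * p[i][j + 1]
               for i in range(r - 1) for j in range(c - 1))

def lpd(p):  # inequality (3.16)
    r, c = len(p), len(p[0])
    return all(
        p[h + 1][k + 1] * sum(p[i][j] for i in range(h + 1) for j in range(k + 1))
        >= sum(p[i][k + 1] for i in range(h + 1))
        * sum(p[h + 1][j] for j in range(k + 1))
        for h in range(r - 1) for k in range(c - 1)
    )

def pqd(p):  # inequality (3.18)
    r, c = len(p), len(p[0])
    def box(rows, cols):
        return sum(p[i][j] for i in rows for j in cols)
    return all(
        box(range(h + 1), range(k + 1)) * box(range(h + 1, r), range(k + 1, c))
        >= box(range(h + 1, r), range(k + 1)) * box(range(h + 1), range(k + 1, c))
        for h in range(r - 1) for k in range(c - 1)
    )

p = [[0.2, 0.0, 0.0], [0.4, 0.1, 0.0], [0.0, 0.2, 0.1]]
print(lr_dep(p), lpd(p), pqd(p))  # all three hold for this table
```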

4. Examples and applications

In this section we present some examples of families of distributions satisfying the notions of dependence and association discussed in this paper.

Example 4.1 (Farlie–Gumbel–Morgenstern distribution [3]). Consider the family of bivariate distributions
$$H(x,y) = F(x)G(y)\big[1 + a(1 - F(x))(1 - G(y))\big],$$
where $|a| \le 1$ and $F(x)$ and $G(y)$ are continuous distribution functions. This bivariate family of distribution functions is locally positive dependent if and only if $0 \le a \le 1$.


Since
$$\ln H(x,y) = \ln F(x) + \ln G(y) + \ln\big[1 + a(1 - F(x))(1 - G(y))\big],$$
we have
$$\frac{\partial^2 \ln H(x,y)}{\partial x\, \partial y} = \frac{\partial}{\partial x}\Big(\frac{\partial}{\partial y} \ln H(x,y)\Big) = \frac{\partial}{\partial x}\Big(\frac{g(y)}{G(y)} - \frac{a(1 - F(x))g(y)}{1 + a(1 - F(x))(1 - G(y))}\Big)$$
$$= \frac{[1 + a(1 - F(x))(1 - G(y))]\,\big(a g(y) f(x)\big) - \big[a(1 - F(x))g(y)\big]\big[a(1 - G(y)) f(x)\big]}{[1 + a(1 - F(x))(1 - G(y))]^2} \ge 0$$
$$\iff a f(x)g(y) + a^2(1 - F(x))(1 - G(y))f(x)g(y) - a^2(1 - F(x))(1 - G(y))f(x)g(y) \ge 0$$
$$\iff a f(x) g(y) \ge 0.$$

Since $f$ and $g$ are probability density functions, $f(x) \ge 0$ for all $x$ and $g(y) \ge 0$ for all $y$, and it follows that
$$\frac{\partial^2 \ln H(x,y)}{\partial x\, \partial y} \ge 0 \iff a \ge 0.$$

Example 4.2 (LPD bivariate mixture). Let $(X_i, Y_i)$, $i = 1, 2$, be independent pairs of random variables with finite expectations and densities $f_i(x)$ and $g_i(y)$, $i = 1, 2$. Let $(X,Y)$ be a mixture of $(X_1, Y_1)$ and $(X_2, Y_2)$ with density
$$h(x,y) = p f_1(x) g_1(y) + (1-p) f_2(x) g_2(y), \quad 0 < p < 1.$$
The pair $(X,Y)$ equals the independent pair $(X_1, Y_1)$ with probability $p$ and the independent pair $(X_2, Y_2)$ with probability $1-p$. The mixture density $h(x,y)$, or the random variables $X$ and $Y$, are LPD if and only if
$$[f_2(x) - f_1(x)][g_2(y) - g_1(y)] \ge 0 \quad \text{with probability } 1.$$
By definition, $X$ and $Y$ are LPD if and only if
$$H(x,y)\, h(x,y) \ge \int_{-\infty}^{x} h(u,y)\,\mathrm{d}u \int_{-\infty}^{y} h(x,v)\,\mathrm{d}v.$$
That is,
$$\int_{-\infty}^{x} \int_{-\infty}^{y} \big(p f_1(u) g_1(v) + (1-p) f_2(u) g_2(v)\big)\,\mathrm{d}u\,\mathrm{d}v \; h(x,y) \ge \int_{-\infty}^{x} \big(p f_1(u) g_1(y) + (1-p) f_2(u) g_2(y)\big)\,\mathrm{d}u \cdot \int_{-\infty}^{y} \big(p f_1(x) g_1(v) + (1-p) f_2(x) g_2(v)\big)\,\mathrm{d}v.$$


Since $0 \le F_1, F_2, G_1, G_2 \le 1$, we have
$$f_2(x) g_2(y) + f_1(x) g_1(y) - f_2(x) g_1(y) - f_1(x) g_2(y) \ge 0.$$
Consequently,
$$(f_2(x) - f_1(x))(g_2(y) - g_1(y)) \ge 0.$$

Example 4.3. Let $X_1$, $X_2$, $X_3$ be absolutely continuous random variables. Then the vectors

(i) $(\max(X_1, X_2),\, \min(X_1, X_2, X_3))$
(ii) $(\max(X_1, X_2, X_3),\, \min(X_1, X_2))$

are LPD. Let $Y_1 = \max(X_1, X_2)$ and $Y_2 = \min(X_1, X_2, X_3)$. Then $(Y_1, Y_2)$ is in $P(3,3)$, which implies $(Y_1, Y_2)$ is LPD.
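A quick simulation can sanity-check a weaker consequence of the claim in Example 4.3 (an illustration, not part of the paper): empirical positive quadrant dependence of $(\max(X_1, X_2), \min(X_1, X_2, X_3))$ at a test point, assuming iid Uniform(0,1) inputs; the sample size, seed and test point are all assumptions.

```python
import numpy as np

# Monte Carlo sketch: empirical quadrant dependence of
# (max(X1, X2), min(X1, X2, X3)) for iid Uniform(0,1) inputs.
rng = np.random.default_rng(0)
x = rng.random((200_000, 3))
y1 = np.maximum(x[:, 0], x[:, 1])
y2 = x.min(axis=1)

s, t = 0.7, 0.3  # illustrative test point
joint = np.mean((y1 <= s) & (y2 <= t))
prod = np.mean(y1 <= s) * np.mean(y2 <= t)
print(joint >= prod)
```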

References

[1] M. Ali, N. Mikhail, M. Haq, A class of bivariate distributions including the bivariate logistic, J. Multivar. Anal. 8 (1978) 405–412.
[2] J.D. Esary, F. Proschan, Relationships among some concepts of bivariate dependence, Ann. Math. Statist. 43 (1972) 651–655.
[3] D.J. Farlie, The performance of some correlation coefficients for a general bivariate distribution, Biometrika 47 (1960) 307–323.
[4] K. Jogdeo, Concepts of dependence, in: S. Kotz, N.L. Johnson (Eds.), Encyclopedia of Statistical Sciences, vol. 2, Wiley, New York, 1982, pp. 324–334.
[5] S. Karlin, Total Positivity, vol. I, Stanford University Press, Stanford, 1968.
[6] M. Shaked, A concept of positive dependence for exchangeable random variables, Ann. Statist. 5 (1979) 505–515.
[7] M. Shaked, J.G. Shanthikumar, Stochastic Orders and Their Applications, Academic Press, New York, 1994.
[8] T. Yanagimoto, Families of positively dependent random variables, Ann. Inst. Statist. Math. 24 (1972) 559–573.