Microelectron. Reliab., Vol. 33, No. 8, pp. 1113-1117, 1993.

0026-2714/93 $6.00 + .00 © 1993 Pergamon Press Ltd

Printed in Great Britain.

BAYESIAN ANALYSIS OF THE MIXTURE OF EXPONENTIAL FAILURE DISTRIBUTIONS

NAND KISHORE SINGH and SAMIR K. BHATTACHARYA

Department of Mathematics and Statistics, Allahabad University, Allahabad 211 002, India

(Received 11 March 1992)

Abstract--In this paper, a Bayesian reliability analysis is carried out for the mixture of two exponential failure distributions on the basis of failure time data x_1 < x_2 < \cdots < x_r, for a preassigned r, and (n - r) survivors from this mixed failure model. Under the assumptions of the squared error loss function (SELF) and suitable prior densities on the underlying parameter space, exact results for the Bayes estimators of the mean life and the reliability function are obtained.

1. INTRODUCTION

Suppose the life-length X in a population of units is a random variable having the p.d.f.

f(x \mid \theta_1, \theta_2) = (p/\theta_1)\,e^{-x/\theta_1} + ((1-p)/\theta_2)\,e^{-x/\theta_2},   (1)

where \theta_1 > 0, \theta_2 > 0, and 0 < p < 1 is the mixing proportion. In the earlier Bayesian treatment of this model [4], it was assumed that the failed units are identified, so that they can be attributed to a specific subpopulation with parameter \theta_i (i = 1, 2). In such a situation, the complete sample considered by them can be divided into two subsamples, and on the basis of these subsamples \theta_1 and \theta_2 can be estimated separately. Thus, in our opinion, the work in Ref. [4] on the mixture of exponential populations is at best artificial. We are also aware of the recent work [5] on the Bayesian analysis of mixtures using certain approximation methods. This paper provides exact results for the Bayesian analysis of the bona fide mixture, that is, for situations wherein the units remain unidentified with respect to the subpopulations even after failure. Moreover, we have treated the more general case of a censored sample and relaxed the assumption that the mixing proportion p is known (see Section 5).

2. PRELIMINARIES AND NOTATION

(a) The mean life and the reliability function for the mixed exponential failure model (1) are given, respectively, by the formulae

\mu = p\theta_1 + (1-p)\theta_2,   (2)

R \equiv R(t) = p\,\exp(-t/\theta_1) + (1-p)\,\exp(-t/\theta_2)   (t > 0).   (3)

(b) As for the method of sampling, n units from (1) are placed on a life test which is terminated when a preassigned number r (0 < r \le n) of failures has been observed, yielding the ordered failure times x_1 \le x_2 \le \cdots \le x_r and (n - r) survivors at the censoring time x_r.
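The model quantities (1)-(3) and the type-II censored sampling scheme just described can be sketched numerically. This is a minimal illustration; the function names, the seed, and the parameter values are our own choices, not the paper's:

```python
import math
import random

def mixture_pdf(x, p, theta1, theta2):
    """Density (1): f(x) = (p/theta1) e^(-x/theta1) + ((1-p)/theta2) e^(-x/theta2)."""
    return (p / theta1) * math.exp(-x / theta1) + ((1 - p) / theta2) * math.exp(-x / theta2)

def mean_life(p, theta1, theta2):
    """Mean life (2): mu = p*theta1 + (1-p)*theta2."""
    return p * theta1 + (1 - p) * theta2

def reliability(t, p, theta1, theta2):
    """Reliability (3): R(t) = p e^(-t/theta1) + (1-p) e^(-t/theta2)."""
    return p * math.exp(-t / theta1) + (1 - p) * math.exp(-t / theta2)

def type2_censored_sample(n, r, p, theta1, theta2, seed=0):
    """Simulate the sampling scheme of (b): put n units on test, stop at the
    r-th failure, and return the ordered failure times x_1 <= ... <= x_r."""
    rng = random.Random(seed)
    lifetimes = sorted(
        rng.expovariate(1.0 / (theta1 if rng.random() < p else theta2))
        for _ in range(n))
    return lifetimes[:r]
```

Note that a unit's subpopulation label is used only inside the simulator and is discarded in the returned data, which is exactly the "bona fide mixture" situation treated in this paper.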

(c) Two cases of prior densities on (\theta_1, \theta_2) are considered.

Case I. For i = 1, 2, \theta_i follows an inverted gamma prior density, that is, IGPD(a_i, p_i), and \theta_1 and \theta_2 are a priori independent, so that the joint prior density is specified by

g(\theta_1, \theta_2) \propto \prod_{i=1}^{2} \theta_i^{-(p_i+1)} \exp(-a_i/\theta_i),   (4)

where 0 < \theta_1 < \infty, 0 < \theta_2 < \infty, and a_1, a_2, p_1, p_2 > 0.

Case II. For i = 1, 2, \theta_i follows a gamma prior density, that is, GPD(\beta_i, \nu_i), and \theta_1 and \theta_2 are a priori independent, so that the joint prior density is given by

g(\theta_1, \theta_2) \propto \prod_{i=1}^{2} \theta_i^{\nu_i - 1} \exp(-\beta_i \theta_i),   (5)

where 0 < \theta_1 < \infty, 0 < \theta_2 < \infty, and \beta_1, \beta_2, \nu_1, \nu_2 > 0.

For our calculations, we shall need the following results.

(i) \int_0^\infty y^{-A}\, e^{-B/y}\, dy = \Gamma(A-1)/B^{A-1}   (A > 1, B > 0).   (6)

(ii) \int_0^\infty y^{\nu-1}\, e^{-zy - u/y}\, dy = 2(u/z)^{\nu/2} K_\nu(2\sqrt{uz}),   (7)

provided that Re z > 0, Re u > 0, where K_\nu(\cdot) is a modified Bessel function (cf. Ref. [7], p. 951 and p. 340, formula 9).

(iii) For any real b > 0,

\prod_{i=1}^{r} [1 + b \exp\{-x_i(1/\theta_2 - 1/\theta_1)\}] = 1 + \sum_{l=1}^{r} b^l \sum^{(l)} \exp\{-(1/\theta_2 - 1/\theta_1) \sum_{s=1}^{l} x_{i_s}\},   (8)

where we have used, for l = 1, 2, \ldots, r, the following symbol for the multiple summation under consideration:

\sum^{(l)} \equiv \sum_{i_1} \sum_{i_2} \cdots \sum_{i_l},   i_1 < i_2 < \cdots < i_l.   (9)

(d) Besides the notation for the multiple summation at (9), we have used some more notation in the paper. The usual statistic referred to as the total time on test (TTT) is defined as

T_r = \sum_{i=1}^{r} x_i + (n-r)x_r,   (10)

in terms of which we have defined, for k = 0, 1, \ldots, (n-r),

U_k = T_r - k x_r.   (11)

We have also used the notations

V_{k,l} = U_k - \sum_{s=1}^{l} x_{i_s},   (12)

W_{k,l} = \sum_{s=1}^{l} x_{i_s} + k x_r,   (13)

where k = 0, 1, \ldots, (n-r) and l = 1, 2, \ldots, r. It is obvious that U_k, V_{k,l} and W_{k,l} are positive.

3. BAYESIAN ESTIMATION FOR CASE I

The likelihood function (LF) corresponding to the available data can be easily obtained as

l(\theta_1, \theta_2) = [n!/(n-r)!] \prod_{i=1}^{r} f(x_i \mid \theta_1, \theta_2) \times [p \exp(-x_r/\theta_1) + (1-p)\exp(-x_r/\theta_2)]^{n-r}

= [n!/(n-r)!]\, p^n \theta_1^{-r} \exp(-T_r/\theta_1) \prod_{i=1}^{r} \left[1 + \frac{1-p}{p}\,\frac{\theta_1}{\theta_2}\, \exp\{-x_i(1/\theta_2 - 1/\theta_1)\}\right] \times \left[1 + \frac{1-p}{p}\, \exp\{-x_r(1/\theta_2 - 1/\theta_1)\}\right]^{n-r}   (0 < \theta_1, \theta_2 < \infty).   (14)

We use a binomial expansion for the last factor in equation (14) and the algebraic relation (8) for the product term to obtain

l(\theta_1, \theta_2) = [n!/(n-r)!] \sum_{k=0}^{n-r} \binom{n-r}{k} p^{n-k}(1-p)^k \left[\theta_1^{-r}\, e^{-(U_k/\theta_1) - (k x_r/\theta_2)} + \sum_{l=1}^{r} \left(\frac{1-p}{p}\right)^l \sum^{(l)} \theta_1^{-(r-l)} \theta_2^{-l}\, e^{-(V_{k,l}/\theta_1) - (W_{k,l}/\theta_2)}\right]   (0 < \theta_1, \theta_2 < \infty).   (15)

This likelihood function is combined with the prior (4) in the usual manner to obtain the Bayesian posterior density

g^*(\theta_1, \theta_2) = C_1^{-1} \sum_{k=0}^{n-r} \binom{n-r}{k} p^{n-k}(1-p)^k \left[\theta_1^{-(r+p_1+1)}\, e^{-(U_k+a_1)/\theta_1}\, \theta_2^{-(p_2+1)}\, e^{-(k x_r + a_2)/\theta_2} + \sum_{l=1}^{r} \left(\frac{1-p}{p}\right)^l \sum^{(l)} \theta_1^{-(r-l+p_1+1)}\, \theta_2^{-(l+p_2+1)}\, e^{-(V_{k,l}+a_1)/\theta_1 - (W_{k,l}+a_2)/\theta_2}\right],   (16)
where 0 < \theta_1 < \infty, 0 < \theta_2 < \infty, and C_1 is the normalization constant that renders the integral of g^* over \{(\theta_1, \theta_2): 0 < \theta_1, \theta_2 < \infty\} equal to unity. Under the assumption of the SELF, the Bayes estimators \hat\theta_1, \hat\theta_2, \hat\mu and \hat R of \theta_1, \theta_2, \mu and R, and also their posterior variances V_1, V_2, V_\mu and V_R, respectively, can be conveniently expressed in terms of the integral

I(s_1, s_2, m_1, m_2) = \int_0^\infty \int_0^\infty \theta_1^{s_1} \theta_2^{s_2} \exp[-t(m_1/\theta_1 + m_2/\theta_2)]\, [C_1 g^*(\theta_1, \theta_2)]\, d\theta_1\, d\theta_2,   (17)

for s_1, s_2, m_1, m_2 \in \{0, 1, 2\}. We substitute for g^* from (16) in (17) and integrate by using (6) to obtain

I(s_1, s_2, m_1, m_2) = \sum_{k=0}^{n-r} \binom{n-r}{k} p^{n-k}(1-p)^k \left[\frac{\Gamma(r + p_1 - s_1)}{(U_k + a_1 + m_1 t)^{r+p_1-s_1}} \cdot \frac{\Gamma(p_2 - s_2)}{(k x_r + a_2 + m_2 t)^{p_2-s_2}} + \sum_{l=1}^{r} \left(\frac{1-p}{p}\right)^l \sum^{(l)} \frac{\Gamma(r - l - s_1 + p_1)}{(V_{k,l} + a_1 + m_1 t)^{r-l-s_1+p_1}} \cdot \frac{\Gamma(l - s_2 + p_2)}{(W_{k,l} + a_2 + m_2 t)^{l-s_2+p_2}}\right]   (18)

whenever the gamma function \Gamma(\cdot) is defined, that is, its argument is positive. Now the final results for the Bayes estimators and the desired posterior variances can be obtained as

C_1 = I(0, 0, 0, 0),   (19)

\hat\theta_1 = C_1^{-1} I(1, 0, 0, 0),   (20)

\hat\theta_2 = C_1^{-1} I(0, 1, 0, 0),   (21)

\hat\mu = C_1^{-1}[p I(1, 0, 0, 0) + (1-p) I(0, 1, 0, 0)],   (22)

\hat R = C_1^{-1}[p I(0, 0, 1, 0) + (1-p) I(0, 0, 0, 1)],   (23)

V_1 = C_1^{-1} I(2, 0, 0, 0) - (\hat\theta_1)^2,   (24)

V_2 = C_1^{-1} I(0, 2, 0, 0) - (\hat\theta_2)^2,   (25)

V_\mu = C_1^{-1}[p^2 I(2, 0, 0, 0) + 2p(1-p) I(1, 1, 0, 0) + (1-p)^2 I(0, 2, 0, 0)] - (\hat\mu)^2,   (26)

V_R = C_1^{-1}[p^2 I(0, 0, 2, 0) + 2p(1-p) I(0, 0, 1, 1) + (1-p)^2 I(0, 0, 0, 2)] - (\hat R)^2.   (27)
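Since (18) involves only gamma functions and finite sums, the estimators (19)-(23) are directly computable. The following is a minimal sketch (function names and the hyperparameter values in the test are our own; the multiple sum (9) is enumerated by brute force, so it is practical only for small r):

```python
import math
from itertools import combinations

def bayes_case1(x, n, p, a1, p1, a2, p2, t=1.0):
    """Bayes estimators (19)-(23) for Case I (IGPD priors), via the closed
    form (18).  x = [x_1 <= ... <= x_r] are the observed failure times and
    n - len(x) units survive past x[-1]."""
    r = len(x)
    xr = x[-1]
    Tr = sum(x) + (n - r) * xr                        # TTT, eq. (10)

    def I(s1, s2, m1, m2):                            # eq. (18)
        total = 0.0
        for k in range(n - r + 1):
            Uk = Tr - k * xr                          # eq. (11)
            term = (math.gamma(r + p1 - s1)
                    / (Uk + a1 + m1 * t) ** (r + p1 - s1)
                    * math.gamma(p2 - s2)
                    / (k * xr + a2 + m2 * t) ** (p2 - s2))
            for l in range(1, r + 1):
                c = ((1 - p) / p) ** l
                for idx in combinations(range(r), l):  # multiple sum (9)
                    sx = sum(x[i] for i in idx)
                    Vkl, Wkl = Uk - sx, sx + k * xr    # eqs (12), (13)
                    term += (c * math.gamma(r - l + p1 - s1)
                             / (Vkl + a1 + m1 * t) ** (r - l + p1 - s1)
                             * math.gamma(l + p2 - s2)
                             / (Wkl + a2 + m2 * t) ** (l + p2 - s2))
            total += math.comb(n - r, k) * p ** (n - k) * (1 - p) ** k * term
        return total

    C1 = I(0, 0, 0, 0)                                 # eq. (19)
    th1 = I(1, 0, 0, 0) / C1                           # eq. (20)
    th2 = I(0, 1, 0, 0) / C1                           # eq. (21)
    mu = p * th1 + (1 - p) * th2                       # eq. (22)
    R = (p * I(0, 0, 1, 0) + (1 - p) * I(0, 0, 0, 1)) / C1  # eq. (23)
    return th1, th2, mu, R
```

A useful sanity check is t = 0, where the estimate of R(0) must reduce to 1.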

4. BAYESIAN ESTIMATION FOR CASE II

For this situation, the likelihood function (15) is combined with the prior density (5) to obtain the Bayesian posterior density

g^*(\theta_1, \theta_2) = C_2^{-1} \sum_{k=0}^{n-r} \binom{n-r}{k} p^{n-k}(1-p)^k \left[\theta_1^{\nu_1 - r - 1}\, \theta_2^{\nu_2 - 1} \exp\{-(U_k/\theta_1 + k x_r/\theta_2 + \beta_1\theta_1 + \beta_2\theta_2)\} + \sum_{l=1}^{r} \left(\frac{1-p}{p}\right)^l \sum^{(l)} \theta_1^{\nu_1 + l - r - 1}\, \theta_2^{\nu_2 - l - 1} \exp\{-(V_{k,l}/\theta_1 + W_{k,l}/\theta_2 + \beta_1\theta_1 + \beta_2\theta_2)\}\right],   (28)

where 0 < \theta_1 < \infty, 0 < \theta_2 < \infty, and C_2 is the normalization constant making g^* a p.d.f. Again our estimators and the desired posterior variances can be conveniently expressed in terms of the integral

J(s_1, s_2, m_1, m_2) = \int_0^\infty \int_0^\infty \theta_1^{s_1} \theta_2^{s_2} \exp[-t(m_1/\theta_1 + m_2/\theta_2)]\, [C_2 g^*(\theta_1, \theta_2)]\, d\theta_1\, d\theta_2,   (29)

for s_1, s_2, m_1, m_2 \in \{0, 1, 2\}. We substitute for g^* from (28) in (29) and integrate by using (7) to obtain

J(s_1, s_2, m_1, m_2) = 4 \sum_{k=0}^{n-r} \binom{n-r}{k} p^{n-k}(1-p)^k \left[\left(\frac{U_k + m_1 t}{\beta_1}\right)^{(\nu_1+s_1-r)/2} K_{\nu_1+s_1-r}(2\sqrt{\beta_1[U_k + m_1 t]}) \left(\frac{k x_r + m_2 t}{\beta_2}\right)^{(\nu_2+s_2)/2} K_{\nu_2+s_2}(2\sqrt{\beta_2[k x_r + m_2 t]}) + \sum_{l=1}^{r} \left(\frac{1-p}{p}\right)^l \sum^{(l)} \left(\frac{V_{k,l} + m_1 t}{\beta_1}\right)^{(\nu_1+s_1+l-r)/2} K_{\nu_1+s_1+l-r}(2\sqrt{\beta_1[V_{k,l} + m_1 t]}) \left(\frac{W_{k,l} + m_2 t}{\beta_2}\right)^{(\nu_2+s_2-l)/2} K_{\nu_2+s_2-l}(2\sqrt{\beta_2[W_{k,l} + m_2 t]})\right].   (30)
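The closed form (30) rests entirely on the Bessel-integral identity (7). Both sides of (7) can be cross-checked numerically with simple trapezoidal quadrature using only the standard library (the step counts and truncation limits below are ad hoc choices of ours):

```python
import math

def modified_bessel_k(nu, x, steps=4000, tmax=30.0):
    """K_nu(x) from the integral representation
    K_nu(x) = int_0^inf exp(-x cosh t) cosh(nu t) dt  (trapezoidal rule)."""
    h = tmax / steps
    total = 0.5 * math.exp(-x)            # t = 0 endpoint, half weight
    for i in range(1, steps + 1):
        t = i * h
        e = -x * math.cosh(t)
        if e > -700.0:                    # skip underflowing contributions
            total += math.exp(e) * math.cosh(nu * t)
    return total * h

def lhs_of_7(nu, z, u, steps=4000, smax=30.0):
    """Left side of (7) after the substitution y = e^s, which maps the
    integral onto the whole real line (trapezoidal rule)."""
    h = 2.0 * smax / steps
    total = 0.0
    for i in range(steps + 1):
        s = -smax + i * h
        e = nu * s - z * math.exp(s) - u * math.exp(-s)
        if e > -700.0:
            total += math.exp(e)
    return total * h

def rhs_of_7(nu, z, u):
    """Right side of (7): 2 (u/z)^(nu/2) K_nu(2 sqrt(uz))."""
    return 2.0 * (u / z) ** (nu / 2.0) * modified_bessel_k(nu, 2.0 * math.sqrt(u * z))
```

The substitution y = e^s makes the integrand decay double-exponentially in both directions, so the trapezoidal rule converges very fast; negative orders, which occur in (30) whenever \nu_1 + s_1 < r, are handled automatically since K_{-\nu} = K_\nu.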

Now the final results for the Bayes estimators and the desired posterior variances, using the same notations with double hats for estimators and primes for variances, can be obtained as

C_2 = J(0, 0, 0, 0),   (31)

\hat{\hat\theta}_1 = C_2^{-1} J(1, 0, 0, 0),   (32)

\hat{\hat\theta}_2 = C_2^{-1} J(0, 1, 0, 0),   (33)

\hat{\hat\mu} = C_2^{-1}[p J(1, 0, 0, 0) + (1-p) J(0, 1, 0, 0)],   (34)


\hat{\hat R} = C_2^{-1}[p J(0, 0, 1, 0) + (1-p) J(0, 0, 0, 1)],   (35)

V_1' = C_2^{-1} J(2, 0, 0, 0) - (\hat{\hat\theta}_1)^2,   (36)

V_2' = C_2^{-1} J(0, 2, 0, 0) - (\hat{\hat\theta}_2)^2,   (37)

V_\mu' = C_2^{-1}[p^2 J(2, 0, 0, 0) + 2p(1-p) J(1, 1, 0, 0) + (1-p)^2 J(0, 2, 0, 0)] - (\hat{\hat\mu})^2,   (38)

V_R' = C_2^{-1}[p^2 J(0, 0, 2, 0) + 2p(1-p) J(0, 0, 1, 1) + (1-p)^2 J(0, 0, 0, 2)] - (\hat{\hat R})^2.   (39)

5. ESTIMATION WHEN p IS UNKNOWN

Until now, we have assumed that the mixing proportion p was known, and our final results depended on this known value of p. In this section, we relax this assumption and indicate briefly how to handle the problem of estimation when p is also unknown. In such a situation, we have to assume a suitable prior distribution for p. Towards this end, let us consider a beta prior specified by the p.d.f.

g(p) \propto p^{\gamma-1}(1-p)^{\delta-1}   (0 \le p \le 1;\ \gamma, \delta > 0).   (40)

We also assume that p is a priori independent of (\theta_1, \theta_2), which has the joint prior p.d.f. (4). Hence, the vector (\theta_1, \theta_2, p) has the joint prior p.d.f.

g(\theta_1, \theta_2, p) \propto \left[\prod_{i=1}^{2} \theta_i^{-(p_i+1)} \exp(-a_i/\theta_i)\right] p^{\gamma-1}(1-p)^{\delta-1},   (41)

where 0 < \theta_1 < \infty, 0 < \theta_2 < \infty, 0 \le p \le 1. The likelihood function is, as in (15),

l(\theta_1, \theta_2, p) \propto \sum_{k=0}^{n-r} \binom{n-r}{k} p^{n-k}(1-p)^k \left[\theta_1^{-r} \exp\left(-\frac{U_k}{\theta_1} - \frac{k x_r}{\theta_2}\right) + \sum_{l=1}^{r} \left(\frac{1-p}{p}\right)^l \sum^{(l)} \theta_1^{-(r-l)}\, \theta_2^{-l} \exp\left(-\frac{V_{k,l}}{\theta_1} - \frac{W_{k,l}}{\theta_2}\right)\right]   (0 < \theta_1, \theta_2 < \infty,\ 0 \le p \le 1).   (42)

Now this likelihood function is combined with the prior (41) to obtain the following Bayesian posterior density

g^*(\theta_1, \theta_2, p) = C_3^{-1} \sum_{k=0}^{n-r} \binom{n-r}{k} \left[p^{n-k+\gamma-1}(1-p)^{k+\delta-1}\, \theta_1^{-(r+p_1+1)}\, \theta_2^{-(p_2+1)} \exp[-(U_k + a_1)/\theta_1 - (k x_r + a_2)/\theta_2] + \sum_{l=1}^{r} \sum^{(l)} p^{n-k-l+\gamma-1}(1-p)^{k+l+\delta-1}\, \theta_1^{-(r-l+p_1+1)}\, \theta_2^{-(l+p_2+1)} \exp[-(V_{k,l} + a_1)/\theta_1 - (W_{k,l} + a_2)/\theta_2]\right],   (43)

where 0 < \theta_1 < \infty, 0 < \theta_2 < \infty, 0 \le p \le 1, and C_3 is the normalization constant. The required Bayes estimators can be conveniently expressed in terms of the integral

\mathcal{I}(s_1, s_2, m_1, m_2, j) = \int_0^\infty \int_0^\infty \int_0^1 \theta_1^{s_1} \theta_2^{s_2} \exp[-t(m_1/\theta_1 + m_2/\theta_2)]\, p^j (1-p)^{1-j}\, [C_3 g^*(\theta_1, \theta_2, p)]\, d\theta_1\, d\theta_2\, dp,   (44)

where s_1, s_2, m_1, m_2, j \in \{0, 1\}. We substitute for g^* from (43) into (44) and integrate by using (6) to obtain

\mathcal{I}(s_1, s_2, m_1, m_2, j) = \sum_{k=0}^{n-r} \binom{n-r}{k} \left[\frac{B(n-k+\gamma+j,\ k+\delta-j+1)\, \Gamma(r - s_1 + p_1)\, \Gamma(p_2 - s_2)}{(U_k + a_1 + m_1 t)^{r-s_1+p_1}\, (k x_r + a_2 + m_2 t)^{p_2-s_2}} + \sum_{l=1}^{r} \sum^{(l)} \frac{B(n-k-l+\gamma+j,\ k+l+\delta-j+1)\, \Gamma(r - s_1 - l + p_1)\, \Gamma(p_2 - s_2 + l)}{(V_{k,l} + a_1 + m_1 t)^{r-s_1-l+p_1}\, (W_{k,l} + a_2 + m_2 t)^{p_2-s_2+l}}\right],   (45)

where B(\cdot, \cdot) denotes the beta function. A separate integration using (6) yields the normalization constant:

C_3 = \sum_{k=0}^{n-r} \binom{n-r}{k} \left[\frac{B(n-k+\gamma,\ k+\delta)\, \Gamma(r+p_1)\, \Gamma(p_2)}{(U_k + a_1)^{r+p_1}\, (k x_r + a_2)^{p_2}} + \sum_{l=1}^{r} \sum^{(l)} \frac{B(n-k-l+\gamma,\ k+l+\delta)\, \Gamma(r-l+p_1)\, \Gamma(p_2+l)}{(V_{k,l} + a_1)^{r-l+p_1}\, (W_{k,l} + a_2)^{p_2+l}}\right].   (46)

Under the assumption of the SELF, the Bayes estimator \mu^* of the mean life \mu, and the Bayes estimator R^* of the reliability function R, can now easily be obtained in terms of the integral \mathcal{I} at (44). The final results are given below:

\mu^* = C_3^{-1}[\mathcal{I}(1, 0, 0, 0, 1) + \mathcal{I}(0, 1, 0, 0, 0)],   (47)

R^* = C_3^{-1}[\mathcal{I}(0, 0, 1, 0, 1) + \mathcal{I}(0, 0, 0, 1, 0)].   (48)
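The estimators (47) and (48) again involve only beta and gamma functions and finite sums, so they are directly computable. A minimal sketch follows (our own names; the hyperparameter values in the test are illustrative). It exploits the identity B(a+1, b) + B(a, b+1) = B(a, b), by which the constant C_3 of (46) equals \mathcal{I}(0,0,0,0,1) + \mathcal{I}(0,0,0,0,0):

```python
import math
from itertools import combinations

def lbeta(a, b):
    """log of the beta function B(a, b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def bayes_unknown_p(x, n, a1, p1, a2, p2, gam, delta, t=1.0):
    """Bayes estimators (47)-(48) of mu and R(t) when p has a beta(gam, delta)
    prior, via the closed form (45).  x = [x_1 <= ... <= x_r]."""
    r = len(x)
    xr = x[-1]
    Tr = sum(x) + (n - r) * xr                       # TTT, eq. (10)

    def script_I(s1, s2, m1, m2, j):                 # eq. (45)
        total = 0.0
        for k in range(n - r + 1):
            Uk = Tr - k * xr                         # eq. (11)
            term = (math.exp(lbeta(n - k + gam + j, k + delta - j + 1))
                    * math.gamma(r - s1 + p1)
                    / (Uk + a1 + m1 * t) ** (r - s1 + p1)
                    * math.gamma(p2 - s2)
                    / (k * xr + a2 + m2 * t) ** (p2 - s2))
            for l in range(1, r + 1):
                for idx in combinations(range(r), l):  # multiple sum (9)
                    sx = sum(x[i] for i in idx)
                    Vkl, Wkl = Uk - sx, sx + k * xr    # eqs (12), (13)
                    term += (math.exp(lbeta(n - k - l + gam + j,
                                            k + l + delta - j + 1))
                             * math.gamma(r - s1 - l + p1)
                             / (Vkl + a1 + m1 * t) ** (r - s1 - l + p1)
                             * math.gamma(p2 - s2 + l)
                             / (Wkl + a2 + m2 * t) ** (p2 - s2 + l))
            total += math.comb(n - r, k) * term
        return total

    C3 = script_I(0, 0, 0, 0, 1) + script_I(0, 0, 0, 0, 0)            # eq. (46)
    mu_star = (script_I(1, 0, 0, 0, 1) + script_I(0, 1, 0, 0, 0)) / C3  # (47)
    R_star = (script_I(0, 0, 1, 0, 1) + script_I(0, 0, 0, 1, 0)) / C3   # (48)
    return mu_star, R_star
```

As in the known-p case, R^* at t = 0 must reduce to 1, which provides a quick correctness check of an implementation.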

REFERENCES

1. B. Epstein and M. Sobel, Life testing, J. Am. Statist. Assoc. 48, 486-502 (1953).
2. W. Mendenhall and R. J. Hader, Estimation of parameters of mixed exponentially distributed failure time distributions from censored failure data, Biometrika 45, 504-520 (1958).
3. G. M. Tallis and R. Light, The use of fractional moments for estimating the parameters of a mixed exponential distribution, Technometrics 10, 161-175 (1968).
4. M. Pandey and S. K. Upadhyay, Bayesian inference in mixtures of two exponentials, Microelectron. Reliab. 28, 217-221 (1988).
5. S. Crawford, An approximate Bayesian analysis of finite mixture distributions, Ph.D. thesis, Department of Statistics, Carnegie Mellon University (1988).
6. S. K. Bhattacharya, Bayesian approach to life testing and reliability estimation, J. Am. Statist. Assoc. 62, 48-62 (1967).
7. I. S. Gradshtein and I. M. Ryzhik, Table of Integrals, Series, and Products (corrected and enlarged edition). Academic Press, New York (1980).