Journal of Statistical Planning and Inference 55 (1996) 23-37

Bayesian prediction bounds for the Burr type XII distribution in the presence of outliers

Essam K. AL-Hussaini, Zeinhum F. Jaheen
Department of Mathematics, University of Assiut, Assiut 71516, Egypt

Received 24 August 1994; revised 14 May 1995
Abstract
Bayesian prediction bounds for some order statistics of future observations from the Burr(c, k) distribution are obtained in the presence of a single outlier arising from a different member of the same family of distributions. Single outliers of types $kk_0$ and $k + k_0$ are considered, and the bivariate prior density for c and k suggested by AL-Hussaini and Jaheen (J. Statist. Comput. Simulation 41 (1992)) is used.
1. Introduction
The two-parameter Burr type XII (Burr(c, k)) distribution, with probability density function (pdf)

$$f(x) = ckx^{c-1}(1+x^c)^{-(k+1)}, \quad x > 0 \ \ (c > 0,\ k > 0), \tag{1}$$

and distribution function

$$F(x) = 1 - (1+x^c)^{-k}, \quad x > 0, \tag{2}$$
was first introduced in the literature by Burr (1942). The Burr(c, k) distribution has been proposed as a lifetime model, and its properties have been studied by Dubey (1972, 1973), Burr and Cislak (1968), Rodriguez (1977), Tadikamalla (1980) and Lewis (1981), among others. Lewis (1981) proposed the use of the Burr(c, k) distribution as a model for accelerated life test data representing times to breakdown of an insulating fluid. Inferences based on the Burr(c, k) distribution and some of its testing measures were made by Papadopoulos (1978), Evans and Ragab (1983), Lingappaiah (1983), AL-Hussaini et al. (1992), Shah and Gokhale (1993) and AL-Hussaini and Jaheen (1992, 1994). Khan and Khan (1987) and AL-Hussaini (1991) characterized the Burr type XII
distribution. Nigm (1988) considered Bayesian prediction bounds for the order statistics in a future sample. He used a bivariate prior that is discrete in c and continuous in k. AL-Hussaini and Jaheen (1995) obtained Bayesian prediction bounds for order statistics based on one sample and a series of M + 1 independent samples of future observations. They used a bivariate prior that is continuous in both variables c and k.

An outlier in a set of data is an observation, or subset of observations, which appears to be inconsistent with the remainder of that set of data (see Barnett and Lewis, 1984).

In this paper, we shall be concerned with the problem of obtaining Bayesian prediction bounds for future observations from the Burr(c, k) population in the presence of outliers, assuming that both of the parameters c and k are unknown. The uncertainty about the true values of (c, k) is measured by the bivariate prior density function suggested by AL-Hussaini and Jaheen (1992).
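For concreteness, (1) and (2), and inverse-CDF sampling from them, can be coded in a few lines. The following Python sketch is our own illustration (the function names are not from the paper); the sampler inverts (2), which is also one way the data of Example 1 below can be simulated.

```python
import numpy as np

def burr_pdf(x, c, k):
    """Burr(c, k) density, Eq. (1): f(x) = c k x^(c-1) (1 + x^c)^(-(k+1))."""
    x = np.asarray(x, dtype=float)
    return c * k * x**(c - 1.0) * (1.0 + x**c)**(-(k + 1.0))

def burr_cdf(x, c, k):
    """Burr(c, k) distribution function, Eq. (2): F(x) = 1 - (1 + x^c)^(-k)."""
    x = np.asarray(x, dtype=float)
    return 1.0 - (1.0 + x**c)**(-k)

def burr_rvs(n, c, k, rng=None):
    """Sample n variates by inverting (2): x = ((1 - u)^(-1/k) - 1)^(1/c)."""
    rng = np.random.default_rng(rng)
    u = rng.uniform(size=n)
    return ((1.0 - u)**(-1.0 / k) - 1.0)**(1.0 / c)
```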
2. Prediction of the future observations in the presence of outliers
Suppose that $X_1, X_2, \ldots, X_n$ is a random sample of size n drawn from a population whose pdf is Burr(c, k), defined by (1), and that $Y_1, Y_2, \ldots, Y_m$ is a second independent random sample (of size m) of future observations from the same distribution. Bayesian prediction bounds are obtained for some order statistics of the future observations $Y_1, Y_2, \ldots, Y_m$ in the presence of a single outlier arising from a different member of the same family of distributions.

For the future sample of size m, let $Y_s$ denote the sth ordered lifetime, $1 \le s \le m$. For a given $\lambda$, the density function of $Y_s$ in the presence of a single outlier is of the form

$$h(y_s \mid \lambda) = D(s)\,[(s-1)F^{s-2}(1-F)^{m-s}F^*f + F^{s-1}(1-F)^{m-s}f^* + (m-s)F^{s-1}(1-F)^{m-s-1}(1-F^*)f], \tag{3}$$

where

$$D(s) = \binom{m-1}{s-1}, \tag{4}$$

$f \equiv f(y_s \mid \lambda)$ and $F \equiv F(y_s \mid \lambda)$ are the density and distribution functions of all y's which are not outliers, while $f^* \equiv f^*(y_s \mid \lambda)$ and $F^* \equiv F^*(y_s \mid \lambda)$ are those of an outlier (see Balakrishnan and Ambagaspitiya, 1988). The functions $f^*$ and $F^*$ are obtained, for the Burr(c, k) model, by replacing the parameter k by $kk_0$ or $k + k_0$, depending on the type of outlier.

2.1. Outliers of type $kk_0$
The density function of $Y_s$, in the presence of a single outlier of type $kk_0$, in the Burr(c, k) case may be obtained by substituting (1) and (2) for f and F in (3). The values of $f^*$ and $F^*$ are those given by (1) and (2) after replacing k by $kk_0$. This density function can be simplified to the form

$$h_1(y_s \mid c, k) = D(s)\,ck\,\phi(y_s; c)(1+y_s^c)^{-k(m-s+1)}\big\{(m+k_0-s)(1+y_s^c)^{-k(k_0-1)}[1-(1+y_s^c)^{-k}]^{s-1} + (s-1)[1-(1+y_s^c)^{-kk_0}][1-(1+y_s^c)^{-k}]^{s-2}\big\}, \quad y_s > 0, \tag{5}$$

where

$$\phi(y_s; c) = \frac{y_s^{c-1}}{1+y_s^c}. \tag{6}$$
By expanding $[1-(1+y_s^c)^{-k}]^{s-1}$ and $[1-(1+y_s^c)^{-k}]^{s-2}$ using the binomial expansion (assuming nonnegative exponents), the density function (5) may be written as

$$h_1(y_s \mid c, k) = D(s)\,ck\,\phi(y_s; c)\Big\{(m+k_0-s)\sum_{j=0}^{s-1}A_{1j}(y_s) + (s-1)\sum_{j=0}^{s-2}A_{2j}(y_s)\Big\}, \quad y_s > 0, \tag{7}$$

where

$$A_{1j}(y_s) = a_{1j}(s)\exp[-k(m+k_0-s+j)\,\psi(y_s; c)], \tag{8}$$

$$A_{2j}(y_s) = a_{2j}(s)\{\exp[-k(m-s+j+1)\,\psi(y_s; c)] - \exp[-k(m+k_0-s+j+1)\,\psi(y_s; c)]\}, \tag{9}$$

$$\psi(y_s; c) = \ln(1+y_s^c), \tag{10}$$

and, for $l = 1, 2$,

$$a_{lj}(s) = (-1)^j\binom{s-l}{j}. \tag{11}$$
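Equations (3), (5) and (7) can be checked numerically. The sketch below (our own illustration, with hypothetical parameter values) builds $h_1$ directly from the outlier-model density (3) and from the expanded form (7), and verifies that the two agree and integrate to one.

```python
import numpy as np
from math import comb
from scipy.integrate import quad

# hypothetical illustrative values
m, s, k0 = 10, 3, 2.0
c, k = 4.0, 3.0

F  = lambda y: 1.0 - (1.0 + y**c)**(-k)                       # Eq. (2)
f  = lambda y: c * k * y**(c - 1) * (1.0 + y**c)**(-(k + 1))  # Eq. (1)
Fo = lambda y: 1.0 - (1.0 + y**c)**(-k * k0)                  # F* with k -> k*k0
fo = lambda y: c * k * k0 * y**(c - 1) * (1.0 + y**c)**(-(k * k0 + 1))
D = comb(m - 1, s - 1)                                        # Eq. (4)

def h1_direct(y):   # Eq. (3) specialized to the kk0 outlier
    return D * ((s - 1) * F(y)**(s - 2) * (1 - F(y))**(m - s) * Fo(y) * f(y)
                + F(y)**(s - 1) * (1 - F(y))**(m - s) * fo(y)
                + (m - s) * F(y)**(s - 1) * (1 - F(y))**(m - s - 1)
                  * (1 - Fo(y)) * f(y))

def h1_expanded(y):  # Eq. (7) with (8)-(11)
    psi = np.log(1.0 + y**c)                                  # Eq. (10)
    phi = y**(c - 1) / (1.0 + y**c)                           # Eq. (6)
    t1 = sum((-1)**j * comb(s - 1, j) * np.exp(-k * (m + k0 - s + j) * psi)
             for j in range(s))
    t2 = sum((-1)**j * comb(s - 2, j) * (np.exp(-k * (m - s + j + 1) * psi)
             - np.exp(-k * (m + k0 - s + j + 1) * psi)) for j in range(s - 1))
    return D * c * k * phi * ((m + k0 - s) * t1 + (s - 1) * t2)

y = 0.7
assert abs(h1_direct(y) - h1_expanded(y)) < 1e-10
print(quad(h1_direct, 0, np.inf)[0])  # approximately 1
```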
For the first r ordered failure times from the first sample of size n obtained from the Burr(c, k) population, the likelihood function is given, using (1) and (2), by

$$L(c, k \mid x) = \frac{n!}{(n-r)!}\,[1-F(x_r)]^{n-r}\prod_{i=1}^{r}f(x_i) = \frac{n!}{(n-r)!}\,c^r k^r v(c; x)\,e^{-kT}, \tag{12}$$

where $x = (x_1, x_2, \ldots, x_r)$,

$$v(c; x) = \prod_{i=1}^{r}\frac{x_i^{c-1}}{1+x_i^c}, \tag{13}$$
and

$$T \equiv T(c; x) = \sum_{i=1}^{r}\ln(1+x_i^c) + (n-r)\ln(1+x_r^c). \tag{14}$$
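The data-dependent quantities v(c; x) in (13) and T(c; x) in (14) are cheap to evaluate; a small sketch (the helper name is ours), computed through log v to avoid underflow for larger r:

```python
import numpy as np

def v_and_T(x, c, n):
    """v(c;x) of Eq. (13) and T(c;x) of Eq. (14), for the first r of n
    ordered failure times x[0] <= ... <= x[r-1] (type-II censored sample)."""
    x = np.asarray(x, dtype=float)
    r = x.size
    log_v = np.sum((c - 1.0) * np.log(x) - np.log1p(x**c))      # log of Eq. (13)
    T = np.sum(np.log1p(x**c)) + (n - r) * np.log1p(x[-1]**c)   # Eq. (14)
    return np.exp(log_v), T
```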
To obtain the joint posterior density of c and k, we use the bivariate prior density suggested by AL-Hussaini and Jaheen (1992), which is given by

$$g(c, k) = A\,c^{\alpha+\delta}k^{\alpha}\exp\Big\{-c\Big[\frac{1}{\beta} + \frac{k}{\gamma}\Big]\Big\}, \quad c > 0,\ k > 0, \tag{15}$$

where $\alpha > -1$; $\beta$, $\gamma$ and $\delta$ are positive real numbers; and $A^{-1} = \Gamma(\delta)\Gamma(\alpha+1)\beta^{\delta}\gamma^{\alpha+1}$. This prior is obtained by writing $g(c, k) = g_1(k \mid c)\,g_2(c)$, where

$$g_1(k \mid c) = \frac{c^{\alpha+1}}{\Gamma(\alpha+1)\gamma^{\alpha+1}}\,k^{\alpha}e^{-kc/\gamma} \quad (\alpha > -1,\ \gamma > 0), \tag{15a}$$

is the conjugate gamma prior, first used by Papadopoulos (1978) and AL-Hussaini et al. (1992) when c was assumed known, and

$$g_2(c) = \frac{1}{\Gamma(\delta)\beta^{\delta}}\,c^{\delta-1}e^{-c/\beta} \quad (\beta > 0,\ \delta > 0), \tag{15b}$$

is a gamma density function. Multiplying $g_1(k \mid c)$ by $g_2(c)$, we obtain the bivariate density of c and k given by (15). It may be remarked that the four-parameter gamma-gamma prior (15) is chosen so as to be rich enough to cover the prior beliefs of the experimenter.

It follows, from (12) and (15), that the joint posterior density of c and k is given by
$$q(c, k \mid x) = B\,c^{r+\alpha+\delta}k^{r+\alpha}v(c; x)\exp\Big\{-\Big[k\Big(T + \frac{c}{\gamma}\Big) + \frac{c}{\beta}\Big]\Big\}, \quad c > 0,\ k > 0, \tag{16}$$

where

$$B^{-1} = \Gamma(r+\alpha+1)\int_0^{\infty}c^{r+\alpha+\delta}v(c; x)\Big(T + \frac{c}{\gamma}\Big)^{-(r+\alpha+1)}\exp\Big(-\frac{c}{\beta}\Big)dc. \tag{17}$$
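Sampling from the gamma-gamma prior (15) is by composition: draw c from the gamma density (15b) and then k given c from the gamma density (15a). A minimal sketch (numpy parameterizes the gamma by shape and scale; the function name is ours):

```python
import numpy as np

def sample_prior(alpha, beta, gamma_, delta, size=1, rng=None):
    """Draw (c, k) from the bivariate prior (15):
    c ~ Gamma(shape=delta, scale=beta)              ... Eq. (15b)
    k | c ~ Gamma(shape=alpha + 1, scale=gamma_/c)  ... Eq. (15a)
    """
    rng = np.random.default_rng(rng)
    c = rng.gamma(shape=delta, scale=beta, size=size)
    k = rng.gamma(shape=alpha + 1.0, size=size) * (gamma_ / c)
    return c, k
```

With (α, β, γ, δ) = (4.0, 6.0, 5.0, 1.0), this is the composition used to produce (c, k) = (4.358, 3.987) in Example 1 below (up to the random stream).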
Following Aitchison and Dunsmore (1975), the Bayes predictive density function of y given x, denoted by $p(y \mid x)$, is defined by

$$p(y \mid x) = \int_{\Lambda}h(y \mid \lambda)\,q(\lambda \mid x)\,d\lambda, \tag{18}$$
where $h(y \mid \lambda)$ and $q(\lambda \mid x)$ denote the population and posterior densities, respectively. We shall write $p_1(y \mid x)$ and $p_2(y \mid x)$ for the predictive densities in the presence of an outlier of type $kk_0$ and of type $(k+k_0)$, respectively.

Substituting (7) and (16) in (18), the predictive density of $y_s$ given x, in the presence of a single outlier of type $kk_0$, is given by

$$p_1(y_s \mid x) = \int_0^{\infty}\!\!\int_0^{\infty}h_1(y_s \mid c, k)\,q(c, k \mid x)\,dk\,dc = B\,D(s)\,\Gamma(\xi)\int_0^{\infty}c^{\eta}\phi(y_s; c)\,v(c; x)\exp\Big(-\frac{c}{\beta}\Big)\Big\{(m+k_0-s)\sum_{j=0}^{s-1}a_{1j}(s)\,[v_{1j}(y_s; c)]^{-\xi} + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\big\{[v_{2j}(y_s; c)]^{-\xi} - [v_{3j}(y_s; c)]^{-\xi}\big\}\Big\}dc, \tag{19}$$

where

$$\xi = r+\alpha+2, \qquad \eta = r+\alpha+\delta+1,$$
$$v_{1j}(y_s; c) = (m+k_0-s+j)\,\psi(y_s; c) + T + c/\gamma,$$
$$v_{2j}(y_s; c) = (m-s+j+1)\,\psi(y_s; c) + T + c/\gamma, \tag{20}$$
$$v_{3j}(y_s; c) = (m+k_0-s+j+1)\,\psi(y_s; c) + T + c/\gamma.$$
Substituting the value of B, given by (17), in (19), we obtain

$$p_1(y_s \mid x) = \frac{D(s)(\xi-1)}{I_0(x)}\Big\{(m+k_0-s)\sum_{j=0}^{s-1}a_{1j}(s)\,I_{1j}(y_s; x) + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\,[I_{2j}(y_s; x) - I_{3j}(y_s; x)]\Big\}, \tag{21}$$

where

$$I_0(x) = \int_0^{\infty}c^{\eta-1}v(c; x)\Big(T + \frac{c}{\gamma}\Big)^{-(\xi-1)}\exp\Big(-\frac{c}{\beta}\Big)dc, \tag{22}$$

and, for $i = 1, 2, 3$,

$$I_{ij}(y_s; x) = \int_0^{\infty}c^{\eta}\phi(y_s; c)\,v(c; x)\,[v_{ij}(y_s; c)]^{-\xi}\exp\Big(-\frac{c}{\beta}\Big)dc. \tag{23}$$
Prediction bounds for $Y_s$ are obtained by evaluating $P[Y_s \ge \theta \mid x]$ for some positive θ. It follows from (21) that

$$P[Y_s \ge \theta \mid x] = \int_{\theta}^{\infty}p_1(y_s \mid x)\,dy_s = \frac{D(s)}{I_0(x)}\Big\{(m+k_0-s)\sum_{j=0}^{s-1}\frac{a_{1j}(s)}{m+k_0-s+j}\,I^*_{1j}(\theta; x) + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\Big[\frac{I^*_{2j}(\theta; x)}{m-s+j+1} - \frac{I^*_{3j}(\theta; x)}{m+k_0-s+j+1}\Big]\Big\}, \tag{24}$$

where, for $i = 1, 2, 3$,

$$I^*_{ij}(\theta; x) = \int_0^{\infty}c^{\eta-1}v(c; x)\,[v_{ij}(\theta; c)]^{-(\xi-1)}\exp\Big(-\frac{c}{\beta}\Big)dc, \tag{25}$$

and the $v_{ij}(\theta; c)$ are as given by (20) with $y_s$ replaced by θ. In the Appendix, we confirm that $p_1(y_s \mid x)$ is a density function on the positive half of the real line by proving that $P[Y_s \ge 0 \mid x] = 1$.

A 100τ% Bayesian prediction interval for $Y_s$ is such that

$$P[L(x) < Y_s < U(x)] = \tau, \tag{26}$$

where $L(x)$ and $U(x)$ are the lower and upper limits satisfying

$$P[Y_s > L(x) \mid x] = (1+\tau)/2 \quad \text{and} \quad P[Y_s > U(x) \mid x] = (1-\tau)/2. \tag{27}$$
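Evaluating (24) reduces to one-dimensional quadratures of the forms (22) and (25), after which the limits in (27) follow by root finding. The following Python sketch is our own implementation outline (helper names, quadrature and bracketing choices are assumptions, not the authors' code); it works on the log scale inside the integrands to avoid underflow of v(c; x).

```python
import numpy as np
from math import comb
from scipy.integrate import quad
from scipy.optimize import brentq

def make_predictor(x, n, m, s, k0, alpha, beta, gamma_, delta):
    x = np.asarray(x, dtype=float)
    r = x.size
    xi  = r + alpha + 2          # Eq. (20)
    eta = r + alpha + delta + 1  # Eq. (20)

    def log_core(c):
        # log of c^(eta-1) v(c;x) exp(-c/beta), with v(c;x) from Eq. (13)
        logv = np.sum((c - 1) * np.log(x) - np.log1p(x**c))
        return (eta - 1) * np.log(c) + logv - c / beta

    def T(c):  # Eq. (14)
        return np.sum(np.log1p(x**c)) + (n - r) * np.log1p(x[-1]**c)

    def Istar(a, theta):  # Eq. (25); v_ij(theta;c) = a*psi + T + c/gamma
        def g(c):
            psi = np.log1p(theta**c)  # Eq. (10)
            return np.exp(log_core(c)) * (a * psi + T(c) + c / gamma_)**(-(xi - 1))
        return quad(g, 0, np.inf)[0]

    I0_val = Istar(0.0, 1.0)  # Eq. (22): any theta works when a = 0
    D = comb(m - 1, s - 1)    # Eq. (4)

    def survival(theta):  # P[Y_s >= theta | x], Eq. (24)
        t1 = sum((-1)**j * comb(s - 1, j) / (m + k0 - s + j)
                 * Istar(m + k0 - s + j, theta) for j in range(s))
        t2 = sum((-1)**j * comb(s - 2, j)
                 * (Istar(m - s + j + 1, theta) / (m - s + j + 1)
                    - Istar(m + k0 - s + j + 1, theta) / (m + k0 - s + j + 1))
                 for j in range(s - 1))
        return D / I0_val * ((m + k0 - s) * t1 + (s - 1) * t2)

    def interval(tau, lo=1e-6, hi=10.0):  # Eq. (27); the bracket is an assumption
        L = brentq(lambda t: survival(t) - (1 + tau) / 2, lo, hi)
        U = brentq(lambda t: survival(t) - (1 - tau) / 2, lo, hi)
        return L, U

    return survival, interval
```

For Example 1 below, `make_predictor(x, n=20, m=20, s=1, k0=2.0, alpha=4.0, beta=6.0, gamma_=5.0, delta=1.0)` followed by `interval(0.95)` would be one way to reproduce a row of Table 1; the bracket passed to `brentq` must enclose each bound.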
2.1.1. Special cases
(i) Prediction of $Y_1$. To predict the minimum observation $Y_1$, which represents the first failure time in a future sample of size m, in the presence of an outlier of type $kk_0$, we set s = 1 in (24), so that

$$P[Y_1 \ge \theta_1 \mid x] = \frac{I^*_{10}(\theta_1; x)}{I_0(x)}, \tag{28}$$

where

$$I^*_{10}(\theta_1; x) = \int_0^{\infty}c^{\eta-1}v(c; x)\,[v_{10}(\theta_1; c)]^{-(\xi-1)}\exp\Big(-\frac{c}{\beta}\Big)dc,$$

$v(c; x)$ is given by (13), $v_{10}(\theta_1; c) = (m+k_0-1)\,\psi(\theta_1; c) + T + c/\gamma$ with $\psi(\theta_1; c) = \ln(1+\theta_1^c)$, and $I_0(x)$ is as given by (22). A 100τ% Bayesian prediction interval for $Y_1$ can thus be obtained by equating (28) to $(1+\tau)/2$ and $(1-\tau)/2$ and solving the resulting equations numerically for $L(x)$ and $U(x)$, the lower and upper bounds of the interval.
(ii) Prediction of $Y_m$. When s = m, prediction is for the maximum $Y_m$, which represents the last failure time in a future sample of size m. Substituting s = m in (24), and noting that D(m) = 1, we have

$$P[Y_m \ge \theta_2 \mid x] = \frac{1}{I_0(x)}\Big\{k_0\sum_{j=0}^{m-1}a_{1j}(m)\,\frac{I^*_{1j}(\theta_2; x)}{k_0+j} + (m-1)\sum_{j=0}^{m-2}a_{2j}(m)\Big[\frac{I^*_{2j}(\theta_2; x)}{j+1} - \frac{I^*_{3j}(\theta_2; x)}{k_0+j+1}\Big]\Big\}, \tag{29}$$

where, for $i = 1, 2, 3$, the $I^*_{ij}(\theta_2; x)$ are as given by (25) with θ replaced by $\theta_2$. From (29), 100τ% Bayesian prediction bounds for $Y_m$ can be obtained.

Example 1. For given values (β = 6.0, δ = 1.0), generate c = 4.358 from the gamma density (15b), and for given values (α = 4.0, γ = 5.0, c = 4.358), generate k = 3.987 from the conditional gamma density (15a). The IMSL (1984) library is used in the generation of the gamma random variates. The generated values (c = 4.358, k = 3.987) are considered the true values of the population parameters. A random sample of size 20 is then generated from a Burr(c = 4.358, k = 3.987) distribution. It is assumed that only the first 15 ordered failure times are available; in ordered form, they are:

0.3085, 0.3509, 0.4508, 0.5401, 0.5545, 0.5564, 0.5753, 0.6522, 0.6692, 0.6817, 0.6999, 0.7926, 0.7992, 0.8807, 0.8964.

Suppose that we have another sample of size 20 in the presence of an outlier of type $kk_0$. For given values of $k_0$ we want 95% prediction bounds for $Y_1$ and $Y_{20}$, the minimum and maximum of the failure times in the future sample. These bounds, with the corresponding values of $k_0$, are given in Table 1.
Table 1
95% Bayesian prediction bounds for $Y_1$ and $Y_{20}$ in the presence of a single outlier of type $kk_0$

             Y_1                            Y_20
k_0      LL       UL       Length      LL       UL       Length
1.0      0.1359   0.5288   0.3929      0.8909   1.8086   0.9176
2.0      0.1341   0.5229   0.3888      0.8855   1.7967   0.9112
3.0      0.1324   0.5174   0.3850      0.8847   1.7955   0.9108
4.0      0.1307   0.5122   0.3814      0.8845   1.7936   0.9091

LL: lower limit of the interval. UL: upper limit of the interval.
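Example 1's simulation design is easy to reproduce with any gamma and uniform generator in place of the IMSL routines; a sketch with numpy (seed and generator are our choices, so the exact draws will differ from those reported above):

```python
import numpy as np

rng = np.random.default_rng(1996)   # arbitrary seed; draws differ from the paper's

# Step 1: c from (15b) with (beta, delta) = (6.0, 1.0), then k | c from (15a)
# with (alpha, gamma) = (4.0, 5.0). The paper obtained c = 4.358, k = 3.987.
c = rng.gamma(shape=1.0, scale=6.0)
k = rng.gamma(shape=4.0 + 1.0, scale=5.0 / c)

# Step 2: a Burr(c, k) sample of size n = 20 by inverting (2),
# type-II censored at the first r = 15 ordered failure times.
u = rng.uniform(size=20)
sample = np.sort(((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c))
x = sample[:15]
```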
2.2. Outliers of type $k + k_0$
The density function of $Y_s$, in the presence of a single outlier of type $k+k_0$, in the Burr(c, k) case may be obtained by substituting (1) and (2) for f and F in (3). The values of $f^*$ and $F^*$ are those given by (1) and (2) after replacing k by $k+k_0$. This density takes the form

$$h_2(y_s \mid c, k) = D(s)\,c\,\phi(y_s; c)(1+y_s^c)^{-k(m-s+1)}\big\{[k(m-s+1)+k_0](1+y_s^c)^{-k_0}[1-(1+y_s^c)^{-k}]^{s-1} + k(s-1)[1-(1+y_s^c)^{-(k+k_0)}][1-(1+y_s^c)^{-k}]^{s-2}\big\}, \tag{30}$$

where $D(s)$ and $\phi(y_s; c)$ are as given by (4) and (6), respectively. By using the binomial expansion as before, the density function (30) may be written in the form

$$h_2(y_s \mid c, k) = D(s)\,c\,\phi(y_s; c)\Big\{[k(m-s+1)+k_0]\sum_{j=0}^{s-1}B_{1j}(y_s) + k(s-1)\sum_{j=0}^{s-2}B_{2j}(y_s)\Big\}, \tag{31}$$

where

$$B_{1j}(y_s) = a_{1j}(s)\exp[-\{k(m-s+j+1)+k_0\}\,\psi(y_s; c)], \tag{32}$$

$$B_{2j}(y_s) = a_{2j}(s)\{\exp[-k(m-s+j+1)\,\psi(y_s; c)] - \exp[-\{k(m-s+j+2)+k_0\}\,\psi(y_s; c)]\}, \tag{33}$$

and $a_{1j}(s)$, $a_{2j}(s)$ and $\psi(y_s; c)$ are given by (11) and (10), respectively.
The Bayes predictive density function of $y_s$ given x is obtained by substituting (31) and (16) in (18), to yield

$$p_2(y_s \mid x) = \frac{D(s)}{I_0(x)}\int_0^{\infty}c^{\eta}\phi(y_s; c)\,v(c; x)\exp\Big(-\frac{c}{\beta}\Big)\omega(c; y_s, x)\,dc, \tag{34}$$

where $I_0(x)$ is as given by (22),

$$\omega(c; y_s, x) = \sum_{j=0}^{s-1}a_{1j}(s)\exp[-k_0\psi(y_s; c)]\,[v_{2j}(y_s; c)]^{-\xi}\{(\xi-1)(m-s+1) + k_0\,v_{2j}(y_s; c)\} + (s-1)(\xi-1)\sum_{j=0}^{s-2}a_{2j}(s)\big\{[v_{2j}(y_s; c)]^{-\xi} - \exp[-k_0\psi(y_s; c)]\,[v_{4j}(y_s; c)]^{-\xi}\big\}, \tag{35}$$

$v_{2j}(y_s; c)$ is as given in (20), and

$$v_{4j}(y_s; c) = (m-s+j+2)\,\psi(y_s; c) + T + \frac{c}{\gamma} = v_{2j}(y_s; c) + \psi(y_s; c). \tag{36}$$
It follows from (34) that

$$P[Y_s \ge \theta \mid x] = \int_{\theta}^{\infty}p_2(y_s \mid x)\,dy_s = \frac{D(s)}{k_0 I_0(x)}\int_0^{\infty}c^{\eta-1}v(c; x)\exp\Big(-\frac{c}{\beta}\Big)H_s(c; \theta, x)\,dc, \tag{37}$$

where

$$H_s(c; \theta, x) = k_0\,c\int_{\theta}^{\infty}\omega(c; y_s, x)\,\phi(y_s; c)\,dy_s = \sum_{j=0}^{s-1}a_{1j}(s)\,G_{1j}(c; \theta, s) + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\,G_{2j}(c; \theta, s), \tag{38}$$

$$G_{1j}(c; \theta, s) = \alpha_j(s)\exp\Big[\alpha_j(s)\Big(T + \frac{c}{\gamma}\Big)\Big]\{(\xi-1)(m-s+1)\,I^{**}_{1j} + k_0\,I^{**}_{2j}\}, \tag{39}$$

$$G_{2j}(c; \theta, s) = \alpha_j(s)\,[v_{2j}(\theta; c)]^{-(\xi-1)} - (\xi-1)\,\beta_j(s)\exp\Big[\beta_j(s)\Big(T + \frac{c}{\gamma}\Big)\Big]I^{**}_{4j}, \tag{40}$$

$$\alpha_j(s) = \frac{k_0}{m-s+j+1}, \qquad \beta_j(s) = \frac{k_0}{m-s+j+2}, \tag{41}$$

and

$$I^{**}_{1j} \equiv I^{**}_{1j}(c; \theta, x) = \int_{v_{2j}(\theta; c)}^{\infty}z^{-\xi}\exp[-\alpha_j(s)z]\,dz,$$
$$I^{**}_{2j} \equiv I^{**}_{2j}(c; \theta, x) = \int_{v_{2j}(\theta; c)}^{\infty}z^{-(\xi-1)}\exp[-\alpha_j(s)z]\,dz, \tag{42}$$
$$I^{**}_{4j} \equiv I^{**}_{4j}(c; \theta, x) = \int_{v_{4j}(\theta; c)}^{\infty}z^{-\xi}\exp[-\beta_j(s)z]\,dz.$$

Integrating by parts, the integral $I^{**}_{2j}$ in (42) can be written in terms of $I^{**}_{1j}$ as follows:

$$\alpha_j(s)\,I^{**}_{2j} = [v_{2j}(\theta; c)]^{-(\xi-1)}\exp[-\alpha_j(s)\,v_{2j}(\theta; c)] - (\xi-1)\,I^{**}_{1j}. \tag{43}$$

Furthermore, it may be noted, from (41), that for all j, $\beta_j(s) = \alpha_{j+1}(s)$ and, from (36) and (20), $v_{4j}(\theta; c) = v_{2,j+1}(\theta; c)$, so that $I^{**}_{4j} = I^{**}_{1,j+1}$. Substituting (43) in (39), and substituting
$\alpha_{j+1}(s)$, $I^{**}_{1,j+1}$ for $\beta_j(s)$ and $I^{**}_{4j}$ in (40), it can be shown that the functions $G_{1j}$ and $G_{2j}$ can be rewritten in the forms

$$G_{1j}(c; \theta, s) = k_0\,[v_{2j}(\theta; c)]^{-(\xi-1)}\exp[-k_0\psi(\theta; c)] + (\xi-1)\,[(m-s+1)\,\alpha_j(s) - k_0]\exp\Big[\alpha_j(s)\Big(T + \frac{c}{\gamma}\Big)\Big]I^{**}_{1j}, \tag{44}$$

$$G_{2j}(c; \theta, s) = \alpha_j(s)\,[v_{2j}(\theta; c)]^{-(\xi-1)} - (\xi-1)\,\alpha_{j+1}(s)\exp\Big[\alpha_{j+1}(s)\Big(T + \frac{c}{\gamma}\Big)\Big]I^{**}_{1,j+1}. \tag{45}$$
Also, in the Appendix, we confirm that $p_2(y_s \mid x)$ is a density function on the positive half of the real line, by proving that $P[Y_s \ge 0 \mid x] = 1$.
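For the $(k+k_0)$ case, (37) again reduces to one-dimensional quadratures once the incomplete integrals $I^{**}_{1j}$ in (42) are evaluated numerically. A sketch of $H_s$ and the survival probability, built from (38), (44) and (45) (helper names and the quadrature strategy are ours; the nested quadrature is slow but direct):

```python
import numpy as np
from math import comb
from scipy.integrate import quad

def survival_kplus(theta, x, n, m, s, k0, alpha, beta, gamma_, delta):
    """P[Y_s >= theta | x] for a single outlier of type k + k0, via Eq. (37),
    with H_s built from Eqs. (38), (44) and (45)."""
    x = np.asarray(x, dtype=float)
    r = x.size
    xi, eta = r + alpha + 2, r + alpha + delta + 1            # Eq. (20)
    T = lambda c: np.sum(np.log1p(x**c)) + (n - r) * np.log1p(x[-1]**c)
    psi = lambda c: np.log1p(theta**c)                        # Eq. (10)

    def I1j_shifted(j, c):
        # exp[alpha_j (T + c/gamma)] * I**_1j of Eq. (42), folded for stability
        a = k0 / (m - s + j + 1)                              # alpha_j(s), Eq. (41)
        K = T(c) + c / gamma_
        lo = (m - s + j + 1) * psi(c) + K                     # v_2j(theta;c), Eq. (20)
        return quad(lambda z: z**(-xi) * np.exp(-a * (z - K)), lo, np.inf)[0]

    def H(c):                                                 # Eq. (38)
        K = T(c) + c / gamma_
        tot = 0.0
        for j in range(s):                                    # G_1j terms, Eq. (44)
            a = k0 / (m - s + j + 1)
            v2 = (m - s + j + 1) * psi(c) + K
            G1 = (k0 * v2**(-(xi - 1)) * np.exp(-k0 * psi(c))
                  + (xi - 1) * ((m - s + 1) * a - k0) * I1j_shifted(j, c))
            tot += (-1)**j * comb(s - 1, j) * G1
        for j in range(s - 1):                                # G_2j terms, Eq. (45)
            a, a1 = k0 / (m - s + j + 1), k0 / (m - s + j + 2)
            v2 = (m - s + j + 1) * psi(c) + K
            G2 = a * v2**(-(xi - 1)) - (xi - 1) * a1 * I1j_shifted(j + 1, c)
            tot += (s - 1) * (-1)**j * comb(s - 2, j) * G2
        return tot

    def log_core(c):  # log of c^(eta-1) v(c;x) exp(-c/beta), v from Eq. (13)
        return ((eta - 1) * np.log(c)
                + np.sum((c - 1) * np.log(x) - np.log1p(x**c)) - c / beta)

    I0 = quad(lambda c: np.exp(log_core(c))
              * (T(c) + c / gamma_)**(-(xi - 1)), 0, np.inf)[0]   # Eq. (22)
    num = quad(lambda c: np.exp(log_core(c)) * H(c), 0, np.inf)[0]
    return comb(m - 1, s - 1) * num / (k0 * I0)               # Eq. (37)
```

A useful sanity check, matching the Appendix, is that `survival_kplus(0.0, ...)` should return 1 up to quadrature error.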
2.2.1. Special cases
(i) Prediction of $Y_1$. By setting s = 1 in (37), we obtain

$$P[Y_1 \ge \theta_3 \mid x] = \frac{1}{k_0 I_0(x)}\int_0^{\infty}c^{\eta-1}v(c; x)\,e^{-c/\beta}H_1(c; \theta_3, x)\,dc,$$

where, from (38), (44) and (45),

$$H_1(c; \theta_3, x) = k_0\,[v_{20}(\theta_3; c)]^{-(\xi-1)}\exp[-k_0\psi(\theta_3; c)], \qquad v_{20}(\theta_3; c) = m\,\psi(\theta_3; c) + T + \frac{c}{\gamma},$$

$I_0(x)$ is as given by (22), η and ξ by (20), $\psi(\theta_3; c)$ by (10) and $v(c; x)$ by (13).

(ii) Prediction of $Y_m$. By setting s = m in (37), we obtain

$$P[Y_m \ge \theta_4 \mid x] = \frac{1}{k_0 I_0(x)}\int_0^{\infty}c^{\eta-1}v(c; x)\,e^{-c/\beta}H_m(c; \theta_4, x)\,dc,$$

where, from (38), (20) and (36),

$$H_m(c; \theta_4, x) = k_0\Big\{\sum_{j=0}^{m-1}a_{1j}(m)\Big[[v_{2j}(\theta_4; c)]^{-(\xi-1)}\exp[-k_0\psi(\theta_4; c)] - \frac{(\xi-1)\,j}{j+1}\exp\Big[\frac{k_0}{j+1}\Big(T + \frac{c}{\gamma}\Big)\Big]I^{**}_{1j}\Big] + (m-1)\sum_{j=0}^{m-2}a_{2j}(m)\Big[\frac{[v_{2j}(\theta_4; c)]^{-(\xi-1)}}{j+1} - \frac{\xi-1}{j+2}\exp\Big[\frac{k_0}{j+2}\Big(T + \frac{c}{\gamma}\Big)\Big]I^{**}_{1,j+1}\Big]\Big\},$$

$$v_{2j}(\theta_4; c) = (j+1)\,\psi(\theta_4; c) + T + \frac{c}{\gamma}, \qquad v_{4j}(\theta_4; c) = (j+2)\,\psi(\theta_4; c) + T + \frac{c}{\gamma}.$$

Table 2
95% Bayesian prediction bounds for $Y_1$ and $Y_{20}$ in the presence of a single outlier of type $k+k_0$

             Y_1                            Y_20
k_0      LL       UL       Length      LL       UL       Length
0.0      0.1359   0.5288   0.3929      0.8909   1.8086   0.9176
1.0      0.1357   0.5277   0.3920      0.8878   1.8051   0.9173
2.0      0.1348   0.5212   0.3864      0.8834   1.7995   0.9161
3.0      0.1334   0.5175   0.3841      0.8822   1.7980   0.9158

LL: lower limit of the interval. UL: upper limit of the interval.
Example 2. Using the same data as in Example 1, Bayesian prediction bounds for $Y_1$ and $Y_{20}$, the minimum and maximum of the failure times in the future sample of size 20, are obtained in the presence of a single outlier of type $k+k_0$. These bounds, with the corresponding values of $k_0$, are given in Table 2.
Concluding remarks

In this paper, single outliers of types $kk_0$ and $k+k_0$ are studied. Bayesian prediction bounds for the future observations in the homogeneous case (the no-outlier case) can be obtained by setting $k_0 = 1$ in (24) or $k_0 = 0$ in (37), leading to some of the results of AL-Hussaini and Jaheen (1995); the first lines of Tables 1 and 2 thus coincide. In this sense, some of the results presented in this paper generalize, in the two-sample case, the corresponding results obtained in the above-cited reference.

It can easily be shown that, for fixed values of c, the variance of the Burr(c, k) distribution tends to zero as k tends to infinity. This implies that as k gets larger, the observations concentrate on a shorter domain, since the curve of the density function becomes more peaked. It then follows that the lengths of the predictive intervals decrease as $k_0$ increases. Obviously, the lengths of the intervals in the $kk_0$ case should, in general, be shorter than in the $(k+k_0)$ case as $k_0$ increases.
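The variance claim can be made precise via the classical moment formula for the Burr XII distribution (a sketch of ours, not derived in the paper): for $ck > r$,

$$E[X^r] = k\,B\Big(k - \frac{r}{c},\ 1 + \frac{r}{c}\Big) = \frac{k\,\Gamma(k - r/c)\,\Gamma(1 + r/c)}{\Gamma(k + 1)},$$

and, since $\Gamma(k - r/c)/\Gamma(k+1) \sim k^{-(1+r/c)}$ as $k \to \infty$, we have $E[X^r] \sim \Gamma(1 + r/c)\,k^{-r/c} \to 0$ for fixed c; in particular both $E[X^2]$ and $(E[X])^2$, and hence the variance, tend to zero.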
Appendix

1. To show that $p_1(y_s \mid x)$ is a density function on the positive half of the real line, we prove that $P[Y_s \ge 0 \mid x] = 1$. Since $\psi(0; c) = 0$, every $v_{ij}(0; c)$ equals $T + c/\gamma$, so each $I^*_{ij}(0; x)$ in (25) reduces to $I_0(x)$. It then follows from (24) that

$$P[Y_s \ge 0 \mid x] = D(s)\Big\{(m+k_0-s)\sum_{j=0}^{s-1}\frac{a_{1j}(s)}{m+k_0-s+j} + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\Big[\frac{1}{m-s+j+1} - \frac{1}{m+k_0-s+j+1}\Big]\Big\}. \tag{A.1}$$

By using the following fact (see, e.g., Lingappaiah, 1981),

$$\sum_{j=0}^{n}(-1)^j\binom{n}{j}(j+e)^{-1} = \frac{n!}{\prod_{j=0}^{n}(j+e)} = \Big[(n+1)\binom{e+n}{n+1}\Big]^{-1}, \tag{A.2}$$

and the values of $a_{1j}(s)$ and $a_{2j}(s)$ given by (11), it follows that

$$\sum_{j=0}^{s-1}a_{1j}(s)\,\frac{1}{m+k_0-s+j} = \sum_{j=0}^{s-1}(-1)^j\binom{s-1}{j}\,[\,j+(m+k_0-s)]^{-1} = \frac{1}{s\binom{m+k_0-1}{s}},$$

$$\sum_{j=0}^{s-2}a_{2j}(s)\,\frac{1}{m-s+j+1} = \frac{1}{(s-1)\binom{m-1}{s-1}} = \frac{1}{(s-1)D(s)} \quad (\text{using (4)}),$$

and

$$\sum_{j=0}^{s-2}a_{2j}(s)\,\frac{1}{m+k_0-s+j+1} = \frac{1}{(s-1)\binom{m+k_0-1}{s-1}}.$$

Substituting in (A.1), it follows that

$$P[Y_s \ge 0 \mid x] = D(s)\Big\{\frac{m+k_0-s}{s\binom{m+k_0-1}{s}} + \frac{1}{D(s)} - \frac{1}{\binom{m+k_0-1}{s-1}}\Big\} = 1,$$

since the first and third terms inside the braces are equal and of opposite sign.

2. To show that $p_2(y_s \mid x)$ is a density function on the positive half of the real line, it follows from (37) that

$$P[Y_s \ge 0 \mid x] = \frac{D(s)}{k_0 I_0(x)}\int_0^{\infty}c^{\eta-1}v(c; x)\exp\Big(-\frac{c}{\beta}\Big)H_s(c; 0, x)\,dc, \tag{A.3}$$

where, from (38),

$$H_s(c; 0, x) = \sum_{j=0}^{s-1}a_{1j}(s)\,G_{1j}(c; 0, s) + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\,G_{2j}(c; 0, s). \tag{A.4}$$

From (44) and (45), since $\psi(0; c) = 0$ and $v_{2j}(0; c) = T + c/\gamma$,

$$G_{1j}(c; 0, s) = k_0\Big(T+\frac{c}{\gamma}\Big)^{-(\xi-1)} + (\xi-1)\,[(m-s+1)\,\alpha_j(s) - k_0]\exp\Big[\alpha_j(s)\Big(T+\frac{c}{\gamma}\Big)\Big]I^{**}_{1j}, \tag{A.5}$$

$$G_{2j}(c; 0, s) = \alpha_j(s)\Big(T+\frac{c}{\gamma}\Big)^{-(\xi-1)} - (\xi-1)\,\alpha_{j+1}(s)\exp\Big[\alpha_{j+1}(s)\Big(T+\frac{c}{\gamma}\Big)\Big]I^{**}_{1,j+1}. \tag{A.6}$$

From (42),

$$I^{**}_{1j} \equiv I^{**}_{1j}(c; 0, x) = \int_{T+c/\gamma}^{\infty}z^{-\xi}\exp[-\alpha_j(s)z]\,dz, \tag{A.7}$$

and $\alpha_j(s)$ and $\beta_j(s)$ are given by (41). In (A.5), we observe that

$$(m-s+1)\,\alpha_j(s) - k_0 = \frac{(m-s+1)k_0}{m-s+j+1} - k_0 = \frac{-jk_0}{m-s+j+1} = -j\,\alpha_j(s).$$

Combining terms, it follows from (A.4)-(A.6) that

$$H_s(c; 0, x) = \Big(T+\frac{c}{\gamma}\Big)^{-(\xi-1)}H_{1s}(c; x) - (\xi-1)\,H_{2s}(c; x), \tag{A.8}$$

where

$$H_{1s}(c; x) = k_0\sum_{j=0}^{s-1}a_{1j}(s) + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\,\alpha_j(s), \tag{A.9}$$

$$H_{2s}(c; x) = \sum_{j=0}^{s-1}j\,a_{1j}(s)\,\alpha_j(s)\exp\Big[\alpha_j(s)\Big(T+\frac{c}{\gamma}\Big)\Big]I^{**}_{1j} + (s-1)\sum_{j=0}^{s-2}a_{2j}(s)\,\alpha_{j+1}(s)\exp\Big[\alpha_{j+1}(s)\Big(T+\frac{c}{\gamma}\Big)\Big]I^{**}_{1,j+1}. \tag{A.10}$$

Since, by taking $i = j+1$,

$$\sum_{j=0}^{s-2}a_{2j}(s)\,\alpha_j(s) = \sum_{i=1}^{s-1}a_{2,i-1}(s)\,\alpha_{i-1}(s), \qquad \sum_{j=0}^{s-2}a_{2j}(s)\,\alpha_{j+1}(s)\exp\Big[\alpha_{j+1}(s)\Big(T+\frac{c}{\gamma}\Big)\Big]I^{**}_{1,j+1} = \sum_{i=1}^{s-1}a_{2,i-1}(s)\,\alpha_i(s)\exp\Big[\alpha_i(s)\Big(T+\frac{c}{\gamma}\Big)\Big]I^{**}_{1i},$$

and, from (11), we observe that

$$j\,a_{1j}(s) + (s-1)\,a_{2,j-1}(s) = 0, \qquad j = 1, 2, \ldots, s-1. \tag{A.11}$$

It follows, from (A.9), (A.11) and (A.2), that

$$H_{1s}(c; x) = k_0\sum_{j=0}^{s-1}a_{1j}(s)\Big(1 - \frac{j}{m-s+j}\Big) = k_0(m-s)\sum_{j=0}^{s-1}(-1)^j\binom{s-1}{j}(j+m-s)^{-1} = \frac{k_0(m-s)}{s\binom{m-1}{s}} = \frac{k_0}{\binom{m-1}{s-1}} = \frac{k_0}{D(s)},$$

and

$$H_{2s}(c; x) = \sum_{j=1}^{s-1}[\,j\,a_{1j}(s) + (s-1)\,a_{2,j-1}(s)]\,\alpha_j(s)\exp\Big[\alpha_j(s)\Big(T+\frac{c}{\gamma}\Big)\Big]I^{**}_{1j} = 0,$$

again by (A.11). Hence

$$H_s(c; 0, x) = \frac{k_0}{D(s)}\Big(T+\frac{c}{\gamma}\Big)^{-(\xi-1)}. \tag{A.12}$$

Substituting (A.12) in (A.3), it follows that $P[Y_s \ge 0 \mid x] = 1$.
References

Aitchison, J. and I.R. Dunsmore (1975). Statistical Prediction Analysis. Cambridge University Press, Cambridge.
AL-Hussaini, E.K. (1991). A characterization of the Burr type XII distribution. Appl. Math. Lett. 4 (1), 59-61.
AL-Hussaini, E.K. and Z.F. Jaheen (1992). Bayesian estimation of the parameters, reliability and failure rate functions of the Burr type XII failure model. J. Statist. Comput. Simulation 41, 31-40.
AL-Hussaini, E.K. and Z.F. Jaheen (1994). Approximate Bayes estimators applied to the Burr model. Comm. Statist. Theory Methods 23 (1), 99-121.
AL-Hussaini, E.K. and Z.F. Jaheen (1995). Bayesian prediction bounds for the Burr type XII failure model. Comm. Statist. Theory Methods 24 (7), 1829-1842.
AL-Hussaini, E.K., M.A. Moussa and Z.F. Jaheen (1992). Estimation under the Burr type XII failure model: a comparative study. Test 1, 33-42.
Balakrishnan, N. and R.S. Ambagaspitiya (1988). Relationships among moments of order statistics in samples from two related outlier models and some applications. Comm. Statist. Theory Methods 17, 897-911.
Barnett, V. and T. Lewis (1984). Outliers in Statistical Data. Wiley, New York.
Burr, I.W. (1942). Cumulative frequency functions. Ann. Math. Statist. 13, 215-232.
Burr, I.W. and P.J. Cislak (1968). On a general system of distributions: I. Its curve shaped characteristics; II. The sample median. J. Amer. Statist. Assoc. 63, 627-635.
Dubey, S.D. (1972). Statistical contributions to reliability engineering. ARL TR 72-0120, AD 1751261.
Dubey, S.D. (1973). Statistical treatment of certain life testing and reliability problems. ARL TR 73-0155, AD 1774537.
Evans, I.G. and A.S. Ragab (1983). Bayesian inferences given a type-2 censored sample from a Burr distribution. Comm. Statist. Theory Methods 12, 1569-1580.
Evans, R.A. and G. Simons (1979). Research on statistical procedures in reliability engineering. ARL TR 75-0154, AD A013687.
IMSL Reference Manual (1984). IMSL, Inc., Houston, TX.
Khan, A.H. and A.I. Khan (1987). Moments of order statistics from Burr's distribution and its characterization. Metron 45, 21-29.
Lewis, A.W. (1981). The Burr distribution as a general parametric family in survivorship and reliability theory applications. Ph.D. thesis, Department of Biostatistics, University of North Carolina.
Lingappaiah, G.S. (1981). Sequential life testing with spacings, exponential model. IEEE Trans. Rel. R-30 (4), 370-374.
Lingappaiah, G.S. (1983). Bayesian approach to the estimation of parameters in the Burr's distribution with outliers. J. Orissa Math. Soc. 1, 55-59.
Nigm, A.M. (1988). Prediction bounds for the Burr model. Comm. Statist. Theory Methods 17, 287-296.
Papadopoulos, A.S. (1978). The Burr distribution as a failure model from a Bayesian approach. IEEE Trans. Rel. R-27 (5), 369-371.
Rodriguez, R.N. (1977). A guide to the Burr type XII distributions. Biometrika 64 (1), 129-134.
Shah, A. and D.V. Gokhale (1993). On maximum product of spacings (MPS) estimation for Burr XII distributions. Comm. Statist. Theory Methods 22 (3), 615-641.
Tadikamalla, P.R. (1980). A look at the Burr and related distributions. Internat. Statist. Rev. 48, 337-344.