Superposition of renewal processes in a random environment

Statistics & Probability Letters 35 (1997) 297-305

Linxiong Li*
Department of Mathematics, University of New Orleans, New Orleans, LA 70148, USA

Abstract

A random environment is modeled by an arbitrary stochastic process, the future of which is described by a $\sigma$-algebra. Baxter and Li (1994) discussed generalizations of standard limit theorems of renewal theory to the case where a random environment is involved. In this paper, we consider the superposition of renewal processes in a random environment. In particular, the central limit theorem and other limiting properties of the superposition of renewal processes in a random environment are derived. © 1997 Elsevier Science B.V.

Keywords: Central limit theorem; Residual life; Current life; $\sigma$-algebra; Reverse martingale

1. Introduction

Superposition of renewal processes has many applications in various areas. For example, consider a repairable series system that consists of $m$ independently functioning components. The system is operating if and only if all the components are operating. Assume that the components and the system are repairable in the sense that, upon failure, the failed component is immediately replaced by a new one and the system is restored to operation. If one is interested in the number of failures in $[0,t]$, then the number of failures of component $i$, $1 \le i \le m$, comprises a renewal process, and the number of failures of the system is modeled by the superposition of the renewal processes of the components.

Stochastic processes in random environments have been considered by many authors. The definitions of random environments, however, differ from one stochastic process to another. For random walks, Markov processes, or branching processes, it is usually assumed that the parameters of the corresponding process constitute a stochastic process, reflecting the variability of model parameters. For renewal processes, which do not naturally exhibit parameters, the random environment is modeled by a collection of $\sigma$-algebras. Kijima and Sumita (1986) studied generalizations of renewal processes, focusing on the uniqueness of the solutions of certain integral equations. Baxter and Li (1994) generalized many standard limit theorems of renewal theory, including the key renewal theorem and Blackwell's theorem, to the case where the environment is random. More related references can be found in Baxter and Li (1994). Superposition of renewal processes

* Tel.: +1 504 280 6040; fax: +1 504 280 5516; e-mail: [email protected].
0167-7152/97/$17.00 © 1997 Elsevier Science B.V. All rights reserved
PII S0167-7152(97)00026-6


is discussed by Teresa Lam and Lehoczky (1991), but, to the best of our knowledge, the superposition of renewal processes in a random environment has not been treated in the literature. In this paper we derive limiting properties of the superposition of renewal processes in a random environment. Section 2 provides some fundamental assumptions and necessary definitions. In Section 3, limiting properties of the number of renewals and of the distribution of the residual life in a random environment are derived. The central limit theorem for the superposition of renewal processes given a random environment is presented in Section 4.
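To fix ideas, the series-system example above can be simulated directly: merge the failure epochs of the individual component renewal processes to obtain the superposition counting process. The sketch below is purely illustrative and not part of the original paper; the gamma lifetime distributions and all parameter values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def renewal_event_times(sample_lifetime, horizon):
    """Failure (renewal) epochs of one component in [0, horizon]."""
    times, t = [], 0.0
    while True:
        t += sample_lifetime()
        if t > horizon:
            return np.array(times)
        times.append(t)

# Assumed example: m = 3 components with gamma-distributed lifetimes.
horizon = 1000.0
samplers = [lambda k=k: rng.gamma(shape=2.0, scale=0.5 * (k + 1)) for k in range(3)]

component_failures = [renewal_event_times(s, horizon) for s in samplers]
system_failures = np.sort(np.concatenate(component_failures))  # superposition N(t)

print("failures per component:", [len(f) for f in component_failures])
print("system failures on [0, 1000]:", len(system_failures))
```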

2. Notation and assumptions

For each $i$, $1 \le i \le m$, let $X_{i1}, X_{i2}, X_{i3}, \ldots$ be a sequence of nonnegative, independent, identically distributed random variables on a probability space $(\Omega, \mathcal A, P)$, equal in distribution to a random variable $X_i$. Define the $n$th partial sum $S_{i,n} = \sum_{j=1}^{n} X_{ij}$, $n \ge 1$, writing $S_{i,0} = 0$ almost surely. Further, define the counting function $N_i(t) = \sup\{n \mid S_{i,n} \le t\}$, $t \ge 0$. Let $\{Z(t), t \ge 0\}$ be a stochastic process on $(\Omega, \mathcal A, P)$ which represents the environment of the renewal processes; we call $\{Z(t), t \ge 0\}$ the environmental process. Rather than working with the environmental process directly, it is more convenient to introduce a collection of $\sigma$-algebras $\{\mathcal F_t, t \ge 0\}$, where $\mathcal F_t = \sigma\{Z(s), t \le s < \infty\}$ describes the future of the environment from time $t$ onwards. For simplicity, we write $\mathcal F = \mathcal F_0$. It is assumed that the $m$ renewal processes $\{N_i(t), t \ge 0\}$, $1 \le i \le m$, are mutually independent and are also conditionally independent given the environment. In addition, since virtually every mathematical statement in this paper holds with probability one, we avoid irritating repetition by suppressing the qualifier "almost surely" throughout. To obviate uninteresting technical difficulties, we assume that there is a version of the conditional distribution of $X_i$ given $\mathcal F_t$ which is absolutely continuous for all $t \ge 0$. The distribution function of a random variable, say $X$, is denoted $F_X$; we write $\bar F_X = 1 - F_X$, and a version of the distribution function of $X$ conditional on a $\sigma$-algebra $\mathcal G$ is denoted $F_X(\cdot \mid \mathcal G)$. We now state and discuss our main assumptions.

A1: There exist nonnegative random variables $Q$ and $D$ on $(\Omega, \mathcal A, P)$ such that $E[D] < \infty$ and
\[
P\{Q > s\} \le P\{X_i > s \mid \mathcal F_t\} \le P\{D > s\} \quad \text{for all } s, t \ge 0,\ i = 1, 2, \ldots, m.
\]

Remark. It follows immediately from A1 that $E[X_i] < \infty$. Moreover, for each fixed $x$, the process $\{P\{X_i \le x \mid \mathcal F_t\}, t \ge 0\}$ comprises a reverse martingale and hence (e.g. Doob, 1953, p. 331) there exists a random variable $Y_i$ on $(\Omega, \mathcal A, P)$ such that
\[
F_{Y_i}(x) = P\{Y_i \le x\} = P\{X_i \le x \mid \mathcal F_\infty\} = \lim_{t \to \infty} P\{X_i \le x \mid \mathcal F_t\}
\]
for all $x$, where $\mathcal F_\infty = \lim_{t \to \infty} \mathcal F_t$.

A2: The distribution function $F_{Y_i}$ of $Y_i$ is continuous and $Y_i$ is not bounded in probability, $i = 1, 2, \ldots, m$.

A3: The distribution of $X_{in}$ depends on the history of $\{N_i(t), t \ge 0\}$ and on $\{Z(t), t \ge 0\}$ only through $S_{i,n-1}$ and the environment thereafter, i.e.
\[
P\{X_{in} \le t \mid \mathcal F, S_{i,1} = s_1, S_{i,2} = s_2, \ldots, S_{i,n-1} = s_{n-1}\} = P\{X_{in} \le t \mid \mathcal F_{s_{n-1}}\}
\]
for all $n \ge 1$ and $t \ge 0$.

Lastly, we present a generalization of the Helly–Bray theorem, which may be proved by a straightforward modification of the usual proof (e.g. Chow and Teicher, 1978, p. 251).

Lemma 2.1. Let $\{W_n(t)\}$ be a sequence of finite measures on $[0,b]$ and let $W(t)$ be a continuous, finite measure on $[0,b]$ such that $W_n(t)$ converges weakly to $W(t)$ as $n \to \infty$. Further, let $g_n : [0,b] \times [a_1,a_2] \to \mathbb R$, $n = 1, 2, 3, \ldots$, be a sequence of continuous functions which converges uniformly to $g : [0,b] \times [a_1,a_2] \to \mathbb R$ as $n \to \infty$. Then
\[
\int_0^x g_n(t,u)\,dW_n(t) \to \int_0^x g(t,u)\,dW(t),
\]
uniformly in $(x,u) \in [0,b] \times [a_1,a_2]$, as $n \to \infty$.

3. Residual and current lives

For each $i$, by Assumption A1, the expected value of $Y_i$, denoted $\mu_i$, is finite, $i = 1, 2, \ldots, m$. Let $N(t) = \sum_{i=1}^m N_i(t)$, $H_i(t \mid \mathcal F) = E[N_i(t) \mid \mathcal F]$ and $H(t \mid \mathcal F) = E[N(t) \mid \mathcal F]$. Let $\gamma_i(t) = S_{i,N_i(t)+1} - t$ (respectively $\delta_i(t) = t - S_{i,N_i(t)}$) be the residual (current) life of the renewal process $\{N_i(t), t \ge 0\}$ at time $t$. Similarly, let $\gamma(t)$ ($\delta(t)$) be the time until the next (since the last) event of the superposition process $\{N(t), t \ge 0\}$ after (before) time $t$. For more discussion of residual and current lives see, for example, Karlin and Taylor (1975, Ch. 5). Further, let $Y_i^e$ denote a random variable having distribution function
\[
F_{Y_i}^e(s) = \frac{1}{\mu_i} \int_0^s \bar F_{Y_i}(x)\,dx.
\]
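The equilibrium distribution $F_{Y_i}^e$ is exactly the limiting law of the residual life (Theorem 3.1(c) below). As an informal illustration only, and outside the random-environment setting of the paper (so that $Y_i = X_i$), the following sketch estimates the residual-life distribution of an ordinary renewal process at a large time and compares it with $F^e$; the Uniform(0, 2) inter-renewal law is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

def residual_life(t):
    """gamma(t) = S_{N(t)+1} - t for one path with Uniform(0, 2) inter-renewal times."""
    s = 0.0
    while s <= t:
        s += rng.uniform(0.0, 2.0)
    return s - t

# Uniform(0, 2): mu = 1 and Fbar(x) = 1 - x/2 on [0, 2], so
# F^e(s) = (1/mu) * integral_0^s Fbar(x) dx = s - s**2/4 for 0 <= s <= 2.
t, n_paths = 100.0, 10000
samples = np.array([residual_life(t) for _ in range(n_paths)])

for s in (0.5, 1.0, 1.5, 2.0):
    print(s, round((samples <= s).mean(), 3), round(s - s**2 / 4, 3))
```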

For a single renewal process, Baxter and Li (1994) have proved the following theorem based on Assumptions A1-A3.

Theorem 3.1. For any $i = 1, 2, \ldots, m$,
(a) $\displaystyle \lim_{t \to \infty} \frac{H_i(t \mid \mathcal F)}{t} = \frac{1}{\mu_i}$;
(b) $\displaystyle \lim_{t \to \infty} [H_i(t + h \mid \mathcal F) - H_i(t \mid \mathcal F)] = \frac{h}{\mu_i}$, $h \ge 0$;
(c) $\displaystyle \lim_{t \to \infty} P\{\gamma_i(t) \le s \mid \mathcal F\} = \lim_{t \to \infty} P\{\delta_i(t) \le s \mid \mathcal F\} = F_{Y_i}^e(s)$.

Define $\beta(t) = \gamma(t) + \delta(t)$, the total life covering time $t$ of the superposition process $\{N(t), t \ge 0\}$. We obtain

Theorem 3.2.
(a) $\displaystyle \lim_{t \to \infty} \frac{H(t \mid \mathcal F)}{t} = \sum_{i=1}^m \frac{1}{\mu_i} \overset{\mathrm{def}}{=} \mu^{-1}$;
(b) $\displaystyle \lim_{t \to \infty} [H(t + h \mid \mathcal F) - H(t \mid \mathcal F)] = \frac{h}{\mu}$, $h \ge 0$;
(c) $\displaystyle \lim_{t \to \infty} P\{\gamma(t) \le s \mid \mathcal F\} = \lim_{t \to \infty} P\{\delta(t) \le s \mid \mathcal F\} = 1 - \prod_{i=1}^m \bar F_{Y_i}^e(s)$;
(d) $\displaystyle \lim_{t \to \infty} P\{\beta(t) \ge s \mid \mathcal F\} = \frac{1}{\mu} \int_s^\infty x\,dG(x)$,
where
\[
G(x) = 1 - \mu \sum_{i=1}^m \frac{1}{\mu_i}\,\bar F_{Y_i}(x) \prod_{j=1,\, j \ne i}^m \bar F_{Y_j}^e(x).
\]
In particular, when $m = 1$, (d) reduces to
\[
\lim_{t \to \infty} P\{\beta(t) \ge s \mid \mathcal F\} = \frac{1}{\mu_1} \int_s^\infty x\,dF_{Y_1}(x).
\]
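As a quick sanity check of part (c), and again outside the random-environment setting (so that $Y_i = X_i$), one can simulate the residual life of a superposition of ordinary renewal processes at a large time and compare its distribution with $1 - \prod_i \bar F_{Y_i}^e(s)$. The sketch below uses assumed uniform inter-renewal laws; it is an illustration, not part of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def superposition_residual(scales, t):
    """gamma(t): time from t to the next event over all processes; process i has
    Uniform(0, 2*c_i) inter-renewal times, so mu_i = c_i."""
    residuals = []
    for c in scales:
        s = 0.0
        while s <= t:
            s += rng.uniform(0.0, 2.0 * c)
        residuals.append(s - t)
    return min(residuals)

def Fe_bar(s, c):
    """Tail of the equilibrium distribution for Uniform(0, 2c) on [0, 2c]."""
    s = min(s, 2.0 * c)
    return 1.0 - (s - s * s / (4.0 * c)) / c

scales, t, n_paths = [1.0, 2.0], 100.0, 10000
samples = np.array([superposition_residual(scales, t) for _ in range(n_paths)])

for s in (0.25, 0.5, 1.0):
    limit = 1.0 - np.prod([Fe_bar(s, c) for c in scales])
    print(s, round((samples <= s).mean(), 3), round(limit, 3))
```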


Proof. (a) and (b) are straightforward extensions of Theorem 3.1(a) and (b), respectively, and the proofs are omitted.
(c) Since the renewal processes are conditionally independent, $P\{\gamma(t) \ge s \mid \mathcal F\} = \prod_{i=1}^m P\{\gamma_i(t) \ge s \mid \mathcal F\}$ and $P\{\delta(t) \ge s \mid \mathcal F\} = \prod_{i=1}^m P\{\delta_i(t) \ge s \mid \mathcal F\}$. Thus (c) follows on letting $t \to \infty$ and applying Theorem 3.1(c).
(d) For each $i$, let $Y_{ij}$, $j = 1, 2, \ldots$, be independent, identically distributed random variables equal in distribution to $Y_i$, $i = 1, 2, \ldots, m$. Define a renewal process $N_{Y_i}(t) = \sup\{n \mid \sum_{j=1}^n Y_{ij} \le t\}$, $t \ge 0$, and the superposition process $\{N_Y(t), t \ge 0\}$, where $N_Y(t) = \sum_{i=1}^m N_{Y_i}(t)$. Further, let $\gamma_Y(t)$, $\delta_Y(t)$ and $\beta_Y(t)$, associated with $\{N_Y(t), t \ge 0\}$, denote the analogues of $\gamma(t)$, $\delta(t)$ and $\beta(t)$, which are associated with $\{N(t), t \ge 0\}$. A result of Teresa Lam and Lehoczky (1991) states that
\[
\lim_{t \to \infty} P\{\beta_Y(t) \ge s\} = \frac{1}{\mu} \int_s^\infty x\,dG(x).
\]
Thus, it suffices to show that
\[
|P\{\beta(t) \ge s \mid \mathcal F\} - P\{\beta_Y(t) \ge s\}| \to 0 \tag{3.1}
\]

for all $s \ge 0$, as $t \to \infty$. In fact, we can prove a claim stronger than (3.1): the conditional joint distribution of $(\gamma(t), \delta(t))$ given $\mathcal F$ converges weakly to the limiting joint distribution of $(\gamma_Y(t), \delta_Y(t))$ as $t \to \infty$. To see this, notice that
\[
\{\gamma(t) \ge x,\ \delta(t) \ge y\} = \{\gamma(t - y) \ge x + y\} \quad \text{for all } x, y,
\]
and thus, by making use of (c) above, we have
\[
\lim_{t \to \infty} P\{\gamma(t) \ge x,\ \delta(t) \ge y \mid \mathcal F\} = \prod_{i=1}^m \bar F_{Y_i}^e(x + y). \tag{3.2}
\]
Meanwhile, it is well known that
\[
\lim_{t \to \infty} P\{\gamma_Y(t) \ge x,\ \delta_Y(t) \ge y\} = \prod_{i=1}^m \bar F_{Y_i}^e(x + y),
\]
which, in combination with (3.2), implies the claim and hence (3.1). $\square$

The following theorem concerns the mean convergence of $\gamma(t)$ and $\delta(t)$. In order to prove the theorem we need the following lemma (for a proof see, e.g., Chow and Teicher, 1978, p. 254).

Lemma 3.3. If nonnegative random variables $\{X_n\}$ converge to $X$ in distribution and $E[X_n^s]$ is a bounded sequence for some $s > 0$, then $E[X_n^k] \to E[X^k]$, $k = 1, 2, \ldots, [s]$, $k < s$.

Theorem 3.4. Suppose that there exists an $\alpha > 0$ such that $E[D^{2+\alpha}] < \infty$. Then
(a) $\displaystyle E[\gamma(t) \mid \mathcal F] \to \int_0^\infty \prod_{i=1}^m \bar F_{Y_i}^e(x)\,dx$;
(b) $\displaystyle E[\delta(t) \mid \mathcal F] \to \int_0^\infty \prod_{i=1}^m \bar F_{Y_i}^e(x)\,dx$,
where $\bar F_{Y_i}^e(x) = \frac{1}{\mu_i}\int_x^\infty \bar F_{Y_i}(u)\,du$. In particular, when $m = 1$, the right-hand side of (a) and (b) is $E[Y_1^2]/(2\mu_1)$.


Proof. (a) We consider, firstly, the case $m = 1$. From Kijima and Sumita (1986) the derivative, denoted $h_1(\cdot \mid \mathcal F)$, of $H_1(\cdot \mid \mathcal F)$ exists and is bounded by a constant, say $M$. By Theorem 3.1(c) and the preceding lemma, it is sufficient to show that $E[\gamma_1^{1+\alpha}(t) \mid \mathcal F]$ is bounded for all $t \ge 0$. Actually,
\begin{align*}
E[\gamma_1^{1+\alpha}(t) \mid \mathcal F]
&= E[(X_1 - t)^{1+\alpha} \mid X_1 > t, \mathcal F]\, P\{X_1 > t \mid \mathcal F\} \\
&\quad + \int_0^t E[(X_1 - (t-y))^{1+\alpha} \mid X_1 > t-y, \mathcal F_y]\, P\{X_1 > t-y \mid \mathcal F_y\}\, dH_1(y \mid \mathcal F) \\
&= \int_0^\infty P\{X_1 > t + s^{1/(1+\alpha)} \mid \mathcal F\}\, ds
  + \int_0^t \int_0^\infty P\{X_1 > t - y + s^{1/(1+\alpha)} \mid \mathcal F_y\}\, h_1(y \mid \mathcal F)\, ds\, dy \\
&\le \int_t^\infty P\{D > x\}(1+\alpha)(x-t)^\alpha\, dx
  + (1+\alpha) M \int_0^t \int_{t-y}^\infty P\{D > x\}\,[x-(t-y)]^\alpha\, dx\, dy
\end{align*}
(letting $t + s^{1/(1+\alpha)} = x$ in the first integral and $t - y + s^{1/(1+\alpha)} = x$ in the second, and then applying A1)
\[
= \int_0^\infty (1+\alpha) P\{D > s+t\}\, s^\alpha\, ds
  + (1+\alpha) M \int_0^t \int_{t-y}^\infty P\{D > x\}(x - t + y)^\alpha\, dx\, dy
\]
by change of variable. Since $\int_0^\infty (1+\alpha) P\{D > s+t\}\, s^\alpha\, ds \le E[D^{1+\alpha}] < \infty$ and, interchanging the order of integration,
\begin{align*}
(1+\alpha) M \int_0^t \int_{t-y}^\infty P\{D > x\}(x-t+y)^\alpha\, dx\, dy
&\le (1+\alpha) M \int_0^\infty P\{D > x\}\, \frac{x^{1+\alpha}}{1+\alpha}\, dx
 = \frac{1}{2+\alpha}\, M\, E[D^{2+\alpha}] < \infty,
\end{align*}
it follows that $E[\gamma_1^{1+\alpha}(t) \mid \mathcal F]$ is bounded for all $t \ge 0$. When $m > 1$, $P\{\gamma(t) \ge s \mid \mathcal F\} \le P\{\gamma_1(t) \ge s \mid \mathcal F\}$ for all $s$. This implies that $E[\gamma^{1+\alpha}(t) \mid \mathcal F]$ is bounded for all $t \ge 0$, and (a) follows.

(b) Consider the renewal process $\{N_1(t), t \ge 0\}$. The distribution of $S_{N_1(t)}$, the time of the last renewal prior to time $t$, is given in Ross (1983, p. 65) and can be straightforwardly extended to a version of the conditional distribution given $\mathcal F$, i.e.,
\[
P\{S_{N_1(t)} > x \mid \mathcal F\} = P\{X_1 > t \mid \mathcal F\} + \int_x^t P\{X_1 > t-y \mid \mathcal F_y\}\, dH_1(y \mid \mathcal F) \tag{3.3}
\]
for $t \ge x \ge 0$. We now use (3.3) to prove (b). Consider, firstly, $m = 1$. We show that $E[\delta_1^{1+\alpha}(t) \mid \mathcal F]$ is bounded for all $t \ge 0$. In fact,
\begin{align*}
E[\delta_1^{1+\alpha}(t) \mid \mathcal F]
&= E[t^{1+\alpha} \mid X_1 > t, \mathcal F]\, P\{X_1 > t \mid \mathcal F\}
 + \int_0^t E[(t-y)^{1+\alpha} \mid S_{N_1(t)} = y, \mathcal F]\, dF_{S_{N_1(t)}}(y \mid \mathcal F) \\
&\le t^{1+\alpha} P\{D > t\} + \int_0^t (t-y)^{1+\alpha} P\{X_1 > t-y \mid \mathcal F_y\}\, h_1(y \mid \mathcal F)\, dy
 \qquad \text{by (3.3)} \\
&\le t^{1+\alpha} P\{D > t\} + M \int_0^t (t-y)^{1+\alpha} P\{D > t-y\}\, dy
 \qquad \text{by A1 and the definition of } M \\
&\le t^{1+\alpha} P\{D > t\} + \frac{1}{2+\alpha}\, M\, E[D^{2+\alpha}].
\end{align*}
Since $t^{1+\alpha} P\{D > t\}$ is bounded, $E[\delta_1^{1+\alpha}(t) \mid \mathcal F]$ is also bounded for all $t \ge 0$. Thus (b) is true for $m = 1$. For $m > 1$, observing that $P\{\delta(t) \ge s \mid \mathcal F\} \le P\{\delta_1(t) \ge s \mid \mathcal F\}$ for all $s$, $E[\delta^{1+\alpha}(t) \mid \mathcal F]$ is bounded for all $t \ge 0$. This completes the proof of (b). $\square$

4. The central limit theorem

For fixed $i$, consider a sequence of random variables $Y_i^e, Y_{i1}, Y_{i2}, \ldots$, where $Y_i^e$ has distribution function $\frac{1}{\mu_i}\int_0^{\cdot} \bar F_{Y_i}(t)\,dt$. This sequence comprises an equilibrium renewal process; let $N_{Y_i}^e(t)$ denote its number of renewals in $[0,t]$. It is well known that the stochastic process $\{N_{Y_i}^e(t), t \ge 0\}$ has stationary increments. Further, let $N_i^s(t)$ denote the number of renewals of the sequence $X_{i1}, X_{i2}, \ldots$ in the interval $[s, s+t]$. Then $\{N_i^s(t), t \ge 0\}$ is a delayed renewal process (associated with $\{N_i(t), t \ge 0\}$) whose first renewal interval is $\gamma_i(s)$, the residual life at time $s$, and whose remaining intervals are $X_{i1}, X_{i2}, \ldots$. We first establish some $L_1$ convergence results for a single renewal process $\{N_i(t), t \ge 0\}$ and then use these results to derive the central limit theorem for $\{N(t), t \ge 0\}$.

Lemma 4.1. As $s \to \infty$, $E[\,|N_i^s(1) - N_{Y_i}^e(1)| \mid \mathcal F\,] \to 0$, $i = 1, 2, \ldots, m$.

Proof. First of all, let us recall the following standard result (see, e.g., Chow and Teicher, 1978, p. 101): if $\{X, X_n, n \ge 1\}$ are random variables on a probability space $(\Omega, \mathcal A, P)$, then $E[X_n I_A] \to E[X I_A]$ uniformly over $A \in \mathcal A$ as $n \to \infty$ if and only if $X_n$ converges to $X$ in mean, where $I_A$ is the indicator function of $A$. Since $N_i^s(1)$ and $N_{Y_i}^e(1)$ are integer valued, it is sufficient to show that
\[
E[N_i^s(1) I_C \mid \mathcal F] \to E[N_{Y_i}^e(1) I_C]
\]
uniformly over subsets $C \subseteq \{1, 2, \ldots\}$, as $s \to \infty$. We claim that, for fixed $n$, $P\{N_i^s(1) = n \mid \mathcal F\} \to P\{N_{Y_i}^e(1) = n\}$ as $s \to \infty$. In fact,
\begin{align*}
P\{N_i^s(1) \ge n \mid \mathcal F\}
&= P\Big\{\gamma_i(s) + \sum_{j=1}^{n-1} X_{ij} \le 1 \,\Big|\, \mathcal F\Big\} \\
&= \int_0^1 P\Big\{\sum_{j=1}^{n-1} X_{ij} \le 1 - x \,\Big|\, \mathcal F\Big\}\, dF_{\gamma_i(s)}(x \mid \mathcal F) \\
&\to \int_0^1 P\Big\{\sum_{j=1}^{n-1} Y_{ij} \le 1 - x\Big\}\, dF_{Y_i^e}(x) = P\{N_{Y_i}^e(1) \ge n\} \quad \text{as } s \to \infty,
\end{align*}
by the Remark following A1, Theorem 3.1(c) and Lemma 2.1, where the empty sum is interpreted as zero. Hence, the claim is proved by observing that $P\{N_i^s(1) = n \mid \mathcal F\} = P\{N_i^s(1) \ge n \mid \mathcal F\} - P\{N_i^s(1) \ge n+1 \mid \mathcal F\}$.

Returning to the proof of the lemma, from A1 it is easy to see that $E[N_i^s(1) \mid \mathcal F]$ is bounded in $s \ge 0$ and that $E[N_{Y_i}^e(1)] < \infty$. Hence, for any $\varepsilon > 0$, there exists an $N$ such that
\[
E[N_i^s(1) I_{[N,\infty)} \mid \mathcal F] < \varepsilon \quad \text{and} \quad E[N_{Y_i}^e(1) I_{[N,\infty)}] < \varepsilon. \tag{4.1}
\]
For any subset $C \subseteq \{1, 2, \ldots, N\}$, by the claim above, we have $E[N_i^s(1) I_C \mid \mathcal F] \to E[N_{Y_i}^e(1) I_C]$ as $s \to \infty$. Since $\{1, 2, \ldots, N\}$ contains only finitely many elements, it follows that $E[N_i^s(1) I_C \mid \mathcal F] \to E[N_{Y_i}^e(1) I_C]$ uniformly over $C \subseteq \{1, 2, \ldots, N\}$, as $s \to \infty$. Combining this with (4.1), we have that $E[N_i^s(1) I_C \mid \mathcal F] \to E[N_{Y_i}^e(1) I_C]$ uniformly over $C \subseteq \{1, 2, 3, \ldots\}$, as $s \to \infty$. This completes the proof of the lemma. $\square$

Theorem 4.2. For $i = 1, 2, \ldots, m$,
\[
E\Big[\,\Big|\frac{N_i(t)}{t} - \frac{N_{Y_i}(t)}{t}\Big| \,\Big|\, \mathcal F\Big] \to 0 \quad \text{as } t \to \infty.
\]

Proof. First of all, note that
\[
E\Big[\Big|\frac{N_i(t)}{t} - \frac{N_{Y_i}(t)}{t}\Big| \,\Big|\, \mathcal F\Big]
\le E\Big[\Big|\frac{N_i(t)}{t} - \frac{N_{Y_i}^e(t)}{t}\Big| \,\Big|\, \mathcal F\Big]
+ E\Big[\Big|\frac{N_{Y_i}^e(t)}{t} - \frac{N_{Y_i}(t)}{t}\Big|\Big].
\]
The second term on the right-hand side clearly converges to 0 as $t \to \infty$. Thus, we only consider the first term. For any given $\varepsilon > 0$, by Lemma 4.1, there exists a $c$ such that, when $s \ge c$, $E[\,|N_i^s(1) - N_{Y_i}^e(1)| \mid \mathcal F\,] < \varepsilon$. Then
\begin{align*}
E\Big[\Big|\frac{N_i(t)}{t} - \frac{N_{Y_i}^e(t)}{t}\Big| \,\Big|\, \mathcal F\Big]
&\le \frac{1}{t}\, E[\,|N_i(c) - N_{Y_i}^e(c)| \mid \mathcal F\,]
 + \frac{1}{t} \sum_{j=0}^{[t-c]-1} E[\,|N_i^{c+j}(1) - N_{Y_i}^e(1)| \mid \mathcal F\,] \\
&\quad + \frac{1}{t}\, E[\,|N_i^{c+[t-c]}(t - c - [t-c]) - N_{Y_i}^e(t - c - [t-c])| \mid \mathcal F\,]
\end{align*}
by the stationarity of the equilibrium renewal process,
\[
\le \frac{1}{t}\, E[\,|N_i(c) - N_{Y_i}^e(c)| \mid \mathcal F\,] + \frac{[t-c]}{t}\,\varepsilon
 + \frac{1}{t}\, E[\,|N_i^{c+[t-c]}(1) + N_{Y_i}^e(1)| \mid \mathcal F\,],
\]
since the remainder interval has length $t - c - [t-c] < 1$. Since $E[\,|N_i(c) - N_{Y_i}^e(c)| \mid \mathcal F\,]$ and $E[\,|N_i^{c+[t-c]}(1) + N_{Y_i}^e(1)| \mid \mathcal F\,]$ are finite (the latter is bounded in $t$), the theorem is proved by letting $t \to \infty$ (and then $\varepsilon \downarrow 0$, since $\varepsilon$ is arbitrary). $\square$

Recalling that $N(t) = \sum_{i=1}^m N_i(t)$ and $N_Y(t) = \sum_{i=1}^m N_{Y_i}(t)$, we immediately have the following theorem.


Theorem 4.3. As $t \to \infty$,
\[
E\Big[\Big|\frac{N(t)}{t} - \frac{N_Y(t)}{t}\Big| \,\Big|\, \mathcal F\Big] \to 0.
\]

The following corollary is an immediate consequence of the Markov inequality and Theorem 4.3.

Corollary 4.4. For any $\varepsilon > 0$, as $t \to \infty$,
\[
P\Big\{\Big|\frac{N(t)}{t} - \frac{N_Y(t)}{t}\Big| \ge \varepsilon \,\Big|\, \mathcal F\Big\} \to 0.
\]

Assume that $\sigma_i^2 = E[(Y_i - \mu_i)^2] < \infty$, $i = 1, 2, \ldots, m$. The standard central limit theorem for the superposition process $\{N_Y(t), t \ge 0\}$ states that, as $t \to \infty$,
\[
\frac{N_Y(t) - t\sum_{i=1}^m 1/\mu_i}{\sqrt{t \sum_{i=1}^m \sigma_i^2/\mu_i^3}}
\]
converges in distribution to a random variable having the standard normal distribution (cf., for example, Cox, 1962, p. 73). Combining this with Corollary 4.4, we obtain the following central limit theorem for $N(t)$ given $\mathcal F$.

Theorem 4.5. For any given $x$,
\[
\lim_{t \to \infty} P\bigg\{\frac{N(t) - t\sum_{i=1}^m 1/\mu_i}{\sqrt{t \sum_{i=1}^m \sigma_i^2/\mu_i^3}} \le x \,\bigg|\, \mathcal F\bigg\} = \Phi(x),
\]
where $\Phi(\cdot)$ is the distribution function of a standard normal random variable.
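As an informal numerical check only (and not part of the paper), the sketch below simulates the superposition of ordinary renewal processes (no random environment, so $Y_i = X_i$) and verifies that the standardized count is approximately standard normal. The gamma inter-renewal laws and all parameters are assumptions chosen for the illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def count_renewals(shape, scale, t, size):
    """N(t) for `size` independent paths of a renewal process with gamma(shape, scale) interarrivals."""
    counts = np.zeros(size, dtype=int)
    times = np.zeros(size)
    active = np.ones(size, dtype=bool)
    while active.any():
        times[active] += rng.gamma(shape, scale, size=active.sum())
        active &= times <= t          # keep only paths whose partial sum still lies in [0, t]
        counts[active] += 1
    return counts

# Assumed parameters: m = 3 processes with gamma(2, c_i) interarrivals,
# so mu_i = 2*c_i and sigma_i^2 = 2*c_i**2.
params = [(2.0, 0.5), (2.0, 1.0), (2.0, 1.5)]
t, n_paths = 400.0, 5000

N = sum(count_renewals(shape, scale, t, n_paths) for shape, scale in params)

mus = [shape * scale for shape, scale in params]
sig2 = [shape * scale**2 for shape, scale in params]
mean = t * sum(1.0 / m for m in mus)
var = t * sum(s / m**3 for s, m in zip(sig2, mus))

z = (N - mean) / np.sqrt(var)
print("standardized mean and variance (expect about 0 and 1):",
      round(float(z.mean()), 3), round(float(z.var()), 3))
```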

A standard result concerning the variance of an ordinary renewal process, say $\{N_{Y_i}(t), t \ge 0\}$, is that $E[N_{Y_i}^2(t)]/t^2 \to 1/\mu_i^2$ as $t \to \infty$. We now extend this result to the superposition process $\{N(t), t \ge 0\}$ given $\mathcal F$.

Corollary 4.6.
(a) For large $t$, the variance of $N(t)$ given $\mathcal F$, denoted $V(N(t) \mid \mathcal F)$, is approximately $t\sum_{i=1}^m \sigma_i^2/\mu_i^3$;
(b) As $t \to \infty$, $E[N^2(t) \mid \mathcal F]/t^2 \to \big[\sum_{i=1}^m 1/\mu_i\big]^2$.

Proof. (a) This is obvious from Theorem 4.5. (b) It is well known that $E[N^2(t) \mid \mathcal F] = V(N(t) \mid \mathcal F) + (E[N(t) \mid \mathcal F])^2$. Then, (b) follows from (a) and the fact that $E[N(t) \mid \mathcal F]/t \to \sum_{i=1}^m 1/\mu_i$ as $t \to \infty$ (Theorem 3.2(a)). $\square$

References

Baxter, L.A., Li, L., 1994. Renewal theory in a random environment. Math. Proc. Cambridge Philos. Soc. 116, 179-190.
Chow, Y.S., Teicher, H., 1978. Probability Theory. Springer, Berlin.
Cox, D., 1962. Renewal Theory. Wiley, New York.
Doob, J.L., 1953. Stochastic Processes. Wiley, New York.


Karlin, S., Taylor, H.M., 1975. A First Course in Stochastic Processes. Wiley, New York.
Kijima, M., Sumita, U., 1986. A useful generalization of renewal theory: counting processes governed by non-negative Markovian increments. J. Appl. Probab. 23, 71-88.
Ross, S.M., 1983. Stochastic Processes. Wiley, New York.
Teresa Lam, C.Y., Lehoczky, J.P., 1991. Superposition of renewal processes. Adv. Appl. Probab. 23, 64-85.