Stochastic Processes and their Applications 2 (1974) 115-126. © North-Holland Publishing Company

ON FIRST PASSAGE TIMES FOR SUMS OF DEPENDENT RANDOM VARIABLES

Allan GUT
University of Uppsala, and Institut Mittag-Leffler, Djursholm, Sweden

Received 15 March 1973
Revised 17 September 1973

Abstract. Let $S_n$, $n = 1, 2, \ldots$, denote the partial sums of integrable random variables. No assumptions about independence are made. Conditions for the finiteness of the moments of the first passage times $N(c) = \min\{n: S_n > c\,a(n)\}$, where $c > 0$ and $a(y)$ is a positive continuous function on $[0, \infty)$ such that $a(y) = o(y)$ as $y \to \infty$, are given. With the further assumption that $a(y) = y^p$, $0 \le p < 1$, a law of large numbers and the asymptotic behaviour of the moments when $c \to \infty$ are obtained. The corresponding stopped sums are also studied.

AMS Subj. Class.: Primary 60G40; Secondary 60G45, 60K05

first passage time; martingale; stopping time
1. Introduction

Let $X_1, X_2, \ldots$ be a sequence of integrable random variables, let

$$\mathcal{F}_n = \sigma\{X_1, X_2, \ldots, X_n\}, \qquad \mathcal{F}_0 = \{\emptyset, \Omega\},$$

and set

$$S_0 = 0, \qquad S_n = \sum_{\nu=1}^{n} X_\nu, \quad n = 1, 2, \ldots.$$

No assumptions about independence are made. Let the conditional expected values be

$$m_n = E\{X_n \mid \mathcal{F}_{n-1}\}, \quad n = 1, 2, \ldots,$$

and set

$$T_n = \sum_{\nu=1}^{n} m_\nu, \quad n = 1, 2, \ldots.$$

Thus $\{S_n - T_n, \mathcal{F}_n\}_{n=1}^{\infty}$ is a martingale.
A. Gut, On first passage times for sums of dependent random variables
The following conditions are considered (cf. [3]):

(i) $T_n/n \to \theta$ uniformly on $\Omega \setminus \Lambda$, where $\theta$ is a positive finite constant and $\Lambda$ is a null set.

(ii) $\sup_n E\{|X_n - m_n|^r \mid \mathcal{F}_{n-1}\} \le K_r < \infty$, $r \ge 1$.

(iii) $\sup_n E\{((X_n - m_n)^+)^\alpha \mid \mathcal{F}_{n-1}\} \le K < \infty$ for some $\alpha > 1$.

(iii′) $\sup_n E\{|X_n - m_n|^\alpha \mid \mathcal{F}_{n-1}\} \le K' < \infty$ for some $\alpha > 1$.

Obviously, (iii) and (iii′) are contained in (ii) if $r > 1$. We now introduce the first passage times

$$N(c) = \min\{n: S_n > c\,a(n)\},\qquad (1)$$

where $c > 0$ is a constant and $a(y)$ is a positive continuous function on $[0, \infty)$ such that $a(y) = o(y)$ as $y \to \infty$. The purpose of this paper is to prove that conditions (i)-(iii) imply finiteness of the $r$th moments of the first passage times and the corresponding stopped sums, $r \ge 1$, and to determine their asymptotic behaviour as $c \to \infty$ when conditions (i), (ii) and (iii′) hold and when some further assumptions are made about $a(y)$. Also, a strong law of large numbers is obtained. If $a(y) = y^p$, $0 \le p < 1$, and if $r = 1$ and $r = 2$, most results are already proved in [3]. If $X_1, X_2, \ldots$ are i.i.d. random variables with $0 < E\{X\} = \theta < \infty$, then $m_n = \theta$, $T_n = n\theta$, and so (i) is trivially true, and (ii) reduces to the assumption that $E\{|X|^r\} < \infty$. Hence the theorems reduce to results proved in [7].
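To make the hitting rule in (1) concrete, it can be sketched in a few lines of code. This is an added illustration, not part of the paper; the function name and the example increments are choices made here.

```python
# Illustrative sketch (not part of the paper): the first passage time
# N(c) = min{n : S_n > c * a(n)} for one realized sequence X_1, X_2, ...
def first_passage_time(increments, c, a):
    """Return N(c) = min{n >= 1 : S_n > c * a(n)} for the given increments,
    or None if the barrier is not crossed within the supplied sequence."""
    s = 0.0
    for n, x in enumerate(increments, start=1):
        s += x
        if s > c * a(n):
            return n
    return None

# Deterministic check: with X_n = 1 (so S_n = n) and a(y) = 1, the barrier
# c = 4.5 is first exceeded at n = 5.
print(first_passage_time([1.0] * 10, 4.5, lambda y: 1.0))  # -> 5
```

In the i.i.d. positive-mean case mentioned above, the same routine applied to simulated increments illustrates why $N(c)$ grows like $(c/\theta)^{1/(1-p)}$ when $a(y) = y^p$.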
2. Finiteness of the moments

Theorem 2.1. Let $r \ge 1$. If conditions (i)-(iii) are satisfied, then
(a) $E\{N^r\} < \infty$;
(b) $E\{S_N^r\} < \infty$.

Remark 2.2. Since $(E\{|Y|^r \mid \mathcal{G}\})^{1/r}$, where $Y$ is a random variable and $\mathcal{G}$ a $\sigma$-field, is a non-decreasing function of $r$ (see [9, p. 348]), it follows that (ii) also holds when $r < r_0$ if it holds when $r = r_0$. Thus it is always possible to choose $K_r$ such that $K_r = K_{r_0}^{r/r_0}$ for $r < r_0$.

When $r = 1$ and $r = 2$ and $a(y) = y^p$, $0 \le p < 1$, Theorem 2.1 is proved in [3]. The proof follows the same lines as the proof of [7, Theorem 2.1], in the sense that the following lemma is the crucial step.
Lemma 2.3. Let $N_n = \min\{N, n\}$, $n = 1, 2, \ldots$, and suppose that (ii) holds. Then:
(a) If $1 \le r \le 2$, then

$$E\{|S_{N_n} - T_{N_n}|^r\} \le C_1 E\{N_n\},\qquad (2)$$

where $C_1$ is a constant depending only on $r$ and $K_r$.
(b) If $r > 2$, then

$$E\{|S_{N_n} - T_{N_n}|^r\} \le C_2 E\{N_n^{r/2}\},\qquad (3)$$

where $C_2$ is a constant depending only on $r$ and $K_r$.

Proof. Define

$$U_k = \sum_{\nu=1}^{k} (X_\nu - m_\nu)\, I\{N_n \ge \nu\}, \quad k = 1, 2, \ldots, n,$$

where $I\{\cdot\}$ denotes the indicator function of the set in braces. Since $\{S_n - T_n, \mathcal{F}_n\}_{n=1}^{\infty}$ is a martingale, it follows from [5, Theorem 2.1, p. 300] that $\{U_k\}_{k=1}^{n}$ is a martingale. In particular, $U_n = S_{N_n} - T_{N_n}$.

(a) Let $1 \le r \le 2$. By [1, Theorem 9] (see also [2, Theorem 3.2]) we obtain

$$E\{|U_n|^r\} \le B_r E\Big\{\Big(\sum_{\nu=1}^{N_n} (X_\nu - m_\nu)^2\Big)^{r/2}\Big\},$$

where $B_r$ is a constant depending only on $r$. The $c_r$-inequalities, Wald's identity for martingales [4, Lemma 6] (see also [5, p. 302]) and (ii) further yield

$$E\Big\{\Big(\sum_{\nu=1}^{N_n} (X_\nu - m_\nu)^2\Big)^{r/2}\Big\} \le E\Big\{\sum_{\nu=1}^{N_n} |X_\nu - m_\nu|^r\Big\} = E\Big\{\sum_{\nu=1}^{N_n} E\{|X_\nu - m_\nu|^r \mid \mathcal{F}_{\nu-1}\}\Big\} \le K_r E\{N_n\}.$$

Therefore $E\{|S_{N_n} - T_{N_n}|^r\} \le B_r K_r E\{N_n\}$, which proves (2).

(b) Now let $r > 2$. By [2, Theorem 21.1] we obtain

$$E\{|U_n|^r\} \le B_r\Big(E\Big\{\Big(\sum_{\nu=1}^{N_n} (X_\nu - m_\nu)^2\Big)^{r/2}\Big\} + E\{(\sup\{|X_\nu - m_\nu|: 1 \le \nu \le N_n\})^r\}\Big),$$
where again $B_r$ is a constant depending only on $r$. But by (ii) and Remark 2.2,

$$E\Big\{\Big(\sum_{\nu=1}^{N_n} (X_\nu - m_\nu)^2\Big)^{r/2}\Big\} \le E\{(N_n K_2)^{r/2}\} \le K_r E\{N_n^{r/2}\},\qquad (7)$$

and by (ii) and [4, Lemma 6],

$$E\{(\sup\{|X_\nu - m_\nu|: 1 \le \nu \le N_n\})^r\} \le E\Big\{\sum_{\nu=1}^{N_n} |X_\nu - m_\nu|^r\Big\} = E\Big\{\sum_{\nu=1}^{N_n} E\{|X_\nu - m_\nu|^r \mid \mathcal{F}_{\nu-1}\}\Big\} \le E\{N_n K_r\} \le K_r E\{N_n^{r/2}\}.\qquad (8)$$

Hence

$$E\{|S_{N_n} - T_{N_n}|^r\} \le 2 B_r K_r E\{N_n^{r/2}\},\qquad (9)$$

which proves (3).
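The optional-stopping fact used above, that $\{U_k\}$ is a martingale and hence $E\{U_n\} = E\{S_{N_n} - T_{N_n}\} = 0$, can be checked exactly in a toy case. The following enumeration is an added illustration, not part of the paper; the symmetric $\pm 1$ walk and the particular stopping rule are assumptions made here.

```python
# Illustration (added; not from the paper): for symmetric +/-1 steps,
# m_nu = 0 and T_n = 0, so U_n = S_{N_n} - T_{N_n} = S_{N_n}.  The stopped
# sequence is a martingale, hence E{S_{N_n}} = 0; we verify this exactly by
# enumerating all eight equally likely paths for N = min{n : S_n > 0}
# truncated at n = 3.
from itertools import product
from fractions import Fraction

def stopped_sum(path, barrier=0, horizon=3):
    """S_{N_n} with N = min{n : S_n > barrier} and N_n = min(N, horizon)."""
    s = 0
    for n, x in enumerate(path, start=1):
        s += x
        if s > barrier or n == horizon:
            return s
    return s

paths = list(product([1, -1], repeat=3))
expectation = sum(Fraction(stopped_sum(p)) for p in paths) / len(paths)
print(expectation)  # -> 0
```

The four paths starting with $+1$ stop immediately with $S_{N_3} = 1$, and the remaining four contribute $1, -1, -1, -3$, so the average is exactly $0$.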
Remark 2.4. In fact it has been proved that, if $r > 2$,

$$E\{|S_{N_n} - T_{N_n}|^r\} \le B_r K_2^{r/2} E\{N_n^{r/2}\} + B_r K_r E\{N_n\}.\qquad (9')$$

Remark 2.5. The proof shows that one can choose $C_1 = B_r K_r$ and $C_2 = 2 B_r K_r$.

Proof of Theorem 2.1. Let $\varepsilon$, $0 < \varepsilon < \frac{1}{3}\theta$, be given. From (i) it follows
that

$$(\theta - \varepsilon)N \le T_N + L_1 \quad \text{on } \Omega \setminus \Lambda,\qquad (10)$$

where $L_1$ is a constant depending only on $\varepsilon$ and $\theta$. Also,

$$|m_N| \le 2\varepsilon N + L_2 \quad \text{on } \Omega \setminus \Lambda,\qquad (11)$$

where $L_2$ is a constant depending only on $\varepsilon$ and $\theta$.

Let $\|Y\|_r$ denote $(E\{|Y|^r\})^{1/r}$ for a random variable $Y$. Then

$$\|T_N\|_r \le \|S_N - T_N\|_r + \|S_N\|_r.\qquad (12)$$

From $c\,a(N) < S_N \le c\,a(N-1) + X_N$, (ii), (11) and [4, Lemma 6] we obtain
$$c\|a(N)\|_r \le \|S_N\|_r \le c\|a(N-1)\|_r + \|X_N\|_r \le c\|a(N-1)\|_r + \|X_N - m_N\|_r + \|m_N\|_r$$
$$\le c\|a(N-1)\|_r + \Big(E\Big\{\sum_{\nu=1}^{N} E\{|X_\nu - m_\nu|^r \mid \mathcal{F}_{\nu-1}\}\Big\}\Big)^{1/r} + 2\varepsilon\|N\|_r + L_2$$
$$\le c\|a(N-1)\|_r + (E\{N K_r\})^{1/r} + 2\varepsilon\|N\|_r + L_2.\qquad (13)$$

Obviously (10)-(13) also hold for $N_n$. Therefore

$$(\theta - 3\varepsilon)\|N_n\|_r \le \|S_{N_n} - T_{N_n}\|_r + c\|a(N_n - 1)\|_r + (E\{N_n K_r\})^{1/r} + L_1 + L_2.\qquad (14)$$

First, let $a(y) \equiv 1$. The theorem is then true when $r = 1$, so assume that $r > 1$. If $1 < r \le 2$, it follows from (2) and from the fact that $E\{N\} < \infty$ that

$$(\theta - 3\varepsilon)\|N_n\|_r \le (C_1 E\{N_n\})^{1/r} + L_1 + L_2 + c + (E\{N_n K_r\})^{1/r} \le c + (C_1 E\{N\})^{1/r} + (K_r E\{N\})^{1/r} + L_1 + L_2.\qquad (15)$$

Hence, by Fatou's lemma, $E\{N^r\} < \infty$ and, by (13), $E\{S_N^r\} < \infty$. If $2 < r \le 4$, (3) and the fact that $E\{N^{r/2}\} < \infty$ similarly yield

$$(\theta - 3\varepsilon)\|N_n\|_r \le c + (C_2 E\{N^{r/2}\})^{1/r} + (K_r E\{N\})^{1/r} + L_1 + L_2.\qquad (16)$$

Hence, by Fatou's lemma, $E\{N^r\} < \infty$ and, by (13), $E\{S_N^r\} < \infty$. Finally, suppose that $2^k < r \le 2^{k+1}$, $k \ge 2$. The conclusion follows by induction on $k$: if the assertion holds up to some $k$, then, since $2^{k-1} < \frac{1}{2}r \le 2^k$, the right-hand side of (16) is again finite.
Now let $a(y)$ be as in (1). Since $a(y) = o(y)$ as $y \to \infty$, there exist $\delta > 0$ and $a_0$ such that $c\,a(y) \le a_0 + \delta y$ for all $y \ge 0$. Choose $\delta < \theta$ and define $N^* = \min\{n: S_n > a_0 + \delta n\}$. Now

$$N^* = \min\{n: S_n - \delta n > a_0\} = \min\Big\{n: \sum_{\nu=1}^{n} (X_\nu - \delta) > a_0\Big\} = N^*(a_0).$$

The sequence $\{X_\nu - \delta\}_{\nu=1}^{\infty}$ obviously satisfies conditions (i)-(iii) if the sequence $\{X_\nu\}_{\nu=1}^{\infty}$ does (although with different constants). Hence, from above, $E\{(N^*(a_0))^r\} < \infty$, and since $N \le N^*(a_0)$, it follows that $E\{N^r\} < \infty$. From $c\,a(y) \le a_0 + \delta y$, $y \ge 0$, and (13), it further follows that

$$\|S_N\|_r \le a_0 + \delta\|N\|_r + (E\{N K_r\})^{1/r} + 2\varepsilon\|N\|_r + L_2 < \infty,$$

since $E\{N^r\} < \infty$. Thus $E\{S_N^r\} < \infty$, and the proof of the theorem is complete.

The following corollary is a consequence of Theorem 2.1 and Lemma 2.3.

Corollary 2.6. Suppose that conditions (i)-(iii) are satisfied. Then:
(a) If $1 \le r \le 2$, then

$$E\{|S_N - T_N|^r\} \le C_1 E\{N\} < \infty,\qquad (17)$$

where $C_1$ is a constant depending only on $r$ and $K_r$.
(b) If $r > 2$, then

$$E\{|S_N - T_N|^r\} \le C_2 E\{N^{r/2}\} < \infty,\qquad (18)$$

where $C_2$ is a constant depending only on $r$ and $K_r$.
Proof. By Fatou's lemma,

$$E\{|S_N - T_N|^r\} \le \liminf_{n \to \infty} E\{|S_{N_n} - T_{N_n}|^r\}.$$

Furthermore, $N_n \le N$. These results combined with Lemma 2.3 prove the first inequalities if $r > 1$. The finiteness follows from Theorem 2.1. If $r = 1$, then by [4, Lemma 6], (ii), and Theorem 2.1,
$$E\{|S_N - T_N|\} \le E\Big\{\sum_{\nu=1}^{N} E\{|X_\nu - m_\nu| \mid \mathcal{F}_{\nu-1}\}\Big\} \le K_1 E\{N\} < \infty.$$
Set $M_n = \max\{S_k: 1 \le k \le n\}$ and let $a(y) \equiv 1$. Since $\{N > n\} = \{M_n \le c\}$, a simple calculation shows that $E\{N^r\} < \infty$ if and only if $\sum_{n=1}^{\infty} n^{r-1} P\{M_n \le c\} < \infty$. Hence the following corollary is true.

Corollary 2.7. Let $r \ge 1$. Suppose that conditions (i)-(iii) are satisfied. Then

$$\sum_{n=1}^{\infty} n^{r-1} P\{M_n \le c\} < \infty.$$

This generalizes [8, Theorem 3], where the i.i.d. case is considered for $r = 1, 2, 3, \ldots$.

3. Asymptotics

Theorem 3.1. Let $r \ge 1$ and let $N$ be defined by (1), but with the additional assumption that $a(n) = n^p$, $n = 1, 2, \ldots$, $0 \le p < 1$. Set $q = 1 - p$. If conditions (i), (ii) and (iii′) are satisfied, then
(a) $c^{-1/q} N \to \theta^{-1/q}$ a.s. as $c \to \infty$;
(b) $c^{-r/q} E\{N^r\} \to \theta^{-r/q}$ as $c \to \infty$;
(c) $c^{-r/q} E\{S_N^r\} \to \theta^{-rp/q}$ as $c \to \infty$.
When $r = 1$ and $r = 2$, results similar to (b) and (c) have been proved in [3]. The following result was communicated personally to me by Professor Carl-Gustav Esseen.

Lemma 3.2. Let $\{Z_n\}_{n=1}^{\infty}$ be a martingale, $Z_n = \sum_{\nu=1}^{n} Y_\nu$, $n = 1, 2, \ldots$. Suppose that $E\{|Y_n|^p\} < \infty$ for all $n$, $1 \le p \le 2$. If there exists a sequence of real numbers $\{a_n\}_{n=1}^{\infty}$, $a_1 \ge a_2 \ge \ldots \ge a_n \ge 0$, such that $\sum_{n=1}^{\infty} a_n^p E\{|Y_n|^p\} < \infty$, then

$$a_n Z_n \to 0 \quad \text{a.s. as } n \to \infty.$$
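As a small numerical illustration of Lemma 3.2 (added here; not part of the paper), take i.i.d. centered increments, $a_n = 1/n$ and $p = 2$: then $\sum_n a_n^2 E\{Y_n^2\} = \sum_n \sigma^2/n^2 < \infty$, so the lemma gives $Z_n/n \to 0$ a.s. The uniform increments and the seed below are choices made for the illustration; the printed magnitude is only a loose sanity check.

```python
# Numerical illustration of Lemma 3.2 (added; not from the paper):
# Y_n i.i.d. uniform on [-1, 1] (so E{Y_n} = 0), Z_n = Y_1 + ... + Y_n,
# a_n = 1/n, p = 2.  Then sum_n a_n^2 E{|Y_n|^2} = sum_n (1/3)/n^2 < infinity,
# and the lemma gives a_n Z_n = Z_n / n -> 0 a.s.
import random

random.seed(12345)
n = 100_000
z = 0.0
for _ in range(n):
    z += random.uniform(-1.0, 1.0)
print(abs(z / n))  # small; Var(Z_n / n) = 1/(3n), so typically ~ 0.002
```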
If $p = 2$ the result coincides with [6, Theorem 2, p. 238].

Proof of Theorem 3.1. (a) By (ii),

$$\sum_{n=1}^{\infty} n^{-2} E\{(X_n - m_n)^2\} \le K_2 \sum_{n=1}^{\infty} n^{-2} < \infty.$$

An application of Lemma 3.2 with $a_n = 1/n$, $Y_n = X_n - m_n$, $Z_n = S_n - T_n$, $p = 2$ yields

$$\frac{1}{n}(S_n - T_n) \to 0 \quad \text{a.s. as } n \to \infty.$$

By (i) it follows that $S_n/n \to \theta$ a.s. as $n \to \infty$, and hence

$$\frac{X_n}{n} = \frac{S_n}{n} - \frac{S_{n-1}}{n-1} \cdot \frac{n-1}{n} \to 0 \quad \text{a.s. as } n \to \infty.$$

Since $S_n$ is finite for every $n$, it follows that $N(c) \to \infty$ a.s. as $c \to \infty$. From [10, Theorem 1], it follows that

$$\frac{S_N}{N} \to \theta \quad \text{and} \quad \frac{X_N}{N} \to 0 \quad \text{a.s. as } c \to \infty.$$

Since $cN^p < S_N \le cN^p + X_N$, it follows that

$$\frac{cN^p}{N} \to \theta \quad \text{a.s. as } c \to \infty,$$

which yields (a).

(b) By (a) and Fatou's lemma,

$$\liminf_{c \to \infty} c^{-r/q} E\{N^r\} \ge \theta^{-r/q}.$$

Thus it remains to prove
$$\limsup_{c \to \infty} c^{-r/q} E\{N^r\} \le \theta^{-r/q}.\qquad (21)$$

Set $\lambda = (c/\theta)^{1/q}$. Then we have to prove

$$\limsup_{c \to \infty} \lambda^{-r} E\{N^r\} \le 1.\qquad (22)$$

Suppose that $r = 1$. Chow [3] has proved that

$$\theta E\{N\} \le c E\{N^p\} + \text{const}.$$

If $p = 0$, there is nothing more to prove. If $0 < p < 1$, then, by Jensen's inequality,

$$\theta E\{N\} \le c (E\{N\})^p + \text{const},$$

and so $\limsup_{c \to \infty} \lambda^{-1} E\{N\} \le 1$.

Let $r > 1$. Since $E\{N^r\} < \infty$ and since $a(N) = N^p$, (10)-(13) yield

$$(\theta - 3\varepsilon)\|N\|_r \le c\|N^p\|_r + \|S_N - T_N\|_r + (K_r E\{N\})^{1/r} + L_1 + L_2.\qquad (23)$$
Since $\lambda^{-1} c\|N^p\|_r = \theta\|\lambda^{-1}N\|_{pr}^p$, (23) and Corollary 2.6 yield

$$(\theta - 3\varepsilon)\|\lambda^{-1}N\|_r \le \theta\|\lambda^{-1}N\|_{pr}^p + \lambda^{-1}(L_1 + L_2) + \lambda^{(1/r)-1}\big((C_1 E\{N\})^{1/r} + (K_r E\{N\})^{1/r}\big) \quad \text{if } 1 < r \le 2,\qquad (24)$$

$$(\theta - 3\varepsilon)\|\lambda^{-1}N\|_r \le \theta\|\lambda^{-1}N\|_{pr}^p + \lambda^{-1/2} C_2^{1/r}\|\lambda^{-1}N\|_{r/2}^{1/2} + \lambda^{(1/r)-1}(K_r E\{N\})^{1/r} + \lambda^{-1}(L_1 + L_2) \quad \text{if } r > 2.\qquad (25)$$

Note that $\|N^p\|_r = \|N\|_{pr}^p = 1$ if $p = 0$. If $0 < p \le \frac{1}{2}$, then $\|\lambda^{-1}N\|_{pr} \le \|\lambda^{-1}N\|_{r/2}$, and so

$$\limsup_{c \to \infty}(\theta - 3\varepsilon)\|\lambda^{-1}N\|_r \le \limsup_{c \to \infty} \theta\|\lambda^{-1}N\|_{r/2}^p \quad \text{if } 1 < r \le 2.\qquad (26)$$

Since $\varepsilon$, $0 < \varepsilon < \frac{1}{3}\theta$, is arbitrary, it follows that

$$\limsup_{c \to \infty} \|\lambda^{-1}N\|_r \le 1.$$
Furthermore,

$$\limsup_{c \to \infty}(\theta - 3\varepsilon)\|\lambda^{-1}N\|_r \le \limsup_{c \to \infty}\Big(\theta\|\lambda^{-1}N\|_{r/2}^p + \lambda^{-1/2} C_2^{1/r}\|\lambda^{-1}N\|_{r/2}^{1/2}\Big) \quad \text{if } r > 2.\qquad (27)$$

The conclusion follows by an induction argument of the type given in the proof of Theorem 2.1. If $\frac{1}{2} < p < 1$, then $\|\lambda^{-1}N\|_{r/2} \le \|\lambda^{-1}N\|_{pr}$, and hence (26) and (27) hold with $\|\lambda^{-1}N\|_{pr}$ in place of $\|\lambda^{-1}N\|_{r/2}$ on the right-hand sides; call these modified relations (26′) and (27′). The proof now follows by induction on $k$, where $k$ is defined by $(1/p)^k \le r < (1/p)^{k+1}$, $k = 0, 1, 2, \ldots$: since $1 < 1/p < 2$, (26′) is used when $k = 0, 1, 2, \ldots, k_0$, where $(1/p)^{k_0} \le 2 < (1/p)^{k_0+1}$, and also if $(1/p)^{k_0} < r \le 2$, and then (27′) is used when $k = k_0 + 1, k_0 + 2, \ldots$. Thus (b) is proved.

(c) The fact that $a(N) = N^p$ and (13) yield

$$\|S_N\|_r \le c\|N^p\|_r + (E\{N K_r\})^{1/r} + 2\varepsilon\|N\|_r + L_2.\qquad (28)$$
Since $\lambda = (c/\theta)^{1/q}$, we have to prove

$$\lim_{c \to \infty} \lambda^{-r} E\{S_N^r\} = \theta^r.\qquad (29)$$

By (28) and the fact that $\lambda^{-1} c\|N^p\|_r = \theta\|\lambda^{-1}N\|_{pr}^p$, it follows that

$$\|\lambda^{-1}S_N\|_r \le \theta\|\lambda^{-1}N\|_{pr}^p + \lambda^{(1/r)-1}(K_r E\{\lambda^{-1}N\})^{1/r} + 2\varepsilon\|\lambda^{-1}N\|_r + \lambda^{-1}L_2.\qquad (30)$$

Thus, by (b) and since $cN^p < S_N$,

$$\theta \le \liminf_{c \to \infty}\|\lambda^{-1}S_N\|_r \le \limsup_{c \to \infty}\|\lambda^{-1}S_N\|_r \le \theta + 2\varepsilon,$$

and the conclusion follows, since $\varepsilon$ is arbitrary.
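The normalizations in Theorem 3.1 can be sanity-checked in the degenerate case $X_n \equiv \theta$ (an added illustration, not part of the paper): then $S_n = \theta n$, $N(c) = \min\{n: \theta n > c n^p\}$ is the least $n$ with $n^q > c/\theta$, and $c^{-1/q}N$ should approach $\theta^{-1/q}$.

```python
# Sanity check of the Theorem 3.1(a) normalization (added; not from the
# paper) in the degenerate case X_n = theta, where S_n = theta * n and
# N(c) = min{n : theta * n > c * n**p}.
def n_of_c(c, theta, p):
    n = 1
    while theta * n <= c * n ** p:
        n += 1
    return n

theta, p = 2.0, 0.5
q = 1.0 - p
for c in (10.0, 100.0, 1000.0):
    print(c ** (-1.0 / q) * n_of_c(c, theta, p))  # -> approx theta**(-1/q) = 0.25
```

Here $q = \frac{1}{2}$, so $N(c)$ is essentially $(c/\theta)^2 = c^2/4$, and the printed values approach $2^{-2} = 0.25$ from above as $c$ grows.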
Finally, a slight generalization of Theorem 3.1(a), (b) is mentioned. Let $N$ be defined by (1), but with the additional assumption that $a(y)$ is non-decreasing, concave and differentiable for sufficiently large values of $y$, and regularly varying at infinity with exponent $p$, $0 \le p < 1$, i.e. $a(y) = y^p L(y)$, where $L(y)$ is slowly varying at infinity. (About regular and slow variation, see [6].) This class of barriers has been studied in [11] and later in [7]. Furthermore, let $\lambda = \lambda(c)$ denote the solution of the equation $c\,a(\lambda) = \lambda\theta$. If $c$ is sufficiently large, $\lambda$ is unique.

Theorem 3.1′. Let $r \ge 1$ and let $N$ be defined as above. If conditions (i), (ii) and (iii′) are satisfied, then
(a) $\lambda^{-1}N \to 1$ a.s. as $c \to \infty$;
(b) $\lambda^{-r}E\{N^r\} \to 1$ as $c \to \infty$.

If $L(y) \equiv 1$, $\lambda$ is as defined in Theorem 3.1 and the theorem reduces to Theorem 3.1(a), (b).

Proof. (a) The arguments in the proof of Theorem 3.1(a) and the fact that $c\,a(\lambda) = \lambda\theta$ yield

$$\frac{c\,a(N)}{N} \to \theta \quad \text{a.s. as } c \to \infty,$$

from which (a) follows by [12, Corollary 2.2].

(b) By (a) and Fatou's lemma,

$$\liminf_{c \to \infty} \lambda^{-r} E\{N^r\} \ge 1.$$

To prove the opposite inequality, we define (cf. [11])

$$N^* = N^*(c) = \min\{n: S_n > \theta\rho(\lambda)n + \theta\lambda(1 - \rho(\lambda))\} = \min\Big\{n: \sum_{\nu=1}^{n} \frac{X_\nu - \theta\rho(\lambda)}{\theta(1 - \rho(\lambda))} > \lambda\Big\},$$
where $\rho(\lambda) = \lambda a'(\lambda)/a(\lambda)$. By [7, Lemma 3.1], $\rho(\lambda) \to p$ as $\lambda \to \infty$, and except for obvious modifications it is possible to prove that $\lambda^{-r}E\{(N^*)^r\} \to 1$ as $\lambda \to \infty$ by the arguments given in the proof of that lemma. Since the line $y = \theta\rho(\lambda)x + \theta\lambda(1 - \rho(\lambda))$ is the tangent of $y = c\,a(x)$ at the point $(\lambda, \lambda\theta)$, it follows that $N \le N^*$, and since $c \to \infty$ implies that $\lambda \to \infty$, we obtain

$$\limsup_{c \to \infty} \lambda^{-r} E\{N^r\} \le 1.$$

Note. This is a revised and slightly condensed version of a technical report which appeared before [2]. The report contained a proof of Lemma 2.3(b) which was based on the ideas of [7].
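The normalizing quantity $\lambda(c)$ of Theorem 3.1′ is defined only implicitly, by $c\,a(\lambda) = \lambda\theta$. The following numerical sketch is an added illustration, not part of the paper; the bisection routine and the particular barrier $a(y) = \sqrt{y}$ are assumptions of the example. With $a(y) = \sqrt{y}$ and $\theta = 1$, the equation $c\sqrt{\lambda} = \lambda$ has the explicit solution $\lambda = c^2$, which the code reproduces.

```python
# Sketch (added; not from the paper): solve c * a(lam) = lam * theta for lam
# by bisection, using that lam * theta - c * a(lam) changes sign once for
# large lam when a(y) = o(y).
import math

def solve_lambda(c, theta, a, lo=1.0, tol=1e-9):
    hi = lo
    while theta * hi - c * a(hi) < 0:  # expand until the sign changes
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if theta * mid - c * a(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# a(y) = sqrt(y) is regularly varying with exponent p = 1/2; with theta = 1
# and c = 9, c * sqrt(lam) = lam gives lam = c**2 = 81.
print(solve_lambda(9.0, 1.0, math.sqrt))  # -> approx 81.0
```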
References

[1] D.L. Burkholder, Martingale transforms, Ann. Math. Statist. 37 (1966) 1494-1504.
[2] D.L. Burkholder, Distribution function inequalities for martingales, Ann. Probability 1 (1973) 19-42.
[3] Y.S. Chow, On the moments of some one-sided stopping rules, Ann. Math. Statist. 37 (1966) 382-387.
[4] Y.S. Chow, H. Robbins and H. Teicher, Moments of randomly stopped sums, Ann. Math. Statist. 36 (1965) 789-799.
[5] J.L. Doob, Stochastic Processes (Wiley, New York, 1953).
[6] W. Feller, An Introduction to Probability Theory and its Applications, Vol. II (Wiley, New York, 1966).
[7] A. Gut, On the moments and limit distributions of some first passage times, Techn. Report 43, Dept. of Mathematics, Univ. of Uppsala (1972); Ann. Probability 2 (2) (1974), to appear.
[8] C.C. Heyde, Some renewal theorems with application to a first passage problem, Ann. Math. Statist. 37 (1966) 699-710.
[9] M. Loève, Probability Theory, 3rd ed. (Van Nostrand, New York, 1963).
[10] W. Richter, Limit theorems for sequences of random variables with sequences of random indices, Theor. Probability Appl. 10 (1965) 74-84.
[11] D.O. Siegmund, Some one-sided stopping rules, Ann. Math. Statist. 38 (1967) 1641-1646.
[12] M. Sreehari, On a class of limit distributions for normalized sums of independent random variables, Theor. Probability Appl. 15 (1970) 258-281.