Statistics & Probability Letters 23 (1995) 221-226
On the rate of convergence in the martingale CLT

Alfredas Račkauskas

Vilnius University, Naugarduko 24, 2006 Vilnius, Lithuania

Received January 1994
Abstract

Bounds are found on the accuracy of the Gaussian approximation of discrete-time martingales with values in a multidimensional space. The bounds are with respect to the Kantorovich and the Prohorov metrics.
Keywords: Central limit theorem; Martingale; Banach space; Convergence rates
1. Introduction and results

Let $E$ be a real separable Banach space with the norm $\|\cdot\|$. By $E^*$ we denote the dual space of $E$. Recall that the Kantorovich metric $l_1$ on the space of probability measures on $E$ is defined by
\[
l_1(\mu,\nu)=\sup\Big\{\int f\,d\mu-\int f\,d\nu:\ f\in\mathrm{Lip}_1(E)\Big\},
\]
where $\mathrm{Lip}_1(E)$ denotes the set of bounded functions $f:E\to\mathbb R$ that satisfy the condition $|f(x)-f(y)|\le\|x-y\|$ for all $x,y\in E$. The Prohorov metric $\pi$ is defined by
\[
\pi(\mu,\nu)=\inf\{\varepsilon>0:\ \mu(F)\le\nu(F^\varepsilon)+\varepsilon\ \text{for all closed sets }F\subset E\},
\]
where $F^\varepsilon=\{x\in E:\ \inf_{y\in F}\|x-y\|<\varepsilon\}$. We shall write $l_1(X,Y)$ and $\pi(X,Y)$ instead of $l_1(P_X,P_Y)$ and $\pi(P_X,P_Y)$, respectively, where $P_X$ denotes the distribution of $X$. We refer to Rachev (1991) for more information about metrics on sets of probability measures.

Assume that $(X_i,\mathcal F_i)$ is a sequence of martingale differences with values in $E$, and set $S_n=\sum_{k=1}^nX_k$. The conditional expectation with respect to $\mathcal F_{k-1}$ is denoted by $E_k$. Let $Q_i$ denote the conditional covariance operator of $X_i$ given $\mathcal F_{i-1}$, that is $Q_i:E^*\to E$, $Q_ix^*=E_ix^*(X_i)X_i$, $i=1,\dots,n$. Let $Y$ be a mean zero Gaussian $E$-valued random element with covariance operator $Q$. Conditions for $S_n$ to converge in distribution to $Y$ may be found in Walk (1977) and Jakubowski (1988) for Hilbert space valued martingales, and in Rosinski (1981) and Račkauskas (1987) for the Banach space case. Recently, Rachev and Rüschendorf (1991) have estimated $l_1(S_n,Y)$. Their bounds are expressed in terms of the so-called smoothing metrics, whose finiteness seems to be a strong condition in an infinite-dimensional space. They also require each operator $Q_i$, $i=1,\dots,n$, to equal a constant operator almost surely. In this paper we prove bounds on $l_1(S_n,Y)$ and $\pi(S_n,Y)$ for martingale differences with finite third moment.
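On the real line the Kantorovich distance has a simple empirical counterpart: for two samples of equal size it equals the average gap between the order statistics. The following small numerical sketch (the function name kantorovich_1d and the Rademacher martingale used here are only illustrative choices, not part of the results below) shows how such a distance can be estimated by simulation.

import numpy as np

def kantorovich_1d(x, y):
    # For equal-size samples on the real line, the Kantorovich (Wasserstein-1)
    # distance between the two empirical distributions equals the mean absolute
    # difference of the sorted samples.
    x, y = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    return float(np.mean(np.abs(x - y)))

rng = np.random.default_rng(0)
n, m = 50, 20000                                        # n differences, m replications
xi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(n)  # normalized Rademacher differences
s = xi.sum(axis=1)                                      # realizations of S_n
g = rng.standard_normal(m)                              # sample from the limiting Gaussian law
print(kantorovich_1d(s, g))                             # small, and decreasing as n grows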
We do not assume that the conditional covariances $Q_i$ have any special structure, but we do put a restriction on the supports of $P_{X_k}$ and $P_Y$. This is due to the fact that the absence of any additional information about the martingale may cause an undesirable situation: it is known (see e.g. Bentkus and Račkauskas, 1984) that even for martingales with independent differences and finite third moment the central limit theorem may hold although neither $\pi(S_n,Y)$ nor $l_1(S_n,Y)$ admits any rate of convergence to zero. Here, following Bentkus and Račkauskas (1984), we consider an abstract Wiener space $(E,H)$ and assume that $P\{X_i\in H\}=P\{Y\in H\}=1$, $i=1,\dots,n$. We recall that an abstract Wiener space is a quadruple $(E,H,j,\mu)$, where $H$ is a separable Hilbert space, $j$ is a continuous linear injection $H\to E$ such that $j(H)$ is dense in $E$, and $\mu$ is a probability measure on $E$ such that for every $x^*\in E^*$,
\[
\int_E\exp\{ix^*(x)\}\,\mu(dx)=\exp\Big\{-\tfrac12|j^*x^*|_H^2\Big\},
\]
where $j^*$ is the adjoint transformation of $j$ and $|\cdot|_H$ denotes the norm in $H$.
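A standard example, recalled only to fix ideas, is the classical Wiener space: $E=C[0,1]$ with the supremum norm, $H$ the Cameron-Martin space of absolutely continuous functions $h$ on $[0,1]$ with $h(0)=0$ and $|h|_H^2=\int_0^1|h'(t)|^2\,dt<\infty$, $j$ the natural inclusion, and $\mu$ the Wiener measure, i.e. the distribution of the standard Brownian motion on $[0,1]$.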
Let $(x,y)$ denote the scalar product in $H$, and let $Q_i,Q:H\to H$ also denote the operators defined by $Q_ix=E_i(X_i,x)X_i$, $i=1,\dots,n$, and $Qx=E(Y,x)Y$. First we consider the case when $\sum_{k=1}^nQ_k$ converges to $Q$ monotonically. Define
\[
L_{3,H,n}=\sum_{k=1}^nE|X_k|^3,\qquad
V_{3,H,n}=E\Big|\sum_{k=1}^nE_k|X_k|^2-E|Y|^2\Big|^{3/2}.
\]
Throughout we assume that $\sigma^2=\sum_{k=1}^nE|X_k|^2=1$.

Theorem 1.1. Assume that
\[
\sum_{k=1}^nQ_k\le Q.\tag{1.1}
\]
Then there exists a finite constant $C$ such that
\[
l_1(S_n,Y)\le C\,(L_{3,H,n}+V_{3,H,n})^{1/3},\tag{1.2}
\]
\[
\pi(S_n,Y)\le C\,(L_{3,H,n}+V_{3,H,n})^{1/4}.\tag{1.3}
\]
For a linear bounded operator $S:H\to H$ define $|S|_Q=\sup\{|(Sx,x)|:\ |Q^{1/2}x|\le1\}$. Define
\[
W_{3,H,n}=E\Big|1-\sum_{k=1}^n|Q_k|_Q\Big|^{3/2}.
\]
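For instance, if $S$ and $Q$ are simultaneously diagonal, say $S=\mathrm{diag}\{s_j\}$ and $Q=\mathrm{diag}\{\lambda_j^2\}$ with $\lambda_j>0$, then $|S|_Q=\sup_j|s_j|/\lambda_j^2$; thus $|Q_k|_Q$ gauges the size of the conditional covariance $Q_k$ relative to the limiting covariance $Q$.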
Theorem 1.2. There exists a finite constant $C$ such that
\[
l_1(S_n,Y)\le C\,(L_{3,H,n}+V_{3,H,n}+W_{3,H,n})^{1/3},\tag{1.4}
\]
\[
\pi(S_n,Y)\le C\,(L_{3,H,n}+V_{3,H,n}+W_{3,H,n})^{1/4}.\tag{1.5}
\]
The quantities $W_{3,H,n}$ and $V_{3,H,n}$ control the eigenvalues of the conditional covariance operators $Q_k$. This is well seen in the case of the finite-dimensional space $E=\mathbb R^m$. In this case we may consider the approximation of $S_n$ by the standard normal random vector $\Gamma$. Then we have the following corollary.
Corollary 1.3. Let $E=\mathbb R^m$ and let $\lambda_{i1}^2\le\dots\le\lambda_{im}^2$ denote the eigenvalues of the operator $Q_i$, $i=1,\dots,n$. Then there exists a finite constant $C$ such that
\[
l_1(S_n,\Gamma)\le C\Big(\sum_{k=1}^nE\|X_k\|^3+E\Big(\sum_{i=1}^m\Big|1-\sum_{k=1}^n\lambda_{ki}^2\Big|\Big)^{3/2}\Big)^{1/3}
\]
and
\[
\pi(S_n,\Gamma)\le C\Big(\sum_{k=1}^nE\|X_k\|^3+E\Big(\sum_{i=1}^m\Big|1-\sum_{k=1}^n\lambda_{ki}^2\Big|\Big)^{3/2}\Big)^{1/4}.
\]
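For instance, if every conditional covariance is the deterministic isotropic operator $Q_i=n^{-1}I_m$, then $\lambda_{ki}^2=1/n$ and $\sum_{k=1}^n\lambda_{ki}^2=1$ for each coordinate $i$, so the eigenvalue term vanishes and the bounds reduce to $l_1(S_n,\Gamma)\le C(\sum_{k=1}^nE\|X_k\|^3)^{1/3}$ and $\pi(S_n,\Gamma)\le C(\sum_{k=1}^nE\|X_k\|^3)^{1/4}$.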
In the general case the situation is similar if the operators $Q_i$ and $Q$ are diagonal. Next let us assume that $Q_i=\mathrm{diag}\{\lambda_{i1}^2,\lambda_{i2}^2,\dots\}$, $i=1,\dots,n$, and $Q=\mathrm{diag}\{\lambda_1^2,\lambda_2^2,\dots\}$. Define
\[
T_{3,H,n}=E\Big(\sum_{j=1}^\infty\Big|\lambda_j^2-\sum_{k=1}^n\lambda_{kj}^2\Big|\Big)^{3/2}.
\]
Theorem 1.4. There exists a finite constant $C$ such that
\[
l_1(S_n,Y)\le C\,(L_{3,H,n}+T_{3,H,n})^{1/3},\tag{1.6}
\]
\[
\pi(S_n,Y)\le C\,(L_{3,H,n}+T_{3,H,n})^{1/4}.\tag{1.7}
\]
It is known (see e.g. Paulauskas and Račkauskas, 1989) that for independent differences $(X_k)$ the bounds (1.2)-(1.7) are sharp.
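By way of example, the bounds of Theorem 1.1 specialize as follows in the i.i.d. case. Let $\xi,\xi_1,\dots,\xi_n$ be i.i.d. mean zero random elements of $H$ with $E|\xi|^2=1$ and $E|\xi|^3<\infty$, and put $X_k=n^{-1/2}\xi_k$. Then $\sum_{k=1}^nQ_k=Q$, so (1.1) holds, $V_{3,H,n}=0$ and $L_{3,H,n}=n^{-1/2}E|\xi|^3$, and (1.2)-(1.3) give
\[
l_1(S_n,Y)\le C\,n^{-1/6}\,(E|\xi|^3)^{1/3},\qquad
\pi(S_n,Y)\le C\,n^{-1/8}\,(E|\xi|^3)^{1/4}.
\]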
2. Proofs

Proof of Theorem 1.1. Assume first that $\sum_{k=1}^nQ_k=Q$ a.s. According to Bentkus and Račkauskas (1984) it suffices to estimate $I=|Ef(\sum_{k=1}^nX_k)-Ef(Y)|$, where $f$ is a three times Fréchet differentiable function with bounded derivatives. To this end we use Theorem 5 from Rhee and Talagrand (1986) to obtain
\[
I\le C\varepsilon^{-2}\sum_{k=1}^nE|X_k|^3,
\]
where $\varepsilon>0$ is the smoothing parameter appearing in that reduction. This estimate results in (1.2) and (1.3) under the stated restriction.

Now let us assume condition (1.1). Since the operator $v=Q-\sum_{k=1}^nQ_k$ is positive definite, due to the Minlos-Sazonov theorem, for each $\omega$ there exists a Gaussian measure $\gamma_{v,\omega}$ on $H$ such that $\int_H(z,x)(z,y)\,d\gamma_{v,\omega}(z)=(vx,y)$. One easily verifies that the map $\omega\to\gamma_{v,\omega}$ is measurable if the set of Gaussian measures is provided with the weakest topology that makes each map $\gamma\to\int f\,d\gamma$ continuous, where $f:H\to\mathbb R$ is continuous and bounded. Define $\Omega'=\Omega\times H$, $\mathcal F'=\mathcal F\times\mathcal B(H)$ and $P'$ as the unique probability on $(\Omega',\mathcal F')$ such that for $A\in\mathcal F$ and $B\in\mathcal B(H)$,
\[
P'(A\times B)=\int_A\gamma_{v,\omega}(B)\,dP(\omega).
\]
Define, for $\omega'=(\omega,x)$,
\[
X_k'(\omega')=X_k(\omega),\quad k=1,\dots,n,\qquad X_{n+1}'(\omega')=x.
\]
It is clear that $(X_k',\mathcal F_k')$, $1\le k\le n+1$, is a martingale difference sequence, where $\mathcal F_k'=\sigma\{A\times H,\ A\in\mathcal F_k\}$ for $1\le k\le n$ and $\mathcal F_{n+1}'=\mathcal F'$. Define $S_n'=\sum_{k=1}^{n+1}X_k'$. Since
\[
\sum_{k=1}^{n+1}E_k(X_k',x)(X_k',y)=(Qx,y)\quad\text{a.s.},
\]
from the first part of the proof we obtain
\[
l_1(S_n',Y)\le C\Big(\sum_{k=1}^{n+1}E|X_k'|^3\Big)^{1/3}.\tag{2.1}
\]
By the triangle inequality and (2.1),
\[
l_1(S_n,Y)\le l_1(S_n',Y)+l_1(S_n',S_n)\le C\,(L_{3,H,n}+E|X_{n+1}'|^3)^{1/3}.\tag{2.2}
\]
Conditioned on $\sigma(X_1,\dots,X_n)$ the distribution of $X_{n+1}'$ is Gaussian, so by the well-known inequality for moments of a Gaussian random element we have
\[
E|X_{n+1}'|^3\le E(\operatorname{tr}v)^{3/2}=E\Big|\sum_{k=1}^nE_k|X_k|^2-E|Y|^2\Big|^{3/2}.
\]
Substituting this bound into (2.2) we finish the proof of (1.2). The proof of (1.3) is similar. $\square$
Proof of Theorem 1.2. Define the stopping time
\[
\tau=\min\Big\{l:\ \sum_{i=1}^{l+1}|Q_i|_Q>1\Big\},
\]
with $\tau=n$ if no such $l$ exists. Define $X_i'=X_i\chi\{i\le\tau\}$, $i=1,\dots,n$, set $S_n'=\sum_{k=1}^nX_k'$, and let $(Q_i')$ denote the corresponding conditional covariance operators. Then we have $Q\ge\sum_{k=1}^nQ_k'$. So by Theorem 1.1,
\[
l_1(S_n',Y)\le C\,(L_{3,H,n}'+V_{3,H,n}')^{1/3},\tag{2.3}
\]
where
\[
L_{3,H,n}'=\sum_{k=1}^nE|X_k'|^3\le L_{3,H,n}\tag{2.4}
\]
and
\[
V_{3,H,n}'=E\Big|\sum_{k=1}^nE_k|X_k'|^2-E|Y|^2\Big|^{3/2}
=E\Big|\sum_{k=1}^n\operatorname{tr}Q_k'-\operatorname{tr}Q\Big|^{3/2}.\tag{2.5}
\]
Since $\operatorname{tr}Q_k\le C|Q_k|_Q$, by the definition of the stopping time $\tau$ we have
\[
E\Big|\sum_{k=1}^n\operatorname{tr}Q_k'-\operatorname{tr}Q\Big|^{3/2}
\le C\Big(V_{3,H,n}+E\big(\operatorname{tr}Q_{\tau+1}\big)^{3/2}\chi\{\tau<n\}
+E\Big(\sum_{k=\tau+2}^n\operatorname{tr}Q_k\Big)^{3/2}\chi\{\tau<n\}\Big)
\le C\,(V_{3,H,n}+W_{3,H,n}+L_{3,H,n}).\tag{2.6}
\]
By the well-known inequality for Hilbert space valued martingales,
\[
E\|S_n-S_n'\|^2\le E\Big(\sum_{k=\tau+1}^nE_k|X_k|^2\Big)\le C\,(V_{3,H,n}+W_{3,H,n}+L_{3,H,n})^{2/3}.\tag{2.7}
\]
We complete the proof by collecting the estimates (2.3)-(2.7) and using the triangle inequality for the metric $l_1(S_n,Y)$. The proof of (1.5) runs similarly. $\square$

Proof of Theorem 1.4. We shall construct a new $H$-valued martingale difference sequence $(X_k',\mathcal F_k)$ such that:
(a) if we denote by $Q_k'$ the conditional covariance operator of $X_k'$ given $\mathcal F_{k-1}$, $k=1,\dots,n$, then $\sum_{k=1}^mQ_k'\le Q$ for each $m=1,\dots,n$;
(b) the operator $Q-\sum_{k=1}^nQ_k'=\mathrm{diag}\{\lambda_{nj}'^2\}$, where $\lambda_{nj}'^2\le|\lambda_j^2-\sum_{k=1}^n\lambda_{kj}^2|$, $j=1,2,\dots$;
(c) $E|X_k'|^3\le E|X_k|^3$ and $E_k|X_k-X_k'|^2\le\operatorname{tr}Q_k-\operatorname{tr}Q_k'$, $k=1,\dots,n$.

To this aim we consider $X_k$ as random elements in $H$ and we start with the construction of $X_1'$. Define $V:H\to H$ by $V=\mathrm{diag}\{\min\{1,\lambda_j/\lambda_{1j}\},\ j=1,\dots\}$. Then put $X_1'=VX_1$. It is clear that $X_1'$ is $\mathcal F_1$-measurable and $Q_1'=VQ_1V$. One easily checks that both conditions (a) and (b) are fulfilled. From the definition of the operator $V$ it is clear that $V\le I$, where $I$ denotes the identity operator, so $E|X_1'|^3\le E|X_1|^3$. Since the conditional covariance operator of $X_1-X_1'$ is equal to $(I-V)Q_1(I-V)$, we have $E|X_1-X_1'|^2\le\operatorname{tr}Q_1-\operatorname{tr}Q_1'$.

Assume that the random elements $X_1',\dots,X_m'$ are constructed. Then the construction of $X_{m+1}'$ goes as follows. Define $V=\mathrm{diag}\{\min\{1,\lambda_{mj}'/\lambda_{(m+1)j}\},\ j=1,\dots\}$, where $\lambda_{mj}'^2$ denotes the $j$th diagonal entry of $Q-\sum_{k=1}^mQ_k'$, and put $X_{m+1}'=VX_{m+1}$. Then $Q_{m+1}'=VQ_{m+1}V=\mathrm{diag}\{\min\{\lambda_{mj}'^2,\lambda_{(m+1)j}^2\},\ j=1,2,\dots\}$. By the construction it is evident that $Q\ge Q_1'+\dots+Q_{m+1}'$ and, using the inductive assumptions, we deduce
\[
\lambda_{(m+1)j}'^2=\lambda_{mj}'^2-\min\{\lambda_{mj}'^2,\lambda_{(m+1)j}^2\}\le\Big|\lambda_j^2-\sum_{k=1}^{m+1}\lambda_{kj}^2\Big|.
\]
To prove (c) we observe that still $V\le I$ and the conditional covariance operator of $X_{m+1}-X_{m+1}'$ given $\mathcal F_m$ is equal to $(I-V)Q_{m+1}(I-V)$.

Set $S_n'=\sum_{k=1}^nX_k'$. By the triangle inequality and Theorem 1.1,
\[
l_1(S_n,Y)\le l_1(S_n,S_n')+l_1(S_n',Y)
\le C\Big(\sum_{k=1}^nE|X_k'|^3+E\Big(\operatorname{tr}\Big(Q-\sum_{k=1}^nQ_k'\Big)\Big)^{3/2}\Big)^{1/3}+E\|S_n-S_n'\|.\tag{2.8}
\]
By the well-known inequality for Hilbert space valued martingales,
\[
E\|S_n-S_n'\|^2\le E\Big(\sum_{k=1}^nE_k|X_k-X_k'|^2\Big).\tag{2.9}
\]
From (c) we have
\[
\sum_{k=1}^nE_k|X_k-X_k'|^2\le\sum_{k=1}^n(\operatorname{tr}Q_k-\operatorname{tr}Q_k')\le\sum_{j=1}^\infty\Big|\lambda_j^2-\sum_{k=1}^n\lambda_{kj}^2\Big|.\tag{2.10}
\]
From (2.8)-(2.10) we obtain
\[
l_1(S_n,Y)\le C\,(L_{3,H,n}+T_{3,H,n})^{1/3}.
\]
The proof of (1.7) is similar, therefore we omit it. $\square$
Acknowledgement

The author would like to thank Professor Rüschendorf for a useful discussion.
References

Bentkus, V. and A. Račkauskas (1984), Estimates of the distance between sums of independent random elements in Banach spaces, Theory Probab. Appl. 29, 50-65.
Jakubowski, A. (1988), Tightness criteria for random measures with application to the principle of conditioning in Hilbert space, Probab. Math. Statist. 9, 95-114.
Paulauskas, V. and A. Račkauskas (1989), Approximation Theory in the Central Limit Theorem. Exact Results in Banach Spaces (Kluwer, Dordrecht).
Rachev, S.T. (1991), Probability Metrics and the Stability of Stochastic Models (Wiley, Chichester-New York).
Rachev, S.T. and L. Rüschendorf (1991), On the rate of convergence in the CLT with respect to the Kantorovich metric, preprint.
Račkauskas, A. (1987), On the martingale central limit theorem in Banach spaces, Lecture Notes in Mathematics, Vol. 1391, 329-336.
Rhee, W.S. and M. Talagrand (1986), Uniform bound in the central limit theorem for Banach space valued dependent random variables, J. Multivar. Anal. 20, 303-320.
Rosinski, J. (1981), Central limit theorem for dependent random vectors in Banach spaces, Lecture Notes in Mathematics, Vol. 939, 157-180.
Walk, H. (1977), An invariance principle for the Robbins-Monro process in a Hilbert space, Z. Wahrsch. verw. Geb. 39, 135-150.