PHYSICS REPORTS (Review Section of Physics Letters) 77, No. 3 (1981) 239-247. North-Holland Publishing Company
Markov Processes, Random Fields and Dirichlet Spaces*

E.B. DYNKIN
Department of Mathematics, Cornell University, Ithaca, N.Y. 14853, U.S.A.

* Research supported by NSF Grant No. MCS 77-03543.

Contents:
1. Introduction 239
2. Symmetric Markov processes 239
3. Random fields 240
4. Path integration of generalized functions 241
5. Dirichlet spaces 242
6. Interactions 243
Appendix 246
References 247
1. Introduction

It seems that path integration extended to a certain class of generalized functions can be a useful tool in Euclidean quantum field theory and statistical mechanics. This is an example of a direct application of probability theory to physics, as opposed to the more common applications via classical analysis. We establish the following facts.

A. The free field is the Gaussian random field corresponding to the Brownian motion killed exponentially with the rate $\lambda$ (equal to the squared mass). In general, to every symmetric Markov process $X$ there corresponds a Gaussian field $\Phi$.

B. The prediction problem for $\Phi$ can be solved in terms of the hitting probabilities for $X$.

C. The field $\Phi$ has the Markov property on sets $B$, $C$ if and only if it is impossible for $X$ to reach $B$ from $C$ without crossing $B \cap C$.

D. The Dirichlet space $D$ associated with the process $X$ can be interpreted as the configuration space for the field $\Phi$. Every symmetric $k$-linear form on $D$ describes an interaction in the continuous medium determined by $\Phi$.

The results A, B, C have been presented in more detail in [1].
2. Symmetric Markov processes

Let $m$ be a measure on a space $E$. A measurable positive function $p_t(x, y)$, $t > 0$, $x, y \in E$ is a Markov transition density if

$$\int p_t(x, y)\, m(dy) \le 1$$

for all $t$, $x$, and

$$\int p_s(x, y)\, m(dy)\, p_t(y, z) = p_{s+t}(x, z)$$

for all $s$, $t$, $x$, $z$. The corresponding Green's function is defined by the formula

$$g(x, y) = \int_0^\infty p_t(x, y)\, dt.$$
With every transition density, a Markov process $X = (x_t, P_x)$ in $E$ is associated: $x_t$ is the state at time $t$, $P_x$ is the probability distribution on the space of paths given $x_0 = x$. The process $X$ is called symmetric if $p_t(x, y) = p_t(y, x)$ for all $t$, $x$, $y$.

Example 1. The Brownian motion in the $d$-dimensional Euclidean space with the killing rate $\lambda \ge 0$ is the symmetric Markov process corresponding to the Lebesgue measure $m$ and the transition density

$$p_t(x, y) = e^{-\lambda t} (2\pi t)^{-d/2} e^{-r^2/2t}$$

where $r = |y - x|$ is the Euclidean distance between $x$ and $y$. The Green function is $(2\lambda)^{-1/2} e^{-r\sqrt{2\lambda}}$ for $d = 1$ and $(2\pi r)^{-1} e^{-r\sqrt{2\lambda}}$ for $d = 3$. Instead of exponential killing, we can terminate the process at the first exit time from a domain $B$. For instance, the Brownian motion on the positive half-line $(0, \infty)$ corresponds to the transition density

$$p_t(x, y) = (2\pi t)^{-1/2} \big(e^{-(x-y)^2/2t} - e^{-(x+y)^2/2t}\big).$$

It has Green's function $g(x, y) = \min(x, y)$.

Example 2. Let $E$ be finite or countable. We consider a measure $m(B)$ equal to the cardinality of $B$ and we introduce a transition density as a family of matrices $P_t = e^{tA}$. Green's function is described by the matrix $-A^{-1}$. If $A$ is symmetric, we have a symmetric Markov process.
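Example 2 is easy to experiment with numerically. The following minimal Python sketch (the 3-state generator is an arbitrary choice, not taken from the text) checks that the matrices $e^{tA}$ are symmetric and sub-stochastic and that their time integral reproduces the Green's function $-A^{-1}$ when the chain is killed at a positive rate.

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad_vec

# Hypothetical 3-state generator (an arbitrary choice for this sketch): symmetric
# off-diagonal jump rates; every row sums to -0.5, i.e. the chain is killed at rate 0.5.
A = np.array([[-1.5,  1.0,  0.0],
              [ 1.0, -2.5,  1.0],
              [ 0.0,  1.0, -1.5]])

G = -np.linalg.inv(A)                                  # Green's function g(x, y) = (-A^{-1})[x, y]

# p_t = e^{tA} is symmetric and sub-stochastic for every t > 0 (Example 2).
for t in (0.1, 1.0, 5.0):
    P = expm(t * A)
    assert np.allclose(P, P.T)
    assert np.all(P.sum(axis=1) <= 1.0 + 1e-12)

# g(x, y) = integral of p_t(x, y) over t: compare numerical quadrature with -A^{-1}.
G_num, _ = quad_vec(lambda t: expm(t * A), 0.0, 80.0)  # the tail beyond t = 80 is negligible here
print(np.max(np.abs(G_num - G)))                       # prints a number close to 0
```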
3. Random fields

We note that, for every real $u_1, \dots, u_k$,

$$\sum_{i,k} u_i u_k\, g(x_i, x_k) = \int_0^\infty dt \int_E \Big(\sum_i u_i\, p_{t/2}(x_i, z)\Big)^2 m(dz) \ge 0.$$

Hence, if $g(x, y)$ is finite, then there exists a Gaussian family of random variables $\varphi_x$, $x \in E$, such that

$$E\varphi_x = 0, \qquad E\varphi_x \varphi_y = g(x, y).$$
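For a concrete picture, one can sample such a Gaussian family on a grid. The sketch below (grid, seed and sample size are arbitrary choices) uses the half-line Brownian motion of Example 1, whose Green's function $g(x, y) = \min(x, y)$ is finite, and draws realizations of $\varphi_x$ from a Cholesky factorization of the covariance matrix.

```python
import numpy as np

# Sample the Gaussian family (phi_x) on a grid for the half-line Brownian motion of
# Example 1, whose Green's function g(x, y) = min(x, y) is finite.  Grid, seed and
# sample size are arbitrary choices made for this illustration.
rng = np.random.default_rng(0)
xs = np.linspace(0.05, 5.0, 100)
G = np.minimum.outer(xs, xs)                            # covariance g(x_i, x_j) = min(x_i, x_j)

L = np.linalg.cholesky(G + 1e-12 * np.eye(len(xs)))     # small jitter for numerical safety
samples = rng.standard_normal((10000, len(xs))) @ L.T   # each row is one realization of phi

emp_cov = samples.T @ samples / samples.shape[0]        # empirical covariance of the samples
print(np.max(np.abs(emp_cov - G)))                      # small, and shrinks as the sample grows
```

In this particular case each row of `samples` is, in the space variable, a discretized standard Brownian path: a mean-zero Gaussian family with covariance $\min(x, y)$ is exactly a Brownian motion in $x$.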
Our construction has to be modified if $g(x, x) = \infty$ (for the Brownian motion this happens if $d \ge 2$). Instead of points, we use measures as the indices. It is natural to set

$$\varphi_\mu = \int \varphi_x\, \mu(dx)$$

if $\varphi_x$ is defined and the integral makes sense. For two measures $\mu$, $\nu$ we have $E\varphi_\mu \varphi_\nu = (\mu, \nu)$ where

$$(\mu, \nu) = \iint \mu(dx)\, g(x, y)\, \nu(dy).$$

Put $\mu \in M$ if $(\mu, \mu) < \infty$. For every symmetric process $X$ there exists a Gaussian family of random variables $\varphi_\mu$, $\mu \in M$ such that

$$E\varphi_\mu = 0, \qquad E\varphi_\mu \varphi_\nu = (\mu, \nu).$$
We call $\Phi = \{\varphi_\mu,\ \mu \in M\}$ the Gaussian random field associated with $X$. (It is defined on a probability space $(\Omega, P)$ which has no relation to the paths of the process $X$.) Let $M(B)$ stand for the set of all measures $\mu \in M$ concentrated on $B$. The family $\Phi_B = \{\varphi_\mu,\ \mu \in M(B)\}$ describes values of the field $\Phi$ on the set $B$. We say that $\Phi$ has the Markov property on a pair of subsets $B$, $C$ of $E$ if $\Phi_B$ and $\Phi_C$ are conditionally independent given $\Phi_{B \cap C}$. Under certain regularity conditions on the process $X$, the following results are true.

Theorem 1. The conditional mathematical expectation of $\varphi_\mu$ given $\Phi_B$ is $\varphi_{\mu_B}$ where $\mu_B$ is the probability distribution of $x_\tau$ at the first hitting time $\tau$ of $B$ assuming that the initial probability distribution is $\mu$. More precisely, $\mu_B(C) = P_\mu\{x_\tau \in C\}$ where $\mu$ is any $\sigma$-finite measure and $P_\mu$ is the integral of $P_x$ with respect to $\mu$.

Theorem 2. The field $\Phi$ has the Markov property on sets $B$, $C$ if

$$P_x\{x_t \notin B \cap C \text{ for } 0 \le t \le u \text{ and } x_s \in C \text{ for some } s \le u\} = P_y\{x_t \notin B \cap C \text{ for } 0 \le t \le u \text{ and } x_s \in B \text{ for some } s \le u\} = 0 \qquad (1)$$

for all $u > 0$, $x \in B$, $y \in C$.*

* This is a revised form of condition 6.1.A in [1]. The author is indebted to B. Atkinson who has pointed out to him the inaccuracy of the original wording of 6.1.A.
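In the finite setting of Example 2, Theorem 1 can be checked by elementary linear algebra. The sketch below (the generator and the set $B$ are hypothetical choices) compares the coefficients of the best linear predictor of $\varphi_x$ given $\{\varphi_b,\ b \in B\}$, computed from the covariance $G = -A^{-1}$, with the hitting distribution $P_x\{x_\tau = b,\ \tau < \zeta\}$ obtained from the generator; the two coincide.

```python
import numpy as np

# Hypothetical 4-state symmetric generator with killing (rows sum to a negative number).
A = np.array([[-2.0,  1.0,  0.5,  0.0],
              [ 1.0, -2.5,  1.0,  0.2],
              [ 0.5,  1.0, -2.0,  0.3],
              [ 0.0,  0.2,  0.3, -1.0]])
G = -np.linalg.inv(A)                     # covariance of the associated Gaussian field

B = [0, 3]                                # the set we condition on
U = [1, 2]                                # the remaining states

# Coefficients c[x, b] of the conditional expectation E(phi_x | Phi_B) = sum_b c[x, b] phi_b.
C_gauss = G[np.ix_(U, B)] @ np.linalg.inv(G[np.ix_(B, B)])

# Hitting distribution mu_B started at x in U: the vector h_b solves A_UU h + A_UB e_b = 0.
C_hit = -np.linalg.solve(A[np.ix_(U, U)], A[np.ix_(U, B)])

print(np.max(np.abs(C_gauss - C_hit)))    # ~1e-15: the two coefficient matrices coincide
```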
4. Path integration of generalized functions

The proof of theorems 1 and 2 is based on the following expression:

$$E\varphi_\mu \varphi_\nu = P_\mu \int_0^\zeta \frac{d\nu}{dm}(x_t)\, dt \qquad (2)$$
where $\zeta$ is the life time of the process $X$ and $d\nu/dm$ is the Radon-Nikodym derivative which always exists as a generalized function. The path integral of such a function over an interval $I$ is defined by the formula

$$\int_I \frac{d\nu}{dm}(x_t)\, dt = \lim_{\varepsilon \downarrow 0} \int_I p^\nu_\varepsilon(x_t)\, dt$$

where

$$p^\nu_\varepsilon(x) = \int p_\varepsilon(x, y)\, \nu(dy).$$

The limit in quadratic mean exists for every $\nu \in M$. Moreover, there is a continuous measure $A_\nu(dt)$ on the real line, depending on the path, concentrated on the life interval $(0, \zeta)$ and such that

$$A_\nu(I) = \int_I \frac{d\nu}{dm}(x_t)\, dt \qquad \text{a.s.}$$
for every interval $I$. We call $A_\nu$ the additive functional of $X$ corresponding to $\nu \in M$. We can imagine a clock attached to the moving particle; the speed of the clock depends on the position of the particle. If $\nu$ is concentrated on $B$ and if $x_t$ is not in $B$ for all $t \in I$, then $A_\nu(I) = 0$. In particular, $A_\nu$ is the local time at the point $c$ if $\nu$ is concentrated at $c$ (which is possible if $g(c, c) < \infty$).

To prove Theorem 1, we need to establish that

$$E\varphi_\mu \varphi_\nu = E\varphi_{\mu_B} \varphi_\nu \qquad \text{for all } \nu \in M(B). \qquad (3)$$

By (2) this is equivalent to the equation

$$P_\mu A_\nu(0, \infty) = P_{\mu_B} A_\nu(0, \infty).$$

Since $x_t \notin B$ for $t \in (0, \tau)$, we have $A_\nu(0, \infty) = A_\nu(\tau, \infty)$, and (3) is equivalent to

$$P_\mu A_\nu(\tau, \infty) = P_{\mu_B} A_\nu(0, \infty)$$

which follows immediately from the fact that the path $y_t = x_{\tau + t}$ has, with respect to $P_\mu$, the same probability distribution as the path $x_t$ with respect to $P_{\mu_B}$ (the strong Markov property of the process $X$). After (3) is established, the proof of Theorems 1 and 2 can be completed without any difficulty (see [1] for detail).
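Formula (2) can also be tested by simulation in the setting of Example 2: with $\mu = \delta_x$, $\nu = \delta_y$ and $m$ the counting measure, it states that the expected total time the killed chain started at $x$ spends in the state $y$ equals $g(x, y)$. The sketch below (generator, states and sample size are arbitrary choices) estimates this occupation time by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical symmetric generator with killing, as in Example 2.
A = np.array([[-1.5,  1.0,  0.0],
              [ 1.0, -2.5,  1.0],
              [ 0.0,  1.0, -1.5]])
G = -np.linalg.inv(A)
x, y = 0, 1                                   # start the chain at x, accumulate time spent in y

def occupation_time(start):
    """One run of the killed chain; returns the total time it spends in state y."""
    state, total = start, 0.0
    while True:
        rate = -A[state, state]               # holding time in `state` is Exp(rate)
        hold = rng.exponential(1.0 / rate)
        if state == y:
            total += hold
        jump = A[state].copy()
        jump[state] = 0.0
        if rng.random() > jump.sum() / rate:  # with the remaining probability the particle dies
            return total
        state = rng.choice(len(A), p=jump / jump.sum())

n = 100000
estimate = np.mean([occupation_time(x) for _ in range(n)])
print(estimate, G[x, y])                      # agreement within Monte Carlo error
```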
5. Dirichlet spaces

A Hilbert space $D$ called the Dirichlet space is associated with every symmetric Markov process [2]. For the Brownian motion this is the Sobolev space consisting of square integrable functions with square integrable first order partial derivatives. Generally, elements of $D$ are classes of $m$-equivalent functions. Under the regularity conditions on $X$ mentioned in section 3, it is possible to consider only functions which are right continuous along almost all paths. Two such functions are $m$-equivalent if and only if they coincide quasi-everywhere (that is, coincide along almost all paths).

The inner product $(\mu, \nu)$ in $M$ introduced in section 3 can be naturally continued to the set $\tilde M$ of all signed measures $\mu - \nu$, $\mu, \nu \in M$. Let $N$ stand for the completion of $\tilde M$ with respect to $(\mu, \mu)$. For every $\mu \in M$, Green's potential

$$f_\mu(x) = \int g(x, y)\, \mu(dy)$$

belongs to $D$ and the inner product $(f_\mu, f_\nu)$ in $D$ is equal to $(\mu, \nu)$. Moreover $(f_\mu, h) = \mu(h)$ for all $h \in D$. This makes it natural to consider $N$ as the dual space to $D$. The mapping $\mu \to f_\mu$ can be extended to an isometry of $N$ onto $D$, and both spaces are isometric to the Hilbert space $H$ linearly generated by $\varphi_\mu$, $\mu \in M$. It turns out that the orthogonal projection of the random field $\Phi$ onto an arbitrary finite-dimensional subspace of $H$ is a random field with sample functions in $D$. The situation is described more precisely by the following proposition.

Theorem 3. Let $\hat\varphi_\mu$ be the orthogonal projection of $\varphi_\mu$ onto a finite-dimensional subspace $\hat H$ of $H$. Then there exists a function $\hat\varphi_x(\omega)$ measurable in $x$, $\omega$ and such that: (a) for every $\omega$, $\hat\varphi_\cdot(\omega)$ is a function of $D$; (b) for every $\mu \in M$, $\hat\varphi_\mu(\omega) = \int \hat\varphi_x(\omega)\, \mu(dx)$. We call $\hat\varphi(\omega)$ the orthogonal projection of $\Phi$ onto $\hat H$.

Proof. It is sufficient to prove the theorem for a one-dimensional $\hat H$. Let $Y \in \hat H$, $E(Y^2) = 1$. There exists a sequence $\nu_n \in M$ such that $\varphi_{\nu_n} \to Y$ in $H$. Hence $\nu_n$ is a Cauchy sequence in $M$, $f_{\nu_n}$ is a Cauchy sequence in $D$ and there exists a limit $f$ of $f_{\nu_n}$ in $D$. We have $E\varphi_\mu \varphi_{\nu_n} = (\mu, \nu_n) = (f_\mu, f_{\nu_n})$. Passing to the limit, we get $E\varphi_\mu Y = (f_\mu, f) = \mu(f)$. Thus $\hat\varphi_\mu = Y E(\varphi_\mu Y) = \mu(f)\, Y$, and the function $\hat\varphi_x(\omega) = f(x)\, Y(\omega)$ has properties (a), (b).
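These relations are easy to verify in the finite setting of Example 2. The sketch below assumes (an assumption of the sketch, not a statement of the paper) that the inner product of $D$ is the energy form $(f, h)_D = -f^{\mathsf T} A h$ of the killed chain, and checks that $(f_\mu, f_\nu)_D = (\mu, \nu)$ and $(f_\mu, h)_D = \mu(h)$ for the Green potentials $f_\mu = G\mu$.

```python
import numpy as np

# Finite-state sketch of the relations above.  Assumption of the sketch: the inner
# product of D is the energy form (f, h)_D = -f^T A h of the killed chain of Example 2.
A = np.array([[-1.5,  1.0,  0.0],
              [ 1.0, -2.5,  1.0],
              [ 0.0,  1.0, -1.5]])
G = -np.linalg.inv(A)

def inner_D(f, h):
    return -f @ A @ h

rng = np.random.default_rng(2)
mu, nu = rng.random(3), rng.random(3)      # two (discrete) measures on E
f_mu, f_nu = G @ mu, G @ nu                # their Green's potentials

print(inner_D(f_mu, f_nu), mu @ G @ nu)    # (f_mu, f_nu)_D = (mu, nu)

h = rng.standard_normal(3)                 # an arbitrary element of D
print(inner_D(f_mu, h), mu @ h)            # (f_mu, h)_D = mu(h)
```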
6. Interactions

The Hilbert space $H$ introduced in section 5 is a subspace of a larger space $L$ which consists of all square integrable functionals of $\Phi$. (We say that a function $Y(\omega)$ is a functional of $\Phi$ if $Y$ is measurable with respect to the $\sigma$-algebra $\mathcal F$ generated by the field.) A theorem due to Kakutani, Ito and Segal (for a proof see, e.g., [1], Appendix) establishes an isometry between $L$ and the Fock space $\Gamma(H)$ which is defined as follows. We consider symmetric real-valued functions $F(\omega^1, \dots, \omega^k)$ of $k$ variables $\omega^i \in \Omega$ measurable with respect to $\mathcal F \times \dots \times \mathcal F$ and we put

$$(F_1, F_2) = k! \int \dots \int F_1(\omega^1, \dots, \omega^k)\, F_2(\omega^1, \dots, \omega^k)\, P(d\omega^1) \dots P(d\omega^k).$$

Functions $F$ subject to the condition $(F, F) < \infty$ form a Hilbert space $H^{(k)}$. The Fock space $\Gamma(H)$ is the orthogonal sum of $H^{(k)}$ for $k = 0, 1, 2, \dots$

In statistical mechanics, an interaction potential is defined in terms of configurations. In our case, the
role of configurations is played by functions of the Dirichlet space $D$. We define an interaction potential of order $k$ as a symmetric $k$-linear form $A(f_1, \dots, f_k)$ on $D$. The space of all such forms can be naturally identified with the $k$th symmetric tensor power $N^{(k)}$ of the space $N$ dual to $D$.

Theorem 4. Let $\hat\varphi_x(\omega)$ be the orthogonal projection of $\Phi$ onto a finite-dimensional subspace $\hat H$ of $H$. The functions

$$\hat F_A(\omega^1, \dots, \omega^k) = A(\hat\varphi_\cdot(\omega^1), \dots, \hat\varphi_\cdot(\omega^k)) \qquad (4)$$

belong to $H^{(k)}$ for all $A$ of $N^{(k)}$. There exists an isometry $A \to F_A$ of $N^{(k)}$ into $H^{(k)}$ with the following property: if $\hat H_n$, $n = 1, 2, \dots$, is an increasing sequence of finite-dimensional subspaces of $H$ with the union everywhere dense in $H$, then, for every $A \in N^{(k)}$, the functions (4) corresponding to $\hat H_n$ converge to $F_A$ in $H^{(k)}$. The mapping $A \to F_A$ is an isometry of $N^{(k)}$ onto $H^{(k)}$ if the following condition is satisfied:

$$\int_\delta^\infty p_t(x, x)\, dt < \infty \qquad \text{for all } \delta > 0 \text{ and all } x. \qquad (5)$$
Proof. The space $N^{(k)}$ is linearly generated by the elements of the form

$$A(f_1, \dots, f_k) = \frac{1}{k!} \sum \mu_{i_1}(f_1) \dots \mu_{i_k}(f_k), \qquad \mu_1, \dots, \mu_k \in M, \qquad (6)$$

where $\sum$ means the sum over all permutations $(i_1, \dots, i_k)$ of $(1, \dots, k)$. Put

$$F_A(\omega^1, \dots, \omega^k) = \frac{1}{k!} \sum \varphi_{\mu_{i_1}}(\omega^1) \dots \varphi_{\mu_{i_k}}(\omega^k).$$

It is easy to check that, for two elements $A$, $A'$ of the form (6),

$$(F_A, F_{A'}) = (A, A')$$

where

$$(A, A') = \sum (\mu_1, \mu'_{i_1}) \dots (\mu_k, \mu'_{i_k})$$

is the inner product in $N^{(k)}$. Hence the mapping $A \to F_A$ can be continued to an isometry of $N^{(k)}$ into $H^{(k)}$. By Lemma 0.1 of [1], the image coincides with $H^{(k)}$ under the condition (5). Using the isometry between $H$ and $N$ established in section 5, we define a subspace $\hat N$ of $N$ corresponding to $\hat H$. Let $\hat H^{(k)}$ be the image of $\hat N^{(k)}$ under the mapping $A \to F_A$. We claim that $\hat F_A$ defined by (4) is the orthogonal projection of $F_A$ onto $\hat H^{(k)}$. For $A$ of the form (6), this follows from the expression

$$\hat F_A(\omega^1, \dots, \omega^k) = \frac{1}{k!} \sum \hat\varphi_{\mu_{i_1}}(\omega^1) \dots \hat\varphi_{\mu_{i_k}}(\omega^k).$$

To prove our statement for an arbitrary $A \in N^{(k)}$, it is sufficient to approximate $A$ by linear combinations of elements (6). The proof of Theorem 4 can be completed in an obvious way.
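The isometry $(F_A, F_{A'}) = (A, A')$ can be tested numerically for $k = 2$ in the finite setting of Example 2. The sketch below uses the normalizations of (6) and of the inner product in $H^{(2)}$ exactly as written above: it estimates $(F_A, F_{A'}) = 2!\, E F_A(\omega^1, \omega^2) F_{A'}(\omega^1, \omega^2)$ over independent copies of the field and compares the result with the pairing sum $(A, A')$.

```python
import numpy as np

rng = np.random.default_rng(3)

# Field of Example 2: covariance G = -A^{-1} (the generator is called Agen here to keep
# the letter A free for the bilinear form).
Agen = np.array([[-1.5,  1.0,  0.0],
                 [ 1.0, -2.5,  1.0],
                 [ 0.0,  1.0, -1.5]])
G = -np.linalg.inv(Agen)
L = np.linalg.cholesky(G)
pair = lambda a, b: a @ G @ b                  # the inner product (a, b) of two measures

mu1, mu2 = rng.random(3), rng.random(3)        # A  is built from the pair (mu1, mu2)
nu1, nu2 = rng.random(3), rng.random(3)        # A' is built from the pair (nu1, nu2)

# Exact pairing sum: (A, A') = (mu1, nu1)(mu2, nu2) + (mu1, nu2)(mu2, nu1).
exact = pair(mu1, nu1) * pair(mu2, nu2) + pair(mu1, nu2) * pair(mu2, nu1)

# Monte Carlo: F_A(w1, w2) = (1/2!)[phi_mu1(w1) phi_mu2(w2) + phi_mu2(w1) phi_mu1(w2)]
# and (F_A, F_A') = 2! E[F_A F_A'] with w1, w2 independent realizations of the field.
n = 500000
w1 = rng.standard_normal((n, 3)) @ L.T
w2 = rng.standard_normal((n, 3)) @ L.T
F_A  = 0.5 * ((w1 @ mu1) * (w2 @ mu2) + (w1 @ mu2) * (w2 @ mu1))
F_Ap = 0.5 * ((w1 @ nu1) * (w2 @ nu2) + (w1 @ nu2) * (w2 @ nu1))
print(2.0 * np.mean(F_A * F_Ap), exact)        # agreement within Monte Carlo error
```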
We denote by $\Gamma(N)$ the orthogonal sum of $N^{(k)}$, $k = 0, 1, 2, \dots$, and we continue the mapping $A \to F_A$ to an isometry of $\Gamma(N)$ into $\Gamma(H)$. Let $\psi_A(\omega)$ stand for the square integrable functional of the field $\Phi$ corresponding to $F_A$ by the Kakutani-Ito-Segal theorem; in Euclidean field theory the normalization $Z_A = E\, e^{-\psi_A}$ plays the role of a partition function. Denoting by $N_B$ the subspace of $N$ generated by the measures of $M(B)$ (and similarly $N_C$), we consider representations

$$A = A_B + A_C, \qquad A_B \in \Gamma(N_B), \quad A_C \in \Gamma(N_C). \qquad (7)$$
We say that $A$ determines a local interaction if the representation (7) is possible for every pair $B$, $C$ covering $E$ and subject to the condition (1).

An important class of local interactions corresponds to measures on $E^k$. For every such measure $\mu$ we put

$$(\mu, \mu) = \iint_{E^k \times E^k} \mu(dx)\, g(x, y)\, \mu(dy)$$

where

$$g(x, y) = g(x^1, y^1) \dots g(x^k, y^k) \qquad \text{for } x = (x^1, \dots, x^k),\ y = (y^1, \dots, y^k).$$

If $(\mu, \mu) < \infty$, then

$$A_\mu(f_1, \dots, f_k) = \int_{E^k} f_1(x^1) \dots f_k(x^k)\, \mu(dx)$$

defines a symmetric $k$-linear form on $D$, i.e., an element of $N^{(k)}$. The corresponding interaction is local if $\mu$ is concentrated on the diagonal $\{x: x^1 = \dots = x^k\}$ and can therefore be identified with a measure on $E$. For such measures the condition $(\mu, \mu) < \infty$ takes the form

$$\iint \mu(dx)\, g(x, y)^k\, \mu(dy) < \infty. \qquad (8)$$

Denote by $M_k$ the set of measures on $E$ subject to condition (8). The random field $\{\psi_{A_\mu},\ \mu \in M_k\}$ plays an important role in quantum field theory. If $g(x, x)$ is finite, we can write

$$\psi_{A_\mu} = \int_E :\varphi_x^k:\ \mu(dx).$$
Generally, this formula has only a symbolic meaning. The generalized field $\{\psi_{A_\mu},\ \mu \in M_k\}$ is known as the $k$th Wick power of the field $\Phi$ and is usually denoted $:\varphi^k:$.
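When $g(x, x) < \infty$ the Wick powers can be written down explicitly through Hermite polynomials; in particular $:\varphi_x^2: = \varphi_x^2 - g(x, x)$ and $E\, :\varphi_x^2:\, :\varphi_y^2: = 2\, g(x, y)^2$, the covariance that underlies condition (8) for $k = 2$. The sketch below (the finite-state field of Example 2 is used purely as a test case) verifies this covariance identity by simulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# The finite-state field of Example 2 as a test case: covariance G = -A^{-1}.
A = np.array([[-1.5,  1.0,  0.0],
              [ 1.0, -2.5,  1.0],
              [ 0.0,  1.0, -1.5]])
G = -np.linalg.inv(A)
L = np.linalg.cholesky(G)

n = 1000000
phi = rng.standard_normal((n, 3)) @ L.T        # n independent realizations of (phi_x)
wick2 = phi**2 - np.diag(G)                    # :phi_x^2: = phi_x^2 - g(x, x)

emp = wick2.T @ wick2 / n                      # empirical matrix of E[:phi_x^2: :phi_y^2:]
print(np.max(np.abs(emp - 2.0 * G**2)))        # small: matches 2 g(x, y)^2 up to Monte Carlo error
```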
Appendix

Theorem 2 contains only the "if" part of the statement C in section 1 (which is the most important for applications). The "only if" part demands some additional assumptions and we prove it here.

Let $\tau^s_B = \inf\{t: t > s,\ x_t \in B\}$, $\tau_B = \tau^0_B$. A set $B$ is f-closed if $P_x\{\tau_B > 0\} = 1$ for all $x \notin B$.

Theorem 2a. The Markov property of the field $\Phi$ on sets $B$, $C$ implies condition (1) if the following additional assumptions are satisfied:
A. Process $X$ is transient, i.e. $\int_E g(x, y) f(y)\, m(dy) < \infty$ for all $x$ and some strictly positive $f$.
B. The sets $B$ and $C$ are f-closed.
C. $B \cup C = E$.

Proof. Put

$$h(x) = P_x\{x_0 \in C,\ x_t \notin B \cap C \text{ for } 0 \le t \le s \text{ and } x_s \in B \text{ for some } s < \zeta\}.$$
By condition A, there exists a function $l > 0$ such that the measure $\mu_l(dx) = l(x)\, m(dx)$ belongs to $M$. Let $\nu$ be the restriction of $\mu_l$ to $C$. We have

$$\nu_B(E \setminus C) = P_\nu\{x_\tau \in E \setminus C\} \ge \nu(h).$$

If $\Phi$ has the Markov property on $B$, $C$, then the orthogonal projection $\hat\varphi_\nu$ of $\varphi_\nu$ onto $H(B)$ (the subspace of $H$ generated by $\varphi_\mu$, $\mu \in M(B)$) belongs to $H(B \cap C)$. Hence

$$E\, \hat\varphi_\nu\, (\varphi_\nu - \varphi_{\nu_{B \cap C}}) = 0.$$

By formula (2) of section 4, this is equivalent to the equation

$$P_{\nu_B} \int_0^{\tau_{B \cap C}} l(x_t)\, 1_C(x_t)\, dt = 0.$$
Hence $\tau_{B \cap C} = 0$ a.s. $P_{\nu_B}$ and, by condition B, $\nu_B$ is concentrated on $B \cap C$. Therefore

$$0 = \nu_B(E \setminus C) \ge \nu(h) = m(l h)$$

and $h = 0$ $m$-a.e. Now, by the Markov property of $X$,

$$0 = \int p_s(x, y)\, h(y)\, m(dy) \ge P_x\{x_s \in C,\ x_t \notin B \cap C \text{ for } 0 \le t \le u \text{ and } x_u \in B \text{ for some } u \in (s, \zeta)\}$$

and, by Fatou's lemma (letting $s \downarrow 0$), $h(x) = 0$ for all $x$. We have proved the second equation (1). The first equation follows from the second one by the transposition of $B$ and $C$.

Remark. All conditions A, B, C are substantial. In the following three examples condition (1) is not satisfied despite the fact that $\Phi$ has the Markov property on $B$, $C$ because $H(C)$ is the null space.
(i) $X$ is a symmetric Markov process on the space of two points $E = \{b, c\}$, $B = \{b\}$, $C = \{c\}$. Assumption A is violated.
(ii) $X$ is the three-dimensional Brownian motion, $C$ consists of one point, $B = E \setminus C$. Assumption B is violated.
(iii) $X$ and $C$ are as in the second example. $B$ is a closed set which is disjoint from $C$. Assumption C is not satisfied.
References
[1] E.B. Dynkin, Markov processes and random fields, Bull. Amer. Math. Soc. 3 (1980) 975-999.
[2] M. Fukushima, Dirichlet forms and Markov processes (North-Holland/Kodansha, 1980).