A limit theorem for the size of the nth generation of an age-dependent branching process


JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS 15, 273-279 (1966)

ANDERS MARTIN-LÖF

The Institute for Optimization and Systems Theory, The Royal Institute of Technology, Stockholm, Sweden

Submitted by Richard Bellman

DEFINITION OF THE PROCESS

The process starts with one particle at the zeroth stage at time zero. After a stochastic time $T$ this particle reaches the first stage, where it immediately produces a number of new particles, which in their turn pass to the second stage and produce new particles, and so on. The probability that a particle produces $k$ new ones is $p_k$, and the probability that $T < t$ is $F(t)$. Each new particle has the same probabilities $p_k$ and $F(t)$, and the different particles behave independently once they are born. The above probabilities are supposed to be independent of time, of the age of the particles, and of the number of particles present. This is the definition of an age-dependent branching process as given, e.g., by Harris [1], where many of its properties are derived. The quantity of interest in this paper is the number of particles having arrived at the $n$th stage at time $t$; it is called $G_n(t)$. The following functions are introduced:

$$P(s) = \sum_{k} p_k s^k,$$

the generating function of $p_k$. It is supposed that $P'(1) = m$ and $P''(1) = d$ exist, i.e., that the first two moments of $p_k$ are finite, and that the mean number of descendants $m > 1$. The variance of this number is then equal to $d + m - m^2$. The moment generating function of $G_n(t)$, $E(\exp(-sG_n(t)))$, defined for $\mathrm{Re}\, s \ge 0$, is called $Q_n(s, t)$. The probability of $G_n(t) = k$ is called $q_n(k, t)$. For $t \ge 0$ we then have

$$q_0(1, t) = 1, \qquad q_1(0, t) = 1 - F(t), \qquad q_1(1, t) = F(t).$$

To obtain a convenient notation we define $q_n(0, t) = 1$ for $t < 0$.


We thus have, for $t \ge 0$,

$$Q_1(s, t) = F(t)\, e^{-s} + 1 - F(t), \qquad Q_0(s, t) = e^{-s},$$

and for $t < 0$, $Q_n(s, t) = 1$. The mean and the second moment of $G_n(t)$ are given by

$$E(G_n(t)) = -\left.\frac{\partial Q_n}{\partial s}\right|_{s=0} = m_n(t) \qquad \text{and} \qquad E(G_n^2(t)) = \left.\frac{\partial^2 Q_n}{\partial s^2}\right|_{s=0} = d_n(t).$$
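To make these definitions concrete, the short sketch below (an addition, not part of the paper) fixes one particular offspring law and lifetime distribution and evaluates $P$, $m$, $d$, the variance $d + m - m^2$, and the transforms $Q_0$ and $Q_1$; the geometric $p_k$ and exponential $F$ used here are assumptions made only for the illustration.

```python
# Illustrative sketch (not from the paper): a concrete offspring law p_k and lifetime
# distribution F(t), with the quantities P, m, d and the transforms Q_0, Q_1 defined above.
import math

a = 0.4                                                # assumption: p_k = a(1-a)^k, k = 0, 1, 2, ...
p = lambda k: a * (1 - a) ** k
P = lambda s: sum(p(k) * s ** k for k in range(400))   # generating function P(s) (truncated series)

m = sum(k * p(k) for k in range(400))                  # m = P'(1), mean number of descendants
d = sum(k * (k - 1) * p(k) for k in range(400))        # d = P''(1)
print("m =", m, " d =", d, " variance =", d + m - m ** 2)   # here m = 1.5 > 1, as required

lam = 1.0                                              # assumption: F(t) = 1 - exp(-lam t)
F = lambda t: 1 - math.exp(-lam * t) if t >= 0 else 0.0

# Q_0(s,t) = exp(-s) and Q_1(s,t) = F(t) exp(-s) + 1 - F(t) for t >= 0; Q_n(s,t) = 1 for t < 0.
Q0 = lambda s, t: math.exp(-s) if t >= 0 else 1.0
Q1 = lambda s, t: F(t) * math.exp(-s) + 1 - F(t) if t >= 0 else 1.0
print("Q_1(0.5, 2.0) =", Q1(0.5, 2.0))
```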

RECURRENCE FORMULA FOR $Q_n(s, t)$

To obtain a formula for $Q_n(s, t)$ we consider the value of $G_n(t)$, given that the first particle arrives at the first stage at time $u$ and creates $k$ new particles there. In this case $G_n(t)$ is the sum of $k$ independent cascades, all distributed as $G_{n-1}(t - u)$, and thus its moment generating function is $Q_{n-1}^k(s, t - u)$. Multiplying with $p_k$ and summing, we obtain the moment generating function of $G_n(t)$ given $u$: $P(Q_{n-1}(s, t - u))$. Integrating with respect to $dF$, we finally obtain

$$Q_n(s, t) = \int_0^t P(Q_{n-1}(s, t - u)) \, dF(u) + 1 - F(t), \qquad n \ge 1,$$

or, shorter,

$$Q_n(s, t) = \int_0^\infty P(Q_{n-1}(s, t - u)) \, dF(u).$$

This equation successively determines $Q_n(s, t)$ when $Q_1(s, t)$ is given. Taking the derivatives on both sides formally, we obtain

$$\frac{\partial Q_n}{\partial s} = \int_0^\infty P'(Q_{n-1})\, \frac{\partial Q_{n-1}}{\partial s} \, dF, \qquad
\frac{\partial^2 Q_n}{\partial s^2} = \int_0^\infty \left( P'(Q_{n-1})\, \frac{\partial^2 Q_{n-1}}{\partial s^2} + P''(Q_{n-1}) \left( \frac{\partial Q_{n-1}}{\partial s} \right)^{\!2} \right) dF.$$

If the integrands on the right-hand side are bounded, we see by the bounded convergence theorem that $\partial Q_n/\partial s$ and $\partial^2 Q_n/\partial s^2$ exist and are uniformly bounded for all $t$. But $Q_1$ has bounded derivatives. Thus by induction we see that the first two derivatives of $Q_n$ exist and are uniformly bounded for all $t$. The first two moments of $G_n(t)$ are therefore given by the formulas

$$m_n(t) = m \int_0^t m_{n-1}(t - u) \, dF(u), \qquad m_1(t) = F(t),$$

$$d_n(t) = \int_0^t \left( m\, d_{n-1}(t - u) + d\, m_{n-1}^2(t - u) \right) dF(u), \qquad d_1(t) = F(t).$$

The first formula gives $m_n(t) = m^{n-1} F^{n*}(t)$, where $F^{n*}$ denotes the $n$-fold convolution of $F$.
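This explicit formula is easy to check by direct simulation of the process. The sketch below (an addition, not part of the paper) uses a geometric offspring law and an exponential $F$, for which $F^{n*}$ is the Erlang distribution function with $n$ stages; all numerical choices are assumptions made for the illustration only.

```python
# Simulation check (illustrative) of m_n(t) = m^{n-1} F^{n*}(t); parameters are assumptions.
import math
import numpy as np

rng = np.random.default_rng(0)
a, lam = 0.4, 1.0                          # geometric offspring p_k = a(1-a)^k, exponential F
m = (1 - a) / a                            # mean number of descendants, here 1.5 > 1

def G_n(n, t):
    """One realization of G_n(t): the number of particles arrived at stage n by time t."""
    departing = [0.0]                      # the initial particle leaves stage 0 at time 0
    for stage in range(1, n + 1):
        arrivals = [u + rng.exponential(1 / lam) for u in departing]
        if stage == n:
            return sum(v <= t for v in arrivals)
        departing = []
        for v in arrivals:
            if v <= t:                     # particles arriving after t cannot contribute to G_n(t)
                k = rng.geometric(a) - 1   # number of new particles produced on arrival
                departing.extend([v] * k)
    return 0

def erlang_cdf(n, t):
    """n-fold convolution F^{n*}(t) of the exponential distribution."""
    return 1 - math.exp(-lam * t) * sum((lam * t) ** k / math.factorial(k) for k in range(n))

n, t, reps = 4, 4.0, 20000
print("simulated E(G_n(t)):", np.mean([G_n(n, t) for _ in range(reps)]))
print("m^(n-1) F^{n*}(t)  :", m ** (n - 1) * erlang_cdf(n, t))
```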


THE COVARIANCE FUNCTION

For the further study of $G_n(t)$ it is of interest to know the covariance function

$$C_{ij}(s, t) = E(G_i(s)\, G_j(t)).$$

Suppose that $j \ge i$. Given that $n$ particles are created at the first stage at time $u$, we have

$$G_i(s) = \sum_{k=1}^{n} G_{i-1}^{(k)}(s - u), \qquad G_j(t) = \sum_{l=1}^{n} G_{j-1}^{(l)}(t - u),$$

where the terms in the sums are independent and all have the same distribution as $G_{i-1}(s - u)$ and $G_{j-1}(t - u)$, respectively. They are the contributions from the $n$ different cascades started at the first stage. Thus we have

$$G_i(s)\, G_j(t) = \sum_{k=1}^{n} G_{i-1}^{(k)}(s - u)\, G_{j-1}^{(k)}(t - u) + \sum_{k \ne l} G_{i-1}^{(k)}(s - u)\, G_{j-1}^{(l)}(t - u).$$

For the expectations we obtain, given $n$ and $u$,

$$E(G_i(s)\, G_j(t)) = n\, E(G_{i-1}(s - u)\, G_{j-1}(t - u)) + (n^2 - n)\, E(G_{i-1}(s - u))\, E(G_{j-1}(t - u)).$$

Given $u$, we have

$$E(G_i(s)\, G_j(t)) = m\, C_{i-1, j-1}(s - u, t - u) + d\, m_{i-1}(s - u)\, m_{j-1}(t - u).$$

And, finally,

$$C_{ij}(s, t) = m \int_0^s C_{i-1, j-1}(s - u, t - u) \, dF(u) + d\, m^{i+j-4} \int_0^s F^{(i-1)*}(s - u)\, F^{(j-1)*}(t - u) \, dF(u),$$

$$C_{ij}(s, t) = 0 \quad \text{if } s < 0 \text{ or } t < 0, \qquad C_{0j}(s, t) = \varepsilon(s)\, m_j(t),$$

where $\varepsilon(t) = 1$ if $t \ge 0$, and $\varepsilon(t) = 0$ if $t < 0$. This recurrence formula can be written as a two-dimensional convolution in the following way:

$$C_{ij}(s, t) = m \int_0^s\!\!\int_0^t C_{i-1, j-1}(s - u, t - v) \, dF(u)\, d\varepsilon(v - u) + d\, m^{i+j-4} \int_0^s\!\!\int_0^t F^{(i-1)*}(s - u)\, F^{(j-1)*}(t - v) \, dF(u)\, d\varepsilon(v - u).$$


Now $C_{ij}(s, t)$ is a nondecreasing function of $s$ and $t$, because $G_i(t)$ is nondecreasing and nonnegative. It is also bounded, since $C_{ij}^2(s, t) \le d_i(s)\, d_j(t) \le \mathrm{const}$ by Schwarz's inequality. Therefore we can introduce the moment generating functions

$$\hat{C}_{ij}(p, q) = \int_0^\infty\!\!\int_0^\infty e^{-(ps + qt)} \, dC_{ij}(s, t) \qquad \text{and} \qquad \hat{F}(p) = \int_0^\infty e^{-ps} \, dF(s),$$

and write the recurrence formula for these:

$$\hat{C}_{ij}(p, q) = m\, \hat{C}_{i-1, j-1}(p, q)\, \hat{F}(p + q) + d\, m^{i+j-4}\, \hat{F}^{i-1}(p)\, \hat{F}^{j-1}(q)\, \hat{F}(p + q), \qquad \hat{C}_{0j}(p, q) = m^{j-1}\, \hat{F}^{j}(q).$$

It can be verified by induction that the solution is given by

$$\hat{C}_{ij}(p, q) = m^{j-1}\, \hat{F}^{i}(p + q)\, \hat{F}^{j-i}(q) + d\, m^{j-3}\, \hat{F}(p + q)\, \frac{\hat{F}^{i}(p + q)\, \hat{F}^{j-i}(q) - m^{i}\, \hat{F}^{i}(p)\, \hat{F}^{j}(q)}{\hat{F}(p + q) - m\, \hat{F}(p)\, \hat{F}(q)}.$$
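The induction can also be checked numerically. The sketch below (an addition, not part of the paper) evaluates the recurrence and the solution as reconstructed above for an exponential $F$, for which $\hat{F}(p) = \lambda/(\lambda + p)$; the values of $\lambda$, $m$ and $d$ are assumptions made only for the test.

```python
# Numerical check (illustrative) that the stated closed form satisfies the recurrence
#   C^_{ij} = m C^_{i-1,j-1} F^(p+q) + d m^{i+j-4} F^{i-1}(p) F^{j-1}(q) F^(p+q),
#   C^_{0j} = m^{j-1} F^j(q),
# with an exponential F, so that F^(p) = lam / (lam + p).  Parameter values are assumptions.
lam, m, d = 1.0, 1.5, 4.5
Fh = lambda p: lam / (lam + p)

def C_recurrence(i, j, p, q):
    if i == 0:
        return m ** (j - 1) * Fh(q) ** j
    return (m * C_recurrence(i - 1, j - 1, p, q) * Fh(p + q)
            + d * m ** (i + j - 4) * Fh(p) ** (i - 1) * Fh(q) ** (j - 1) * Fh(p + q))

def C_closed(i, j, p, q):
    """Closed form given in the text, for j >= i."""
    return (m ** (j - 1) * Fh(p + q) ** i * Fh(q) ** (j - i)
            + d * m ** (j - 3) * Fh(p + q)
              * (Fh(p + q) ** i * Fh(q) ** (j - i) - m ** i * Fh(p) ** i * Fh(q) ** j)
              / (Fh(p + q) - m * Fh(p) * Fh(q)))

p, q = 0.3, 0.7
for i, j in [(1, 2), (2, 3), (3, 5), (4, 7)]:
    print((i, j), round(C_recurrence(i, j, p, q), 10), round(C_closed(i, j, p, q), 10))
```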

THE LIMIT FORM OF $G_n(t)$

Because $E(G_n(t)) = m^{n-1} F^{n*}(t)$, we see by the central limit theorem that the mean of the variable

$$X_n(u) = m^{-n+1}\, G_n(n\tau + u\sigma\sqrt{n})$$

converges to $\Phi(u)$, the normal distribution function, as $n \to \infty$. Here

$$\tau = \int_0^\infty t \, dF(t), \qquad \sigma^2 = \int_0^\infty (t - \tau)^2 \, dF(t)$$

are supposed finite and $\sigma > 0$. We shall see that $X_n(u)$ converges in mean when $n \to \infty$. The covariance function of $X_n(u)$ is given by

$$H_{ij}(u, v) = E(X_i(u)\, X_j(v)) = m^{-i-j+2}\, C_{ij}(s, t),$$


where $s = i\tau + u\sigma\sqrt{i}$ and $t = j\tau + v\sigma\sqrt{j}$. Therefore its moment generating function is given by

$$\hat{H}_{ij}(p, q) = m^{-i-j+2} \exp\!\left(\frac{p\tau\sqrt{i} + q\tau\sqrt{j}}{\sigma}\right) \hat{C}_{ij}\!\left(\frac{p}{\sigma\sqrt{i}},\, \frac{q}{\sigma\sqrt{j}}\right),$$

which, writing $p' = p/(\sigma\sqrt{i})$ and $q' = q/(\sigma\sqrt{j})$ and inserting the solution for $\hat{C}_{ij}$, becomes

$$\hat{H}_{ij}(p, q) = m^{1-i}\, e^{(p\tau\sqrt{i} + q\tau\sqrt{j})/\sigma}\, \hat{F}^{i}(p' + q')\, \hat{F}^{j-i}(q')
+ d\, m^{-i-1}\, e^{(p\tau\sqrt{i} + q\tau\sqrt{j})/\sigma}\, \hat{F}(p' + q')\, \frac{\hat{F}^{i}(p' + q')\, \hat{F}^{j-i}(q')}{\hat{F}(p' + q') - m\, \hat{F}(p')\, \hat{F}(q')}
- d\, m^{-1}\, e^{(p\tau\sqrt{i} + q\tau\sqrt{j})/\sigma}\, \hat{F}(p' + q')\, \frac{\hat{F}^{i}(p')\, \hat{F}^{j}(q')}{\hat{F}(p' + q') - m\, \hat{F}(p')\, \hat{F}(q')}.$$

Because $m > 1$, the first two terms converge to zero as $i, j \to \infty$. The last term, by the central limit theorem, converges to

$$\frac{d\, \hat{\Phi}(p)\, \hat{\Phi}(q)}{m(m - 1)}, \qquad \text{where } \hat{\Phi}(p) = \int_{-\infty}^{\infty} e^{-pu} \, d\Phi(u).$$

Hence we see by the continuity theorem for moment generating functions that $H_{ij}(u, v)$ converges to

$$\frac{d\, \Phi(u)\, \Phi(v)}{m(m - 1)} \qquad \text{as} \qquad i, j \to \infty.$$

From this fact it follows that

$$E\!\left(\left(\frac{X_i(u)}{\Phi(u)} - \frac{X_j(v)}{\Phi(v)}\right)^{\!2}\right) = \frac{H_{ii}(u, u)}{\Phi^2(u)} - \frac{2 H_{ij}(u, v)}{\Phi(u)\Phi(v)} + \frac{H_{jj}(v, v)}{\Phi^2(v)}$$

converges to zero as $i, j \to \infty$. Thus we see that $X_n(u)/\Phi(u)$ converges to some stochastic variable in mean square. This variable, $X$, is independent of $u$. To obtain a formula for its moment generating function, $g(p) = E(\exp(-pX))$, we first introduce that of $X_i(u)$, $g_i(p, u) = E(\exp(-p X_i(u)))$. We have

$$g_i(p, u) = E\!\left(\exp\!\left(-p\, m^{-i+1}\, G_i(i\tau + u\sigma\sqrt{i})\right)\right) = Q_i\!\left(p\, m^{-i+1},\, i\tau + u\sigma\sqrt{i}\right).$$


For the following expression we obtain, using the formula for $Q_i(s, t)$,

$$g_i\!\left(pm,\, \frac{u\sigma\sqrt{i-1} - \tau}{\sigma\sqrt{i}}\right) = Q_i\!\left(p\, m^{-i+2},\, u\sigma\sqrt{i-1} + (i-1)\tau\right) = \int_0^\infty P\!\left(Q_{i-1}\!\left(p\, m^{-i+2},\, u\sigma\sqrt{i-1} + (i-1)\tau - v\right)\right) dF(v),$$

and the integrand here equals $P(g_{i-1}(p,\, u - v/(\sigma\sqrt{i-1})))$. We shall see in the next paragraph that $g_i(p, u)$ converges uniformly in every finite $u$-interval towards $g(\Phi(u) p)$. Using this fact and the fact that $P$ in the integral is uniformly bounded, we get, letting $i \to \infty$ in the above relation,

$$g(mp) = P(g(p)).$$

In the ordinary Galton-Watson process, i.e., this process when the transit time is not stochastic, the limit of $m^{-i+1} G_i$, where $G_i$ is the number of particles arriving at the $i$th stage, exists, and its moment generating function is uniquely determined by the above relation together with the condition $g'(0) = -1$ [1]. Thus we know that $X$ has the same distribution as this limit, since $E(X) = -g'(0) = 1$. Summing up, we have thus obtained the following result:

THEOREM. $X_n(u) = m^{-n+1} G_n(n\tau + u\sigma\sqrt{n})$ converges in mean to the stochastic variable $\Phi(u) X$, where $\Phi(u)$ is the normal distribution function, and $X$ is a stochastic variable distributed as the limit of $m^{-n+1}$ times the $n$th generation of the ordinary Galton-Watson process. This result has been formulated as a conjecture in Harris [1].

Proof that $g_n(p, u)$ converges uniformly in every finite $u$-interval. We have (putting $\Phi(u) X = X(u)$)

$$|g_n(p, u) - g(\Phi(u) p)| = |E(\exp(-p X_n(u)) - \exp(-p X(u)))| \le E\,|\exp(-p X_n(u)) - \exp(-p X(u))| \le 2|p|\, E\,|X_n(u) - X(u)|,$$

because $|\exp(-px) - \exp(-py)| \le 2|p|\,|x - y|$ for $\mathrm{Re}\, p \ge 0$ and $x, y \ge 0$.


We consider an arbitrary point $u_0$, and take $u'' < u_0 < u'$. If $u'' \le u \le u'$, we have

$$|X_n(u) - X(u)| \le |X_n(u') - X(u'')| + |X_n(u'') - X(u')|,$$

because $X_n(u)$ and $X(u)$ are both nondecreasing, and so

$$|X_n(u) - X(u)| \le |X_n(u') - X(u')| + 2\,|X(u') - X(u'')| + |X_n(u'') - X(u'')|.$$

By Schwarz's inequality we have $E|w| \le \sqrt{E(w^2)}$, and thus

$$E\,|X_n(u) - X(u)| \le \sqrt{E(X_n(u') - X(u'))^2} + \sqrt{E(X_n(u'') - X(u''))^2} + 2\sqrt{E(X(u') - X(u''))^2}.$$

The last term can be made small by taking $u'$ and $u''$ sufficiently near to one another, as the covariance function of $X(u)$ is continuous. The first two are small if we take $n$ large enough. Thus we see that $|g_n(p, u) - g(\Phi(u) p)|$ is uniformly small in a neighborhood of $u_0$. Finally, by the Heine-Borel theorem, we see that it is uniformly small in every finite $u$-interval.
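As a closing numerical illustration (an addition, not part of the paper), the relation $g(mp) = P(g(p))$ that determines the law of $X$ can be approximated by iterating the Galton-Watson recursion $g_k(p) = P(g_{k-1}(p/m))$; the geometric offspring law used below is an assumption made only for the example.

```python
# Illustrative iteration of g_k(p) = P(g_{k-1}(p/m)), whose limit is the moment generating
# function g of X; checks g(mp) ~ P(g(p)) and g'(0) ~ -1.  Offspring law is an assumption.
import math

a = 0.4                                   # p_k = a(1-a)^k, so m = (1-a)/a = 1.5 > 1
m = (1 - a) / a
P = lambda s: a / (1 - (1 - a) * s)       # generating function of the geometric law

def g(k, p):
    """k-th iterate: the transform of m^{-k} Z_k for a Galton-Watson process with Z_0 = 1."""
    return math.exp(-p) if k == 0 else P(g(k - 1, p / m))

k = 60
for p in (0.5, 1.0, 2.0):
    print(f"p = {p}:  g(mp) = {g(k, m * p):.6f}   P(g(p)) = {P(g(k, p)):.6f}")

h = 1e-6
print("g'(0) ~", (g(k, h) - g(k, 0.0)) / h)   # close to -1, i.e. E(X) = 1
```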

1. T. E. HARRIS. "The Theory of Branching Processes." Springer-Verlag, Berlin, 1963.