Probability Generating Functions


13 Probability Generating Functions

13.1 INTRODUCTION

In the case of discrete random variables, which have positive probability only on the non-negative integers, we can construct a probability generating function.

Definition. If Pr(X = x) = f(x), x = 0, 1, 2, ..., then the probability generating function, G_X(t), is

G_X(t) = E(t^X) = Σ_{x=0}^∞ t^x f(x).    (13.1)

Example 1
f(x) = 1/6, x = 1, 2, ..., 6.

G_X(t) = Σ_{x=1}^6 t^x/6 = (t - t^7)/[6(1 - t)].

Example 2
f(x) = (n choose x) p^x q^{n-x}, x = 0, 1, ..., n, where q = 1 - p.

G_X(t) = Σ_{x=0}^n (n choose x) (pt)^x q^{n-x} = (q + pt)^n.
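For readers who like to check such results numerically, a short Python sketch along the following lines evaluates a p.g.f. directly from its probabilities and confirms G(1) = 1 for a proper distribution; the helper name pgf is hypothetical and the sketch is illustrative only.

    from math import comb

    def pgf(probs, t):
        # Evaluate G_X(t) = sum_x t**x * f(x) over a finite support 0, 1, ..., len(probs) - 1.
        return sum(t**x * p for x, p in enumerate(probs))

    # Example 2: binomial probabilities f(x) = C(n, x) p^x q^(n - x).
    n, p = 6, 0.3
    q = 1 - p
    binom = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

    t = 0.7
    print(pgf(binom, t), (q + p * t)**n)   # direct sum versus the closed form (q + pt)^n
    print(pgf(binom, 1.0))                 # a proper distribution gives G(1) = 1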


In the two examples above, the existence of the probability generating function (p.g.f.) was not in doubt, since only a finite number of terms entered the sum. However, G(t) always exists for |t| < 1. This is because for a discrete distribution f(x) ≤ 1 and hence

Σ_{x=0}^∞ |t^x f(x)| ≤ Σ_{x=0}^∞ |t|^x,

which is convergent for |t| < 1. In fact for a proper distribution,

Σ_{x=0}^∞ f(x) = 1,

and G(t) exists for |t| ≤ 1. Indeed, for some distributions, the radius of convergence will exceed 1. Subject to some restriction on the domain of the dummy variable t, we can say that the probability generating function always exists. This is more than could be claimed for moment generating functions.

Problem 1. If X has the Poisson distribution with parameter λ, show that G_X(t) = e^{λt-λ}.

Problem 2. If X has the geometric distribution with parameter p, show that G_X(t) = pt/(1 - qt), |t| < 1/q, where q = 1 - p.
13.2 EVALUATION OF MOMENTS

A probability generating function can be used to obtain any moments which exist. Thus, since a power series may be differentiated term by term within its radius of convergence,

G'(t) = d/dt [Σ_{x=0}^∞ t^x f(x)] = Σ_{x=1}^∞ x t^{x-1} f(x),

at least for |t| < 1. Now to say E(X) exists implies that Σ x f(x) is convergent, and so

E(X) = lim_{t→1⁻} G'(t) = G'(1).    (13.2)


Example 3
For the binomial distribution with G(t) = (q + pt)^n, we have

G'(t) = np(q + pt)^{n-1}  and  G'(1) = np.

Problem 3. Use the probability generating function to calculate E(X) for:
(a) the Poisson distribution with parameter λ,
(b) the geometric distribution with parameter p.

Example 4
For the binomial distribution, G(t) = (q + pt)^n, hence

G''(t) = n(n - 1)p²(q + pt)^{n-2}  and  G''(1) = n(n - 1)p².

We can readily find the variance, since

V(X) = E(X²) - E²(X) = E[X(X - 1)] + E(X) - E²(X)
     = n(n - 1)p² + np - (np)²
     = np(1 - p).
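As an illustrative numerical check of these moment formulae (a sketch only, not part of the development), one can differentiate the binomial p.g.f. term by term at t = 1:

    from math import comb

    n, p = 10, 0.4
    q = 1 - p
    f = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

    # Term-by-term differentiation of G(t) = sum t^x f(x), evaluated at t = 1.
    G1 = sum(x * fx for x, fx in enumerate(f))            # G'(1)  = E(X)
    G2 = sum(x * (x - 1) * fx for x, fx in enumerate(f))  # G''(1) = E[X(X - 1)]

    print(G1, n * p)                      # both equal np
    print(G2, n * (n - 1) * p**2)         # both equal n(n - 1)p^2
    print(G2 + G1 - G1**2, n * p * q)     # variance np(1 - p)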

13.3 SUMS OF INDEPENDENT RANDOM VARIABLES

If X₁, X₂ are independent random variables then M_{X₁+X₂}(t) = M_{X₁}(t)M_{X₂}(t), where M denotes the moment generating function. If further, X₁, X₂ are discrete and assume only non-negative integral values with positive probability, an equivalent formulation is

G_{X₁+X₂}(t) = G_{X₁}(t)G_{X₂}(t).    (13.3)

This result can be used to identify the distribution of sums of such independent random variables.

Example 5
X₁, X₂ have independent Poisson distributions with parameters λ₁, λ₂.

G_{X₁+X₂}(t) = G_{X₁}(t)G_{X₂}(t) = e^{λ₁t-λ₁} e^{λ₂t-λ₂} = e^{(λ₁+λ₂)t-(λ₁+λ₂)}.

But this is the probability generating function of another Poisson distribution with parameter λ₁ + λ₂.
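A brief numerical illustration of Example 5 (a sketch only): convolving two Poisson mass functions reproduces the Poisson(λ₁ + λ₂) mass function.

    from math import exp, factorial

    def poisson_pmf(lam, kmax):
        return [exp(-lam) * lam**k / factorial(k) for k in range(kmax + 1)]

    l1, l2, kmax = 1.5, 2.5, 20
    f1, f2 = poisson_pmf(l1, kmax), poisson_pmf(l2, kmax)

    # Pr(X1 + X2 = k) by direct convolution of the two mass functions.
    conv = [sum(f1[j] * f2[k - j] for j in range(k + 1)) for k in range(kmax + 1)]
    target = poisson_pmf(l1 + l2, kmax)
    print(max(abs(a - b) for a, b in zip(conv, target)))   # essentially zero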

Problem 4. X₁, X₂, ..., Xₙ are independent random variables each having the geometric distribution with parameter p. Find the probability distribution of Σ_{i=1}^n X_i.


Problem 5. If X₁ is binomial n₁, p, and X₂ is binomial n₂, p, show that if X₁, X₂ are independent then X₁ + X₂ has the binomial distribution with parameters n₁ + n₂ and p.

Example 6
A fair dice has its sides numbered one to six. If it is rolled twice then the probability of a total score of, say, five is easily found by enumeration. The favourable outcomes are (1, 4), (4, 1), (3, 2), (2, 3), and the required probability is 4/6² = 1/9. If, however, the dice is rolled thrice and we seek the probability of a total score of twelve, then the method of enumeration becomes more arduous and we might be grateful for the assistance of a probability generating function approach. For each roll,

G(t) = Σ_{x=1}^6 t^x/6 = t(1 - t^6)/[6(1 - t)].

Hence for the total of three rolls, the p.g.f. is

[Σ_{x=1}^6 t^x/6]³ = t³(1 - t^6)³(1 - t)^{-3}/6³.

The coefficient of t^{12} is easily found to be (55 - 30)/6³ = 25/216.
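The 25/216 can be confirmed by machine; the following Python sketch (illustrative only) does so both by direct enumeration and by multiplying out the generating polynomial.

    import itertools

    # Direct enumeration of three rolls of a fair die.
    count = sum(1 for roll in itertools.product(range(1, 7), repeat=3) if sum(roll) == 12)
    print(count, count / 6**3)            # 25 and 25/216

    # Coefficient of t^12 in [(t + t^2 + ... + t^6)/6]^3 via polynomial multiplication.
    coeffs = {0: 1}
    for _ in range(3):
        new = {}
        for total, c in coeffs.items():
            for face in range(1, 7):
                new[total + face] = new.get(total + face, 0) + c
        coeffs = new
    print(coeffs[12] / 6**3)              # 25/216 again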

Problem 6. If X has the binomial distribution with parameters n and p, find the limiting form of G_X(t) if n → ∞ and p → 0 in such a way that np remains constant and equal to λ.

Problem 7. The random variable R can take all non-negative integer values and its probability generating function is

G(t) = p₀ + p₁t + p₂t² + ...

(a) Show that E(R) is G'(1) and obtain a formula for V(R) in terms of values of the derivatives of G(t).
(b) Show that the probability that R is an odd integer is [1 - G(-1)]/2.
(c) An unbiased six-sided die is thrown repeatedly. Show that the probability generating function for the number of throws needed to obtain a six is t/(6 - 5t).

(d) Write down the probability generating function for the number of throws needed to obtain two sixes. Two players throw this die in turn. The player who throws the second six to appear is the winner. By using (b) and (c) or otherwise, find the probability that the game is won by the player who made the first throw.

Oxford & Cambridge, A.L., 1981.

Problem 8. The probability generating function for the random variable R is

G(t) = p₀ + p₁t + ...

Find G(1) and show that the mean and variance of R are G'(1) and G''(1) + G'(1) - [G'(1)]² respectively. If q_r = Pr(R > r) for all r and the random variable M is defined to be min(r₁, r₂, ..., r_k), where r₁, ..., r_k are k independent values of R, prove that

Pr(M = r) = (q_{r-1})^k - (q_r)^k.

If Pr(R = r) = αθ^r where α and θ are constants such that 0 < θ < 1, find α and q_r in terms of θ and show that

G(t) = (1 - θ)/(1 - θt).

Give as simple an expression as you can for the probability generating function of M and, where R is as given, find the mean value and the variance of M.

Oxford & Cambridge, A.L., 1979.

Problem 9. In Chapter 5, we obtained the following differential equations for the random stream of events:

p′_n(t) = λp_{n-1}(t) - λp_n(t),   n ≥ 0,   p_{-1}(t) ≡ 0,

where p_n(t) was the probability of just n events in (0, t]. If

G(θ, t) = Σ_{n=0}^∞ p_n(t)θ^n,

show that

∂G/∂t = λ(θ - 1)G.

Hence show that G = e^{λt(θ-1)}.
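Readers with access to a computer algebra system can verify the conclusion of Problem 9 symbolically; an illustrative sympy sketch (the parameter names are incidental) might read:

    import sympy as sp

    lam, t, theta = sp.symbols('lambda t theta', positive=True)
    G = sp.exp(lam * t * (theta - 1))

    # The proposed solution satisfies dG/dt = lambda*(theta - 1)*G.
    print(sp.simplify(sp.diff(G, t) - lam * (theta - 1) * G))   # 0

    # Its coefficient of theta^n is the Poisson probability e^(-lam t)(lam t)^n / n!.
    n = 3
    coeff = sp.series(G, theta, 0, n + 1).removeO().coeff(theta, n)
    print(sp.simplify(coeff - sp.exp(-lam * t) * (lam * t)**n / sp.factorial(n)))  # 0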


13.4 USE OF RECURRENCE RELATIONS

Calculating the probability of an event by enumerating all the possible sequences of outcomes in a series of trials is not always feasible. Sometimes it is possible to relate the probability of an event on a particular trial to the corresponding probabilities on previous trials. In that case a probability generating function may be of great service, though our first examples would scarcely warrant such an elaborate process.

Example 7
In a series of n independent trials, the probability of a success on each trial is p. It is required to find the probability of just r successes by relating this event to the outcome of the first trial. Let φ_n(r) be the required probability; then, since the first trial is a success with probability p or a failure with probability q (= 1 - p), we have, for n ≥ 2,

φ_n(r) = pφ_{n-1}(r - 1) + qφ_{n-1}(r),   1 ≤ r ≤ n.

If

G_n(t) = Σ_{r=0}^n φ_n(r)t^r,

multiplying by t^r and summing r from 1 to n, we have

G_n(t) - φ_n(0) = tp Σ_{r=1}^n φ_{n-1}(r - 1)t^{r-1} + q Σ_{r=1}^n φ_{n-1}(r)t^r
              = tp G_{n-1}(t) + q[G_{n-1}(t) - φ_{n-1}(0)].

But φ_n(0) = qφ_{n-1}(0), hence

G_n(t) = (pt + q)G_{n-1}(t),   n ≥ 2.

By repeated application, we arrive at

G_n(t) = (pt + q)^{n-1}G_1(t).

Evidently, G_1(t) = φ_1(0) + φ_1(1)t = q + pt and we conclude that

G_n(t) = (pt + q)^n.
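As an illustrative check (a sketch, not part of the argument), iterating the recurrence of Example 7 numerically reproduces the binomial probabilities given by (pt + q)^n:

    from math import comb

    p = 0.3
    q = 1 - p
    n_max = 8

    phi = [q, p]                                   # phi_1(0) = q, phi_1(1) = p
    for n in range(2, n_max + 1):
        prev = phi
        phi = [p * (prev[r - 1] if r >= 1 else 0.0)
               + q * (prev[r] if r < len(prev) else 0.0)
               for r in range(n + 1)]

    binom = [comb(n_max, r) * p**r * q**(n_max - r) for r in range(n_max + 1)]
    print(max(abs(a - b) for a, b in zip(phi, binom)))   # essentially zero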

Problem 10. In a sequence of independent trials the probability of a success on each trial is p. By considering the outcome of the first trial, show that G_r(t), the probability generating function of the number of trials required to achieve the rth success, satisfies

G_r(t) = ptG_{r-1}(t) + qtG_r(t),

and hence obtain G_r(t).




Our next example illustrates more forcibly the power of the method.

Example 8
In a sequence of independent trials, the probability of a success on each trial is p. It is required to find the probability generating function for the first trial on which the second of two consecutive successes is obtained. Let φ_k be the probability of this event on trial k; then clearly φ₁ = 0, φ₂ = p² and φ₃ = qp², where q = 1 - p. When k ≥ 4, the last three trials must result in a failure followed by two successes, with probability qp². The previous k - 3 trials must not display the event, and this is the case with probability 1 - φ₁ - φ₂ - ... - φ_{k-3}. Hence

φ_k = qp²(1 - Σ_{i=1}^{k-3} φ_i),   k ≥ 4.

We obtain the probability generating function, G(t), for the φ_k by multiplying by t^k and summing for k ≥ 4:

Σ_{k=4}^∞ φ_k t^k = qp² Σ_{k=4}^∞ t^k - qp² Σ_{i=1}^∞ φ_i Σ_{k=i+3}^∞ t^k
                 = qp² [t^4/(1 - t) - Σ_{i=1}^∞ φ_i t^{i+3}/(1 - t)].

That is,

G(t) - φ₃t³ - φ₂t² = qp²t³[t - G(t)]/(1 - t),

and after collecting terms,

G(t) = p²t²(1 - pt)/(1 - t + qp²t³)
     = p²t²/(1 - qt - pqt²).

The reader should verify that G(1) = 1 and that the expected number of trials needed to obtain two consecutive successes is (1 + p)/p².
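A small simulation (illustrative only) agrees with the mean (1 + p)/p² obtained from this generating function:

    import random

    def waiting_time(p, rng):
        # Number of trials until two consecutive successes first occur.
        run, trial = 0, 0
        while True:
            trial += 1
            if rng.random() < p:
                run += 1
                if run == 2:
                    return trial
            else:
                run = 0

    p = 0.4
    rng = random.Random(1)
    samples = [waiting_time(p, rng) for _ in range(200_000)]
    print(sum(samples) / len(samples), (1 + p) / p**2)   # both close to 8.75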

The next problem is concerned with the nuts and bolts of Example 8 for a particular value of p.

Problem 11. For Example 8, calculate φ_k when p = 2/3.


13.5 COMPOUND DISTRIBUTIONS

Suppose that the number of carriers of a disease has a Poisson distribution and that the numbers of persons infected by each carrier have independent Poisson distributions. Then the total number of persons infected is the sum of a random number of random variables. More precisely, let X₁, X₂, ... be independent discrete random variables and

Y = Σ_{i=1}^N X_i,

where N is an integer-valued random variable, independent of the X_i; then Y is said to have a compound distribution. If the p.g.f. of each X_i is G(t), then when N = n, the p.g.f. of Σ_{i=1}^n X_i is

Π_{i=1}^n G(t) = [G(t)]^n.

Now suppose Pr(N = n) = φ_n; then

E(t^Y) = Σ_n E(t^Y | N = n)φ_n = Σ_n [G(t)]^n φ_n.    (13.4)

But if the p.g.f. of N is H(t), the form of (13.4) is H[G(t)].

Example 9
X_i has a Poisson distribution with parameter λ and N has a Poisson distribution with parameter θ.

G_X(t) = e^{-λ+λt},   H_N(t) = e^{-θ+θt}.

Hence Y = Σ_{i=1}^N X_i has p.g.f.

H[G(t)] = e^{-θ+θ exp(-λ+λt)}.
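An illustrative Monte Carlo sketch of Example 9 compares an estimate of E(t^Y) with H[G(t)]; the Poisson sampler below uses Knuth's elementary multiplication-of-uniforms method, which is adequate only for small parameters.

    import math
    import random

    lam, theta, t = 1.2, 2.0, 0.6
    rng = random.Random(7)

    def poisson(mu):
        # Knuth's method: multiply uniforms until the product drops below e^(-mu).
        L, k, prod = math.exp(-mu), 0, 1.0
        while True:
            prod *= rng.random()
            if prod <= L:
                return k
            k += 1

    def sample_Y():
        # N ~ Poisson(theta); Y is the sum of N independent Poisson(lam) variables.
        return sum(poisson(lam) for _ in range(poisson(theta)))

    est = sum(t**sample_Y() for _ in range(100_000)) / 100_000
    exact = math.exp(-theta + theta * math.exp(-lam + lam * t))
    print(est, exact)                                    # close agreement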

Problem 12. The number of insect colonies in an area has a Poisson distribution with parameter λ. The probability that any particular colony contains just k (≥ 1) insects is p^{k-1}(1 - p), where 0 < p < 1. If the numbers in different colonies are independent of each other, show that the probability generating function of the total number of insects present in the area is

e^{λ(1-p)t/(1-pt)} e^{-λ}.

Problem 13. Suppose that each X_i, i = 1, 2, ..., has p.g.f. G(t) and N has p.g.f. H(t); then from (13.4), Y = Σ_{i=1}^N X_i has p.g.f. H[G(t)]. Show that, if they exist,

(a) E(Y) = E(N)E(X);
(b) V(Y) = V(N)E²(X) + E(N)V(X).
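For a concrete pair of distributions the formulae in Problem 13 can be verified symbolically; the following sympy sketch is illustrative only, with Poisson choices made purely for convenience.

    import sympy as sp

    t = sp.symbols('t')
    lam, theta = sp.Rational(3, 2), sp.Rational(5, 2)   # arbitrary test parameters

    G = sp.exp(lam * (t - 1))        # p.g.f. of each X_i (Poisson, mean lam)
    H = sp.exp(theta * (t - 1))      # p.g.f. of N (Poisson, mean theta)
    K = H.subs(t, G)                 # p.g.f. of Y

    EY = sp.diff(K, t).subs(t, 1)
    VY = sp.diff(K, t, 2).subs(t, 1) + EY - EY**2

    EN, EX, VN, VX = theta, lam, theta, lam
    print(sp.simplify(EY - EN * EX))                    # 0
    print(sp.simplify(VY - (VN * EX**2 + EN * VX)))     # 0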

13.6 BIVARIATE PROBABILITY GENERATING FUNCTIONS

Suppose that X₁, X₂ have a joint discrete distribution but only assume non-negative integral values with positive probability. Then their joint probability generating function, G(t₁, t₂), is defined as

G(t₁, t₂) = E(t₁^{X₁} t₂^{X₂}).    (13.5)

It is clear that G(t₁, 1), G(1, t₂) are the p.g.f.s for the marginal distributions of X₁, X₂.

Example 10
X₁, X₂ have the trinomial distribution with parameters n, p₁, p₂. Thus the p.d.f. of X₁, X₂ is

f(x₁, x₂) = n!/[x₁!x₂!(n - x₁ - x₂)!] p₁^{x₁} p₂^{x₂} (1 - p₁ - p₂)^{n-x₁-x₂},   0 ≤ x₁ + x₂ ≤ n,

and

E(t₁^{X₁} t₂^{X₂}) = Σ_{x₁} Σ_{x₂} n!/[x₁!x₂!(n - x₁ - x₂)!] (p₁t₁)^{x₁} (p₂t₂)^{x₂} (1 - p₁ - p₂)^{n-x₁-x₂}
                  = [p₁t₁ + p₂t₂ + (1 - p₁ - p₂)]^n.


We note that G(t₁, 1) = [p₁t₁ + (1 - p₁)]^n and G(1, t₂) = [p₂t₂ + (1 - p₂)]^n, which confirms that the marginal distributions of X₁, X₂ are binomial with parameters n, p₁ and n, p₂ respectively. Moreover,

G(t, t) = [(p₁ + p₂)t + (1 - p₁ - p₂)]^n,

and hence the distribution of X₁ + X₂ is binomial with parameters n, p₁ + p₂.
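A short sympy sketch (illustrative only, with a small n chosen for speed) confirms these marginal and sum results:

    import sympy as sp

    t1, t2, p1, p2 = sp.symbols('t1 t2 p1 p2', positive=True)
    n = 4
    G = (p1 * t1 + p2 * t2 + (1 - p1 - p2))**n

    print(sp.simplify(G.subs(t2, 1) - (p1 * t1 + (1 - p1))**n))                 # 0
    print(sp.simplify(G.subs(t1, 1) - (p2 * t2 + (1 - p2))**n))                 # 0
    print(sp.simplify(G.subs(t2, t1) - ((p1 + p2) * t1 + (1 - p1 - p2))**n))    # 0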

Problem 14. The joint p.d.f. of X₁, X₂ is

f(x₁, x₂) = e^{-2λ} λ^{x₂}/[x₁!(x₂ - x₁)!],   0 ≤ x₁ ≤ x₂;   x₂ = 0, 1, 2, ...

Show that the joint p.g.f. of X₁, X₂ is

G(t₁, t₂) = e^{-2λ + λt₂ + λt₁t₂}.

Hence derive the distributions of (1) X₁, (2) X₂, (3) X₂ - X₁.

Problem 15. Show that the coefficient of α^x β^y in the expansion of

Π(α, β) = e^{λ(α-1) + μ(β-1) + ν(α-1)(β-1)}    (1)

in powers of α and β is

e^{ν-λ-μ} Σ_{i=0}^k (λ - ν)^{x-i} (μ - ν)^{y-i} ν^i / [(x - i)!(y - i)!i!],

where k = min(x, y), and hence that, when

λ ≥ ν,   μ ≥ ν,   ν ≥ 0,    (2)

Π(α, β) satisfies the conditions to be a bivariate probability generating function (p.g.f.). The discrete random variables X, Y have Π(α, β) as their bivariate p.g.f., with λ, μ, ν satisfying the conditions (2). Show that the marginal distributions of X and Y are Poisson with respective means λ and μ. Show that the conditional p.g.f. of Y given X = x is

coefficient of α^x in Π(α, β) / coefficient of α^x in Π(α, 1) = [1 + (ν/λ)(β - 1)]^x e^{(μ-ν)(β-1)}.

Comment on this result.

Part question, University of Hull, 1978.

13.7 GENERATING SEQUENCES

The methods presented in this chapter may be employed more widely. We shall say that H(t) is a generating function for the sequence a₀, a₁, ... if

H(t) = a₀ + a₁t + a₂t² + ...

and the series converges for some interval |t| < t₀. It is a generating function in the sense that the coefficient of t^n is a_n.

Example 11
(i) H(t) = 1/(1 - t) = 1 + t + t² + ..., |t| < 1, so that 1/(1 - t) generates the sequence 1, 1, 1, ....
(ii) H(t) = 1/(1 - t²) = 1 + t² + t⁴ + ..., |t| < 1, so that 1/(1 - t²) generates the sequence 1, 0, 1, 0, ....
(iii) H(t) = exp(t) = 1 + t + t²/2! + ..., so that exp(t) generates the sequence 1, 1, 1/2!, 1/3!, ....

Naturally, for such general sequences, characteristics such as mean and variance have no relevance. However, there are some sequences related to discrete random variables which can usefully be derived from generating functions.

Example 12
Suppose X has probability generating function G(t) and we wish to find a function H(t) which generates the sequence h(x) = Pr(X > x). Now

Pr(X = x) = h(x - 1) - h(x),   x ≥ 1,

hence

G(t) = Σ_{x=0}^∞ t^x Pr(X = x)
     = Σ_{x=1}^∞ t^x [h(x - 1) - h(x)] + Pr(X = 0)
     = t Σ_{x=1}^∞ t^{x-1} h(x - 1) - Σ_{x=1}^∞ t^x h(x) + Pr(X = 0)
     = t H(t) - [H(t) - h(0)] + Pr(X = 0)
     = (t - 1)H(t) + Pr(X > 0) + Pr(X = 0)
     = (t - 1)H(t) + 1.

That is,

H(t) = [1 - G(t)]/(1 - t).    (13.6)
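For the geometric distribution of Problem 2 this relation can be confirmed symbolically; an illustrative sympy sketch compares the power-series coefficients of [1 - G(t)]/(1 - t) with the tail probabilities q^x.

    import sympy as sp

    p, t = sp.symbols('p t', positive=True)
    q = 1 - p
    G = p * t / (1 - q * t)                    # geometric p.g.f.
    H = sp.simplify((1 - G) / (1 - t))         # should generate h(x) = Pr(X > x) = q^x

    series = sp.series(H, t, 0, 5).removeO()
    print([sp.simplify(series.coeff(t, x) - q**x) for x in range(5)])   # all zero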

Problem 16. In Example 12 show that if lim_{t→1} H(t) exists then it is equal to E(X). Hence calculate E(X) for the geometric distribution.


Example 13
In a sequence of independent trials, the probability of a success on each trial is p. Every time a trial is the second of two consecutive successes since the last such event (the sequence being assumed to restart after each such event) we shall say that the recurrent event ε has taken place. Thus the sequence S F S S S S ... displays a first appearance of ε on trial 4 and a second on trial 6. We seek the probability, u_n, of ε on trial n. Suppose successes occur on trials n - 1 and n, with probability p². Then either ε occurs on trial n, with probability u_n, or ε occurs on trial n - 1 and is followed by a success, with probability p. Hence

u_{n-1}p + u_n = p²,   n ≥ 2,    (13.7)

and u₂ = p², u₁ = 0. We form the generating function

H(t) = Σ_n u_n t^n

by multiplying equation (13.7) by t^n and summing over n ≥ 2. This yields

pt H(t) + H(t) = p² Σ_{n=2}^∞ t^n = p²t²/(1 - t),

or

H(t) = p²t²/[(1 + pt)(1 - t)],   |t| < 1.

H(t) may now, after splitting into partial fractions, be expanded as a power series. The reader should verify that the coefficient of t^n is

p²[1 + (-1)^n p^{n-1}]/(1 + p),   n ≥ 2.

Problem 17. For Example 13, if f_n is the probability that ε first appears on trial n, show that

u_n = Σ_{r=2}^{n-1} f_r u_{n-r} + f_n,   n ≥ 3.

Hence deduce that if G(t) is the probability generating function of f₂, f₃, ..., then

G(t) = H(t)/[1 + H(t)] = p²t²/(1 - qt - qpt²).

13.8 A NOTE ON DEFECTIVE DISTRIBUTIONS

We remarked that for a proper discrete distribution, the probability generating function, G_X(t), satisfies G_X(1) = 1. By proper we here mean that Σ_x Pr(X = x) = 1. If this property does not hold, then the distribution of X is said to be defective. Such a feature may be present even in seemingly simple situations. Suppose in a sequence of independent games, player A has probability p of winning any particular game, and the corresponding probability for player B is q = 1 - p. Player A resolves to quit immediately he has a positive gain on the basis of unit stake per game. It can be shown [1] that the generating function of the probabilities of quitting in an infinite sequence of games is

H(t) = [1 - (1 - 4pqt²)^{1/2}]/(2qt),   H(1) = (1 - |p - q|)/(2q).

Hence if p ≥ q, |p - q| = p - q and H(1) = 1. If, however, p < q, |p - q| = q - p and H(1) = p/q. Thus, if p < q, there is a non-zero probability that player A never obtains a profit.

REFERENCE
[1] W. Feller, An Introduction to Probability Theory and its Applications, Vol. I, John Wiley.

BRIEF SOLUTIONS AND COMMENTS ON THE PROBLEMS

Problem 1.

G_X(t) = Σ_{x=0}^∞ t^x λ^x e^{-λ}/x! = e^{-λ} Σ_{x=0}^∞ (λt)^x/x! = e^{-λ} e^{λt}.

Converges for all t.

Problem 2.

G_X(t) = Σ_{x=1}^∞ t^x q^{x-1} p = pt Σ_{x=1}^∞ (qt)^{x-1} = pt/(1 - qt),   |qt| < 1;

here the radius of convergence exceeds 1 for 0 < p < 1.

Problem 3.
(a) G(t) = exp(λt - λ), G'(t) = λ exp(λt - λ), G'(1) = λ.
(b) G(t) = pt/(1 - qt) = {[1/(1 - qt)] - 1}p/q. Hence G'(t) = p/(1 - qt)², |qt| < 1, so G'(1) = p/(1 - q)² = 1/p.

Similarly, differentiating G(t) twice,

G''(t) = Σ_x x(x - 1)t^{x-2} f(x),

and if E(X²) exists, then

G''(1) = Σ_x x(x - 1)f(x) = E[X(X - 1)].

Problem 4. The p.g.f. of each X_i is pt/(1 - qt). Hence the p.g.f. of Σ_{i=1}^n X_i is [pt/(1 - qt)]^n, |t| < 1/q. We are to recover Pr(Σ_{i=1}^n X_i = r); that is to say, we need the coefficient of t^r in the expansion of p^n t^n (1 - qt)^{-n} as a power series. Since |qt| < 1, we have

(1 - qt)^{-n} = 1 + nqt + [n(n + 1)/2!]q²t² + ... + [n(n + 1)...(n + k - 1)/k!]q^k t^k + ...,

and the coefficient of t^{r-n} in this expansion is

[n(n + 1)...(r - 1)/(r - n)!]q^{r-n} = (r - 1 choose n - 1)q^{r-n}.

Hence

Pr(Σ_{i=1}^n X_i = r) = (r - 1 choose n - 1)p^n q^{r-n},   r = n, n + 1, ....

Problem 5.

G_{X₁}(t)G_{X₂}(t) = (q + pt)^{n₁}(q + pt)^{n₂} = (q + pt)^{n₁+n₂},

which is of the required form.

Problem 6.

G_X(t) = (q + pt)^n = [1 + λ(t - 1)/n]^n → e^{λt-λ}.

The limit is the p.g.f. of a Poisson distribution with parameter λ.

Problem 7. Ans. (c) H(t) = t/(6 - 5t). Hint: the probability of the first six on the rth throw is (5/6)^{r-1}(1/6). (d) The p.g.f. is [t/(6 - 5t)]² and the required probability is (1/2)[1 - H(-1)], where H(t) is this p.g.f.

Problem 8. Hint: Pr(M = r) = Pr(M ≥ r) - Pr(M ≥ r + 1). α = 1 - θ, q_r = θ^{r+1}. Required p.g.f. of M: (1 - θ^k)/(1 - θ^k t).

Problem 9. Compare coefficients of θ^n both in ∂G/∂t and in λ(θ - 1)G and check that they are equal in virtue of the differential equation. Write the partial differential equation as

∂/∂t [log_e G(θ, t)] = λ(θ - 1),

and integrate with respect to time.

Problem 10. If φ_r(n) is the probability that the nth trial is the rth success, then either:
(a) the first trial is a success and there remain (n - 1) trials to reach a further (r - 1) successes; or
(b) the first trial is a failure and all r successes must be obtained in (n - 1) trials.
Thus φ_r(n) = pφ_{r-1}(n - 1) + qφ_r(n - 1) for 2 ≤ r ≤ n. Multiply the equation by t^n and sum n from r to ∞:

G_r(t) = ptG_{r-1}(t) + qtG_r(t),

so that

G_r(t) = [pt/(1 - qt)]G_{r-1}(t) = [pt/(1 - qt)]^{r-1}G_1(t)

by repeated application. But G_1(t) = pt + qpt² + q²pt³ + ... = pt/(1 - qt), so that

G_r(t) = [pt/(1 - qt)]^r.

Problem 11. From Example 8, with p = 2/3 and q = 1/3,

G(t) = 4t²/[(3 - 2t)(3 + t)] = (4t²/27)[2(1 - 2t/3)^{-1} + (1 + t/3)^{-1}].

We require the coefficient of t^k, which is

(4/27)[2(2/3)^{k-2} + (-1/3)^{k-2}],   k ≥ 2.

Problem 12. If N is the number of colonies, then the total number of insects is zero if N = 0. If N > 0, the total number of insects is Σ_{i=1}^N X_i, where X_i is the number in the ith colony. G(t) = t(1 - p)/(1 - pt). Now Pr(N = n) = λ^n e^{-λ}/n!, and hence the required p.g.f. is

e^{-λ} + Σ_{n=1}^∞ [t(1 - p)/(1 - pt)]^n λ^n e^{-λ}/n! = e^{λ(1-p)t/(1-pt)} e^{-λ}.

Problem 13. If K(t) = H[G(t)], then K'(t) = H'[G(t)]G'(t), hence

K'(1) = H'[G(1)]G'(1) = H'(1)G'(1) = E(N)E(X).

Differentiating a second time with respect to t,

K''(t) = H''[G(t)][G'(t)]² + H'[G(t)]G''(t),

hence

K''(1) = H''(1)[G'(1)]² + H'(1)G''(1) = E[N(N - 1)][E(X)]² + E(N)E[X(X - 1)].

But K''(1) = E[Y(Y - 1)]. After collecting terms and using

V(N) = E[N(N - 1)] + E(N) - [E(N)]²,

and similarly for V(X), we obtain the result.

Problem 14.

G(t₁, t₂) = Σ_{x₂=0}^∞ Σ_{x₁=0}^{x₂} t₁^{x₁} t₂^{x₂} e^{-2λ} λ^{x₂}/[x₁!(x₂ - x₁)!] = e^{-2λ} e^{λt₁t₂} e^{λt₂} = e^{-2λ + λt₂ + λt₁t₂}.

Now G(t₁, 1) = e^{-λ+λt₁} and this is the p.g.f. of a Poisson distribution with parameter λ. Also G(1, t₂) = e^{-2λ+2λt₂}, another Poisson distribution but with parameter 2λ. Finally we put t₁ = 1/t₂, when E(t₁^{X₁} t₂^{X₂}) = E(t₂^{X₂-X₁}). But G(1/t₂, t₂) = e^{-λ+λt₂}. Hence X₂ - X₁ has a Poisson distribution with parameter λ.

Problem 15.

Π(α, β) = e^{ν-λ-μ} e^{(λ-ν)α} e^{(μ-ν)β} e^{ναβ},

so the coefficient of α^x in Π(α, β) is

e^{ν-λ-μ} e^{(μ-ν)β} (λ - ν + νβ)^x / x!,

in which the coefficient of β^y is

(e^{ν-λ-μ}/x!) Σ_{l=0}^k (x choose l)(λ - ν)^{x-l} ν^l (μ - ν)^{y-l}/(y - l)!,

and hence the result. Π(α, 1) = e^{λ(α-1)} and X is Poisson with parameter λ, while Π(1, β) = e^{μ(β-1)} and Y is Poisson with parameter μ. The conditional p.g.f. of Y given X = x is of the form

Σ_y Pr(X = x and Y = y)β^y / Pr(X = x) = Σ_y (coefficient of α^x β^y)β^y / (coefficient of α^x in Π(α, 1)),

and the numerator is the coefficient of α^x in Π(α, β).

Problem 16. If lim_{t→1} H(t) exists then so does lim_{t→1} [1 - G(t)]/(1 - t), which is G'(1) = E(X). Hence

H(1) = Σ_{x=0}^∞ h(x).

If Pr(X = x) = q^{x-1}p, then Pr(X > x) = q^x and

Σ_{x=0}^∞ q^x = 1/(1 - q) = 1/p.

Problem 17. If ε occurs on trial n, then if it appeared on trial r for the first time it is observed again (n - r) trials later (2 ≤ r ≤ n - 1). However, trial n may also be the first occasion. Hence the result follows. Multiply the equation by t^n, sum over n ≥ 3, and interchange the order of summation. Thus

Σ_{n=3}^∞ u_n t^n = Σ_{n=3}^∞ (Σ_{r=2}^{n-1} f_r u_{n-r})t^n + Σ_{n=3}^∞ f_n t^n,

H(t) - p²t² = Σ_{r=2}^∞ f_r t^r Σ_{n=r+1}^∞ u_{n-r} t^{n-r} + G(t) - p²t² = G(t)H(t) + G(t) - p²t²,

to obtain the result. Finally substitute for H(t) from Example 13. Compare with the solution in Example 8 for the same problem.