Essential uniqueness of optimal estimating functions


Journal of Statistical Planning and Inference 17 (1987) 405--407 North-Holland


B.K. KALE
University of Waterloo, Waterloo, Ontario, Canada and University of Poona, Pune, India

Received 5 November 1986; revised manuscript received 30 December 1986
Recommended by V.P. Godambe

Abstract: Godambe (1976) has shown that if $g_1$ and $g_2$ are both optimal estimating functions for estimating $\theta_1$ in the presence of a nuisance parameter $\theta^{(2)}$, then their standardized versions are the same. We prove this result for vector-valued estimating functions for estimating $\theta^{(1)} = (\theta_1, \dots, \theta_k)'$ in the presence of $\theta^{(2)}$, and in the process provide a simpler proof of Godambe's (1976) result for $k = 1$.

AMS Subject Classification: 62F10.
Key words and phrases: Interesting and nuisance parameters; Matrix optimality criteria; Standardized USEF.

Introduction

Let $X$ be a random vector on $\mathcal{X}$ with pdf $p(x, \theta)$ with respect to a $\sigma$-finite measure $\mu$ on $(\mathcal{X}, \mathcal{B})$, indexed by $\theta \in \Omega \subset R^m$. Let $\theta^{(1)} = (\theta_1, \dots, \theta_k)'$ be the (vector) parameter of interest and $\theta^{(2)} = (\theta_{k+1}, \dots, \theta_m)'$ be the nuisance or incidental parameter. Let $\Omega = \Omega_1 \times \Omega_2$, where $\Omega_1 = \{\theta^{(1)}\}$ and $\Omega_2 = \{\theta^{(2)}\}$. Let $g : \mathcal{X} \times \Omega_1 \to R^k$ be an Unbiased Statistical Estimation Function (USEF), that is, $E(g) = 0$ for all $\theta \in \Omega$. We assume that $\{p(x, \theta),\ \theta \in \Omega\}$ satisfies the usual regularity conditions and denote by $G_k$ the class of all regular USEFs. For a statement of the regularity conditions on $G_k$ we refer to those of Ferreira (1982) and Chandrasekar and Kale (1984), which generalize the conditions assumed by Godambe (1976) for the case $k = 1$. An optimal USEF $g^* \in G_k$ is defined by the matrix optimality criterion

$$
M_{g_s} - M_{g^*_s} \ \text{is nnd}, \qquad \forall\, \theta \in \Omega,\ \forall\, g \in G_k, \tag{1}
$$

where $g_s = D_g^{-1} g$ is the standardized version of $g$, $D_g$ denotes the matrix $[E(\partial g_i / \partial \theta_j)]_{k \times k}$, and $M_{g_s}$ denotes the variance-covariance matrix of $g_s$.
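As a numerical illustration of standardization and of criterion (1) for $k = 1$, the following minimal Monte Carlo sketch compares two unbiased estimating functions for a normal location parameter $\theta_1$ with nuisance variance $\theta_2$; the model, the weights $w$ and all names in the code are illustrative choices, not part of the development above.

import numpy as np

# A minimal Monte Carlo sketch of standardization and of criterion (1) for k = 1.
# Illustrative model: X_1, ..., X_n i.i.d. N(theta1, theta2), with theta1 the
# parameter of interest and theta2 a nuisance variance.
rng = np.random.default_rng(0)
n, theta1, theta2, reps = 20, 1.0, 4.0, 200_000
w = np.linspace(0.5, 1.5, n)                   # unequal weights for a competing USEF

x = rng.normal(theta1, np.sqrt(theta2), size=(reps, n))

# Two USEFs for theta1 (both depend on x and theta1 only):
#   g = sum_i (x_i - theta1),   h = sum_i w_i (x_i - theta1).
g = (x - theta1).sum(axis=1)
h = ((x - theta1) * w).sum(axis=1)

# D_g = E(dg/dtheta1) = -n and D_h = -sum(w), so the standardized versions
# are g_s = g / D_g and h_s = h / D_h.
g_s = g / (-n)
h_s = h / (-w.sum())

# Criterion (1) for k = 1: M_{h_s} - M_{g_s} should be nonnegative.
print(h_s.var() - g_s.var())   # approx theta2 * (sum(w**2)/sum(w)**2 - 1/n) > 0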

For the case $k = 1$, Godambe (1976) proved the essential uniqueness of the optimal USEF by showing that if $g$ and $h$ are both optimal in $G_1$ then $g_s = h_s$. However, the proof given by Godambe is somewhat complicated and the method of proof cannot be extended easily to the case $k > 1$. Ferreira (1982) and Chandrasekar and Kale (1984) proved the essential uniqueness of the optimal USEF in a sub-class of $G_k$ by assuming additional conditions. Thus Ferreira assumed that $E(\gamma g') = 0$ for any $g \in G_k$ and any $\gamma$, a $k$-dimensional vector SEF from $\mathcal{X} \times \Omega \to R^k$. Chandrasekar and Kale (1984), on the other hand, assumed that $[E(\partial g_i / \partial \theta_j)]_{k \times k} = [d_{ij}]$ is such that $d_{ij}$ depends only on $\theta^{(1)}$. Since Godambe's (1976) result for $k = 1$ is proved without such extra conditions, it is worthwhile to see whether his result can be extended to $k > 1$ without additional conditions. The object of this note is to prove the essential uniqueness of the optimal USEF in $G_k$ without extra conditions. Specializing our proof to $k = 1$ gives a much simpler proof of Godambe's (1976) result for $k = 1$.

Uniqueness of optimal USEF

For any $g \in G_k$ the standardized version $g_s = D_g^{-1} g$ is of the type $A(\theta) g$, where $A(\theta)$ is a non-singular matrix with elements $a_{ij}(\theta)$ and $\theta = (\theta^{(1)}, \theta^{(2)}) \in \Omega$. Corresponding to the class $G_k$ we define a class $G_k^*$ containing $G_k$ by $G_k^* = \{A(\theta) g : g \in G_k,\ A \in \mathcal{A}\}$, where $\mathcal{A}$ denotes the class of all non-singular matrices $[a_{ij}(\theta)]_{k \times k}$ whose elements $a_{ij}(\theta)$ are differentiable functions. Note that by taking $A(\theta) = I_{k \times k}$ we have $G_k \subset G_k^*$, and similarly, for any $g \in G_k$, $g_s \in G_k^*$. Further, since $E(g) = 0$ we can show that $(g_s)_s = g_s$, so $G_k^*$ is closed under standardization; note that $G_k$ itself is not closed under standardization. We first prove the essential uniqueness of an optimal USEF in $G_k^*$, where the definition of optimality is the same as in (1) with $G_k$ replaced by $G_k^*$.
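The closure claim $(g_s)_s = g_s$ can be checked directly; in sketch, under the regularity conditions above, for $j = 1, \dots, k$,

$$
E\!\left(\frac{\partial g_s}{\partial \theta_j}\right)
= \frac{\partial D_g^{-1}}{\partial \theta_j}\, E(g) + D_g^{-1}\, E\!\left(\frac{\partial g}{\partial \theta_j}\right)
= D_g^{-1}\, E\!\left(\frac{\partial g}{\partial \theta_j}\right),
$$

so that $D_{g_s} = D_g^{-1} D_g = I_{k \times k}$ and hence $(g_s)_s = D_{g_s}^{-1} g_s = g_s$.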

Theorem. If $g \in G_k^*$ and $h \in G_k^*$ are both optimal in $G_k^*$, then $g_s = h_s$.

Proof. Since $g$ and $h$ are both optimal in $G_k^*$, it follows that $M_{g_s} - M_{h_s}$ and $M_{h_s} - M_{g_s}$ are both nnd, or

$$
M_{g_s} = M_{h_s}.
$$

Define $u = \tfrac12 (g_s + h_s)$; then $u \in G_k^*$ and, since $g$ is optimal, $M_{u_s} - M_{g_s} = \tfrac14 [N + N' - 2 M_{g_s}]$ is nnd, where $N = E(g_s h_s')$. But this implies that $-[M_{g_s} + M_{h_s} - N - N'] = -M_{(g_s - h_s)}$ is nnd. From this it follows that $M_{(g_s - h_s)} = 0$, or $g_s = h_s$ (a.s. for each $\theta$). $\square$
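Written out, the displayed bound uses only the equality $M_{g_s} = M_{h_s}$ and the fact that $D_{g_s} = D_{h_s} = I_{k \times k}$, so that $u_s = u$ and

$$
M_{u_s} = M_u = \tfrac14 \bigl[ M_{g_s} + M_{h_s} + N + N' \bigr]
= M_{g_s} + \tfrac14 \bigl[ N + N' - 2 M_{g_s} \bigr],
$$

while $M_{(g_s - h_s)} = M_{g_s} + M_{h_s} - N - N' = 2 M_{g_s} - N - N'$; a variance-covariance matrix whose negative is also nnd must be the zero matrix.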

Corollary. If $g \in G_k$ and $h \in G_k$ are both optimal in $G_k$, then they are optimal in $G_k^*$ and $g_s = h_s$.

Proof. If $u \in G_k$ and $u_1 = A(\theta) u$ for some $A \in \mathcal{A}$, then $u_{1s} = u_s$ and $M_{u_{1s}} = M_{u_s}$. Since $g$ is optimal in $G_k$, $M_{u_s} - M_{g_s}$ is nnd for any $u \in G_k$, from which it follows that $M_{u_{1s}} - M_{g_s}$ is nnd for any $u_1 \in G_k^*$, since every $u_1 \in G_k^*$ can be represented as $A(\theta) u$ for some $u \in G_k$. Thus $g$ is optimal in $G_k^*$. Similarly $h$ is optimal in $G_k^*$ and, by the above theorem, $g_s = h_s$; that is, the optimal USEF in $G_k$ is essentially unique.
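The first statement of the proof is again a direct computation: for $u_1 = A(\theta) u$ with $E(u) = 0$,

$$
D_{u_1} = \bigl[ E\bigl( \partial (A(\theta) u)_i / \partial \theta_j \bigr) \bigr]_{k \times k} = A(\theta)\, D_u,
\qquad
u_{1s} = D_{u_1}^{-1} u_1 = D_u^{-1} A(\theta)^{-1} A(\theta) u = u_s,
$$

so that $M_{u_{1s}} = M_{u_s}$.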


Note that the above proof goes through for k = 1 and provides a simple proof of Godambe's (1976) Theorem 2.1. This result also settles the question raised in Remark 5.1 in Chandrasekar and Kale (1984).

Acknowledgements

The author wishes to thank Professor M.E. Thompson for some helpful discussions and the Natural Sciences and Engineering Research Council of Canada for providing financial support.

References

Chandrasekar, B. and B.K. Kale (1984). Unbiased statistical estimation functions for parameters in presence of nuisance parameters. J. Statist. Plann. Inference 9, 48-54.
Ferreira, P.E. (1982). Multiparametric estimating equations. Ann. Inst. Statist. Math. 34, 423-431.
Godambe, V.P. (1976). Conditional likelihood and unconditional optimum estimating equations. Biometrika 63, 277-284.