Bootstrap tests for generalized least squares regression models




Economics Letters 34 (1990) 261-265
North-Holland

Robert K. Rayner
Penn State Erie, Erie, PA 16563-1400, USA

Received 5 February 1990
Accepted 21 March 1990

This paper shows that certain statistics in the normal linear regression model may be bootstrapped to obtain test procedures with faster rates of convergence than the usual normal approximation. Specifically, bootstrapping a standardized estimator of the regression coefficients is shown to produce errors with smaller order than n⁻¹, and bootstrapping a variance-adjusted statistic gives errors of smaller order than n⁻². An advantage of the bootstrap is that it provides the additional asymptotic accuracy without the necessity of an explicit additional step of calculating correction terms.

1. Introduction

In the linear regression model the coefficients are often estimated by a two-step procedure in which the error precision matrix, assumed to depend on a vector θ of unknown parameters, is estimated first, and then used to obtain generalized least squares estimates of the regression coefficients. The resulting estimates are asymptotically normally distributed. Recently, Rothenberg (1984a,b) derived an asymptotic expansion for the distribution function of the standardized two-step estimator. His result, which is based on the theory of stochastic expansions first suggested by Nagar, covers the class of estimates of θ commonly used in econometrics. The theory of stochastic expansions is closely related to the Edgeworth and Cornish-Fisher expansions often used to obtain approximations for univariate statistics. Rothenberg's papers are important in part because they provide some regularity conditions that ensure validity of the expansion.

Building in particular on his work, I show that the bootstrap may be used to estimate p values and critical values for hypothesis tests on the regression coefficients in normal linear regression models, with errors of smaller order than n⁻¹. Further, it is shown that Rothenberg's variance-adjusted statistic may be bootstrapped to obtain test procedures with errors of smaller order than n⁻². A resampling method introduced by Efron (1979), the bootstrap is a computer-intensive approach which uses simulation to achieve the desired approximations. Related results in the case of iid random variables were proved by Hall (1986) and Beran (1988), among others.

2. The normal linear regression model

Consider the linear model

    y = Xβ + e,                                                              (1)


relating the n-dimensional random vector y to the n × k nonrandom matrix X. The vector β is a k × 1 vector of unknown regression coefficients, and e is normally distributed with zero mean and nonsingular precision matrix Ω = [E(ee')]⁻¹. It is assumed that the elements of Ω are known functions of the p-vector θ. Here we are concerned with the problem of testing H₀: c'β = c'β₀ versus H₁: c'β > c'β₀, where c is a given k × 1 vector and c'β₀ is a given constant.

Suppose that the estimate θ̂ does not depend on β and is an even function of e. (This assumption is met by the commonly used estimation methods for θ.) In addition, suppose that √n(θ̂ - θ) is asymptotically normally distributed with zero mean and covariance matrix Λ, and that θ̂ is asymptotically efficient, so that Λ⁻¹(θ) has typical (i, j) element tr(Ω⁻¹Ω_i Ω⁻¹Ω_j)/2n, where Ω_i is the n × n matrix ∂Ω(θ)/∂θ_i evaluated at θ.

Consider first the statistic U_n, defined as

    U_n = (c'β̂ - c'β₀)/σ_{c'β̃},

where β̂ = (X'Ω̂X)⁻¹X'Ω̂y, Ω̂ = Ω(θ̂), and σ²_{c'β̃} = var(c'β̃) = c'(X'ΩX)⁻¹c, with β̃ = (X'ΩX)⁻¹X'Ωy. Under H₀ this equation may be rewritten as

    U_n = c'(X'Ω̂X)⁻¹X'Ω̂e/σ_{c'β̃}.                                          (2)
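As a concrete illustration of the two-step estimator and the statistic U_n, the following minimal sketch (not from the paper) uses an assumed heteroskedastic specification with Var(e_i) = exp(θz_i) for an observed covariate z, so that Ω(θ) = diag(exp(-θz_i)); the residual-based estimator of θ and all names are illustrative choices only.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative design: intercept plus one regressor; heteroskedastic errors with
# Var(e_i) = exp(theta * z_i), so the precision matrix is
# Omega(theta) = diag(exp(-theta * z_i)).  (Assumed specification, not from the paper.)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
z = rng.uniform(0.0, 1.0, size=n)
theta_true = 1.0
beta_true = np.array([1.0, 0.5])

def omega(t):
    return np.diag(np.exp(-t * z))           # precision matrix Omega(theta)

e = rng.normal(size=n) * np.exp(0.5 * theta_true * z)
y = X @ beta_true + e

# Step 1: estimate theta from least squares residuals (an even function of e);
# here a crude regression of log(residual^2) on z, purely for illustration.
res = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
Z = np.column_stack([np.ones(n), z])
theta_hat = np.linalg.lstsq(Z, np.log(res**2 + 1e-12), rcond=None)[0][1]

# Step 2: generalized least squares with the estimated precision matrix Omega(theta_hat).
Om_hat = omega(theta_hat)
beta_hat = np.linalg.solve(X.T @ Om_hat @ X, X.T @ Om_hat @ y)

# The statistic U_n of eq. (2); under the reconstruction above its denominator uses
# the true Omega(theta), so U_n is a theoretical quantity rather than a feasible pivot.
c = np.array([0.0, 1.0])                     # test H0: c'beta = c'beta0 (the slope)
beta0 = beta_true                            # H0 holds in this simulated example
Om = omega(theta_true)
sigma = np.sqrt(c @ np.linalg.solve(X.T @ Om @ X, c))
U_n = (c @ beta_hat - c @ beta0) / sigma
print(theta_hat, beta_hat, U_n)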

The second statistic that is investigated uses a variance correction that takes the estimated covariance matrix in the equation for β̂ into account. Let the p × p matrix B(θ) have typical element

    b_ij = c'(X'ΩX)⁻¹X'Ω_i M Ω⁻¹ M'Ω_j X(X'ΩX)⁻¹c / [c'(X'ΩX)⁻¹c],   M = I - X(X'ΩX)⁻¹X'Ω.

Define

    W_n = c'(X'Ω̂X)⁻¹X'Ω̂e/σ_n,                                              (3)

where σ²_n = c'(X'ΩX)⁻¹c[1 + tr(ΛB)/n].
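The correction term tr(ΛB)/n can be evaluated numerically once Ω(θ) and its derivatives are specified. The sketch below does this for the same assumed one-parameter heteroskedastic specification (so p = 1), using the normalization of b_ij as reconstructed above; it is an illustration under those assumptions, not the paper's computation.

import numpy as np

rng = np.random.default_rng(1)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
z = rng.uniform(0.0, 1.0, size=n)
c = np.array([0.0, 1.0])
theta = 1.0                                   # evaluate the correction at this theta

def omega(t):
    return np.diag(np.exp(-t * z))            # assumed Omega(theta) = diag(exp(-t z))

Om = omega(theta)
Om_inv = np.diag(np.exp(theta * z))
dOm = (omega(theta + 1e-6) - omega(theta - 1e-6)) / 2e-6   # Omega_1 = dOmega/dtheta

XOX_inv = np.linalg.inv(X.T @ Om @ X)
M = np.eye(n) - X @ XOX_inv @ X.T @ Om        # M = I - X(X'Omega X)^(-1)X'Omega
s2 = c @ XOX_inv @ c                          # c'(X'Omega X)^(-1)c

# Lambda^(-1) has typical element tr(Omega^(-1)Omega_i Omega^(-1)Omega_j)/(2n); here p = 1.
Lam = 1.0 / (np.trace(Om_inv @ dOm @ Om_inv @ dOm) / (2.0 * n))

# b_11 as reconstructed above, normalized by c'(X'Omega X)^(-1)c.
a = XOX_inv @ c
b11 = (a @ X.T @ dOm @ M @ Om_inv @ M.T @ dOm @ X @ a) / s2

sigma_n2 = s2 * (1.0 + Lam * b11 / n)         # variance-adjusted denominator of W_n
print(s2, Lam * b11 / n, sigma_n2)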

Rothenberg (1984a) has shown, under reasonable regularity conditions (see his Assumptions 1-5), that the distribution function of U_n has the expansion

    Pr(U_n ≤ x) = Φ(x) + (A₁(x, θ)/n)φ(x) + o(n⁻¹),                          (4)

where Φ and φ are the distribution function and density of the standard normal distribution, and A₁(x, θ) is a smooth function of x and θ. Further, by deriving the o(n⁻¹) term in (4), he showed that W_n has the expansion

    Pr(W_n ≤ x) = Φ(x) + (A₂(x, θ)/n²)φ(x) + o(n⁻²).                         (5)

Note that (5) provides a substantial improvement over the usual normal approximation, which from (4) has an error of order n⁻¹.


3. Bootstrapping generalized least squares test statistics

Let (y, X) be given fixed data from (1) and suppose that θ̂ is the computed estimate of θ. Notice that for estimation purposes the data may also be represented by (e, X), because of the assumed independence of θ̂ from β. The bootstrap procedure, denoted by starred quantities, treats X and θ̂ as known information, and involves resampling from the population with those characteristics. Specifically, let e* be normally distributed with zero mean and precision matrix Ω̂. Let θ̂* be the bootstrap estimate of θ, computed from (e*, X) using the same estimation method which was used to get θ̂ from (e, X). Then the bootstrap statistic U*_n is defined by

    U*_n = c'(X'Ω̂*X)⁻¹X'Ω̂*e*/s,                                             (6)

where Ω̂* = Ω(θ̂*) and s² = c'(X'Ω̂X)⁻¹c. Similarly, W*_n is defined by

    W*_n = c'(X'Ω̂*X)⁻¹X'Ω̂*e*/s_n,                                           (7)

where s²_n = c'(X'Ω̂X)⁻¹c[1 + tr(Λ̂B̂)/n], with Λ̂ = Λ(θ̂) and B̂ = B(θ̂).
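A minimal sketch of this parametric bootstrap loop follows, assuming the same illustrative specification Ω(θ) and a hypothetical residual-based estimator estimate_theta; only the resampling structure, drawing e* with precision matrix Ω̂ and recomputing θ̂* and U*_n, mirrors the text.

import numpy as np

rng = np.random.default_rng(2)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
z = rng.uniform(0.0, 1.0, size=n)
c = np.array([0.0, 1.0])

def omega(t):
    return np.diag(np.exp(-t * z))            # assumed precision matrix Omega(theta)

def estimate_theta(err):
    # Hypothetical estimator of theta from (e, X): regress log(e^2) on z.
    Z = np.column_stack([np.ones(len(err)), z])
    return np.linalg.lstsq(Z, np.log(err**2 + 1e-12), rcond=None)[0][1]

def u_star_stat(e_star, Om_hat):
    # Bootstrap statistic (6): Omega* = Omega(theta*) from (e*, X), then
    # U* = c'(X'Omega*X)^(-1)X'Omega*e* / s with s^2 = c'(X'Omega_hat X)^(-1)c.
    Om_star = omega(estimate_theta(e_star))
    num = c @ np.linalg.solve(X.T @ Om_star @ X, X.T @ Om_star @ e_star)
    s = np.sqrt(c @ np.linalg.solve(X.T @ Om_hat @ X, c))
    return num / s

theta_hat = 0.8                               # taken as already computed from the data
Om_hat = omega(theta_hat)
cov_hat = np.linalg.inv(Om_hat)               # e* is N(0, Omega_hat^(-1))

b = 1000                                      # number of bootstrap samples
u_star = np.empty(b)
for i in range(b):
    e_star = rng.multivariate_normal(np.zeros(n), cov_hat)
    u_star[i] = u_star_stat(e_star, Om_hat)
print(u_star[:5])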

Let u_n and w_n be the observed values of U_n and W_n, and define

    p(u_n) = Pr(U_n > u_n)   and   p(w_n) = Pr(W_n > w_n).

The following theorem gives the accuracy of the bootstrap estimates of p values.

Theorem 1. Suppose that the regularity conditions discussed in section 2 are satisfied, so that the expansions (4) and (5) are valid. Define

    p*(u_n) = Pr(U*_n > u_n)   and   p*(w_n) = Pr(W*_n > w_n).

Then

    p*(u_n) = p(u_n) + o_p(n⁻¹),   p*(w_n) = p(w_n) + o_p(n⁻²),              (8)

so that for α in (0, 1),

    Pr[p*(u_n) ≤ α] = α + o(n⁻¹),   Pr[p*(w_n) ≤ α] = α + o(n⁻²).            (9)

Proof. Recall that for the bootstrap, the underlying population is normally distributed with precision matrix Ω̂. Thus, from (4) we have

    Pr(U*_n ≤ x) = Φ(x) + (A₁(x, θ̂)/n)φ(x) + o_p(n⁻¹).

Consequently,

    p*(u_n) = 1 - Φ(u_n) - (A₁(u_n, θ̂)/n)φ(u_n) + o_p(n⁻¹) = p(u_n) + o_p(n⁻¹),

with the second equality following from the assumed consistency of θ̂. Then

    Pr[p*(u_n) ≤ α] = U(α) + o(n⁻¹) = α + o(n⁻¹),

where U is the uniform distribution on (0, 1). The proof for p*(w_n) is similar.

In practice, the bootstrap estimates of p values can be approximated as accurately as desired by simulation. For example, let u*_{n,1}, ..., u*_{n,b} be the computed values of U*_n from b bootstrap samples (i.e. b random draws for e*). Then p*(u_n) may be estimated by #{u*_{n,i} > u_n}/b. Values of b on the order of 1000 usually provide sufficient accuracy.
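A short sketch of this count, assuming u_star collects the b bootstrap values of U*_n (for instance from the loop sketched earlier) and u_obs is the observed statistic; the names are illustrative.

import numpy as np

def bootstrap_p_value(u_star, u_obs):
    # #{u*_i > u_n}/b: proportion of bootstrap draws exceeding the observed statistic.
    return np.mean(u_star > u_obs)

# Example with placeholder draws (in practice u_star comes from the bootstrap loop):
u_star = np.random.default_rng(3).normal(size=1000)
print(bootstrap_p_value(u_star, 1.7))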

Alternatively, one may be interested in testing H₀: c'β = c'β₀ by using critical values instead of p values. To get the bootstrap estimates of critical values for a size α test, let (y, X) be given fixed data from (1) and define u*_{n,α} and w*_{n,α} to be the (1 - α)th quantiles of the distributions of U*_n and W*_n. Let u_{n,α} and w_{n,α} be the true critical values.

Theorem 2. Suppose that the regularity conditions discussed in section 2 are satisfied, so that the expansions (4) and (5) are valid. Then the bootstrap estimates approximate the true critical values with errors of smaller order in probability than n⁻¹ and n⁻², respectively. Thus,

    Pr(U_n > u*_{n,α}) = α + o(n⁻¹),   Pr(W_n > w*_{n,α}) = α + o(n⁻²).       (10)

Proof. Let z = Φ⁻¹(1 - α). From the definition of the bootstrap, we have

    u*_{n,α} = z - (A₁(z, θ̂)/n) + o_p(n⁻¹) = u_{n,α} + o_p(n⁻¹).

Therefore,

    Pr(U_n > u*_{n,α}) = Pr(U_n > u_{n,α}) + o(n⁻¹) = α + o(n⁻¹).

The proof for w*_{n,α} is similar. □

As with p value estimation, bootstrap critical values may be approximated as accurately as desired by simulation.
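Correspondingly, the bootstrap critical value is the (1 - α)th empirical quantile of the bootstrap draws; a small sketch under the same illustrative assumptions:

import numpy as np

def bootstrap_critical_value(u_star, alpha):
    # (1 - alpha)th quantile of the bootstrap distribution of U_n*.
    return np.quantile(u_star, 1.0 - alpha)

# Example with placeholder draws; reject H0 when the observed u_n exceeds u_crit.
u_star = np.random.default_rng(4).normal(size=1000)
u_crit = bootstrap_critical_value(u_star, 0.05)
print(u_crit)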

4. Conclusions

Bootstrapping a standardized estimator of the coefficients in normal linear regression models provides estimates of p values and critical values for hypothesis tests with errors of smaller order than n⁻¹.


Further, bootstrapping a variance-adjusted statistic provides tests with errors of smaller order than n⁻². This n⁻² step in accuracy is somewhat unusual, since the typical improvement from corrected statistics is of order n⁻¹/². To achieve the increased asymptotic accuracy of the variance-adjusted statistic, it is necessary to first derive the variance correction term from theoretical considerations. In many cases, that is reasonably easy to do. Alternatively, a double bootstrap (which amounts to simulating simulations) could be used, along the lines of Beran (1988).

It should be noted that U_n and W_n generally cannot be used for the construction of confidence intervals for c'β. This is because in most applications the θ parameters are unknown, so that U_n and W_n are usually not pivotal [Hall (1988)]. See Rayner (1988a) for details on bootstrapping 'Studentized' statistics that can be used for the construction of confidence intervals. Also see Rayner (1988b, 1989) for some results on bootstrapping sets of linear restrictions in the normal linear regression model.

As a final point, in some applications one might want to dispense with the assumption of normally distributed errors. In that case, one could use a non-parametric bootstrap to estimate the distribution of U_n by sampling with replacement from the residuals. The adjusted statistic, however, would take a form different from W_n.
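A sketch of such a non-parametric draw, resampling (centred) residuals with replacement in place of the normal draws used earlier; the centring and names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(5)

def nonparametric_draw(residuals):
    # Resample the (centred) residuals with replacement to form e*.
    r = residuals - residuals.mean()
    return rng.choice(r, size=r.shape[0], replace=True)

# Example with placeholder residuals from a fitted regression:
residuals = rng.normal(size=50)
e_star = nonparametric_draw(residuals)
print(e_star[:5])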

References

Beran, R., 1988, Prepivoting test statistics: A bootstrap view of asymptotic refinements, Journal of the American Statistical Association 83, 687-697.
Efron, B., 1979, Bootstrap methods: Another look at the jackknife, The Annals of Statistics 7, 1-26.
Hall, P., 1986, On the bootstrap and confidence intervals, The Annals of Statistics 14, 1431-1452.
Hall, P., 1988, Theoretical comparison of bootstrap confidence intervals, The Annals of Statistics 16, 927-953.
Rayner, R.K., 1988a, Bootstrapping test statistics and confidence intervals for regression models, Unpublished manuscript.
Rayner, R.K., 1988b, Bootstrapping test statistics in linear models with a nonscalar error covariance matrix, Unpublished manuscript.
Rayner, R.K., 1989, Bartlett's correction and the bootstrap in normal linear regression models, Economics Letters, forthcoming.
Rothenberg, T.J., 1984a, Approximate normality of generalized least squares estimates, Econometrica 52, 811-825.
Rothenberg, T.J., 1984b, Approximating the distributions of econometric estimators and test statistics, in: Z. Griliches and M. Intriligator, eds., Handbook of econometrics, Vol. 2 (North-Holland, Amsterdam).