On bootstrap and analytical bias corrections

Economics Letters 58 (1998) 7–15

Silvia L.P. Ferrari (a), Francisco Cribari-Neto (b,*)

(a) Departamento de Estatística, Universidade de São Paulo, Caixa Postal 66281, São Paulo/SP, 05315-970, Brazil
(b) Department of Economics, Southern Illinois University, Carbondale, IL 62901-4515, USA

Received 29 May 1997; accepted 1 October 1997

Abstract

This paper explores the relationship between bias correction of maximum likelihood estimators through bootstrap and analytical series expansions. It shows that the bootstrap provides a natural way of defining finite-sample bias-corrected maximum likelihood estimators in one-parameter models. Bootstrap-corrected estimators for multiparameter models are obtained. Simulation results comparing the finite-sample performance of bias-corrected maximum likelihood estimators based on the bootstrap and on analytical series expansions are also given. © 1998 Elsevier Science S.A.

Keywords: Bootstrap; Bias correction; Maximum likelihood estimation

JEL classification: C13

1. Introduction

It is well known that maximum likelihood estimates can be biased, and there have been a number of attempts to develop bias-correction techniques that can be applied in such cases. One approach consists of obtaining a closed-form expression for the bias of the maximum likelihood estimator (MLE) up to a certain order of accuracy, and then subtracting this bias from the MLE. This approach entails a great deal of algebra but has the nice feature that the final expressions are usually simple enough to be easily used by practitioners. An alternative approach is the use of a resampling technique, such as the bootstrap, a computer-based method of assessing and improving the accuracy of estimators and tests. The bootstrap was originally proposed by Efron (1979), and has been the focus of a number of recent review articles and monographs; see, e.g., Efron (1982); Efron and Tibshirani (1986), (1993); Hall (1992), (1994); González et al. (1994); Léger et al. (1992); Shao and Tu (1995); Young (1994). Its advocates argue that it avoids the need for messy algebraic derivations, thus offering a practical way of designing more accurate estimation and testing procedures. Advocates of the analytical approach, on the other hand, counter that the bootstrap requires intensive computer programming which, although simple for statisticians and econometricians, is not appealing to applied practitioners in other fields. It is thus interesting to ask: To

*Corresponding author. Tel.: +1 618 536 7746; fax: +1 618 453 2717; e-mail: [email protected]

0165-1765/98/$19.00 © 1998 Elsevier Science S.A. All rights reserved. PII S0165-1765(97)00276-0


what extent are the analytical and bootstrap approaches related? This is certainly an important question, and it has been addressed to some extent by Peter Hall in his comprehensive monograph (Hall, 1992). In this paper we focus on bias correction of the MLE of $\theta$, the parameter that indexes a given distribution. We show that there is a close connection between the bootstrap-based bias correction and the correction derived by Ferrari et al. (1996) using an analytical approach based on series expansions. The bootstrap then provides a natural way of defining the finite-sample bias correction. For example, Ferrari et al. (1996) define a third-order bias-corrected estimator using a coefficient that does not come from the expansion of the bias of the MLE, and it is not clear why such an estimator should be unbiased to third order. This becomes clear, however, when the derivation of the third-order bias-corrected estimator is done using the bootstrap approach, as we show in Section 2. The bootstrap-based approach also allows us to define new estimators obtained using a multiplicative bias correction. We further use the bootstrap approach to give general expressions for the third-order bias-corrected maximum likelihood estimator in multiparameter models. Again, the bootstrap provides a natural way of defining this estimator, since it automatically modifies the third-order term from the expansion of the bias of the uncorrected estimator so that the corrected estimator becomes unbiased to third order.

The remainder of the paper is organized as follows. The main result is stated in the next section. General expressions for bootstrap bias-corrected maximum likelihood estimators in the multiparameter case are given in Section 3. Simulation results comparing the finite-sample performance of the MLE and its bias-corrected versions based on the bootstrap and on analytical series expansions are given in Section 4. Section 5 closes the paper with a few concluding remarks.

2. The main result

Let $Y_1, \ldots, Y_n$ be a set of $n$ independent and identically distributed random variables, each having probability or density function of the form $f_0(y;\theta) = \exp\{t(y;\theta)\}$ and corresponding distribution function $F_0(\,\cdot\,;\theta)$, where $\theta$ is a scalar unknown parameter. Under mild regularity conditions, the bias and the variance of the MLE of $\theta$, say $\hat\theta$, can be written as

$$B(\theta) = \frac{B_1(\theta)}{n} + \frac{B_2(\theta)}{n^2} + O(n^{-3}) \qquad (1)$$

and

$$V(\theta) = \frac{V_1(\theta)}{n} + \frac{V_2(\theta)}{n^2} + O(n^{-3}), \qquad (2)$$

respectively, where $B_1(\theta)$, $B_2(\theta)$, $V_1(\theta)$ and $V_2(\theta)$ are functions of cumulants of log-likelihood derivatives with respect to $\theta$ for a single observation. That is, they are functions of quantities such as $\kappa_{\theta\theta} = E\{t''(y;\theta)\}$, $\kappa_{\theta\theta\theta} = E\{t'''(y;\theta)\}$, $\kappa_{\theta\theta,\theta\theta} = E[\{t''(y;\theta)\}^2] - \kappa_{\theta\theta}^2$ and $\kappa_{\theta\theta,\theta\theta\theta} = E\{t''(y;\theta)\,t'''(y;\theta)\} - \kappa_{\theta\theta}\kappa_{\theta\theta\theta}$, primes denoting derivatives with respect to $\theta$; see Shenton and Bowman (1977, pp. 44–47) and Ferrari et al. (1996). It is clear that, depending on the distribution $f_0$, it may be very difficult or even impossible to obtain $B_1(\theta)$, $B_2(\theta)$, $V_1(\theta)$ and $V_2(\theta)$. They can, however, be easily obtained for a


number of distributions. For one-parameter exponential family models, simple expressions for these quantities are given by Ferrari et al. (1996). Three bias-corrected MLEs for $\theta$ are given by

$$\tilde\theta_1 = \hat\theta - \frac{B_1(\hat\theta)}{n}, \qquad \tilde\theta_2^{\,*} = \hat\theta - \frac{B_1(\hat\theta)}{n} - \frac{B_2(\hat\theta)}{n^2} \qquad (3)$$

and

$$\tilde\theta_2 = \hat\theta - \frac{B_1(\hat\theta)}{n} - \frac{B_2^{*}(\hat\theta)}{n^2}, \qquad (4)$$

where

$$B_2^{*}(\theta) = B_2(\theta) - B_1(\theta)B_1'(\theta) - \frac{V_1(\theta)B_1''(\theta)}{2}. \qquad (5)$$

Ferrari et al. (1996) have shown that these three modified estimators are bias-free to order $n^{-1}$, but only $\tilde\theta_2$ has no bias to order $n^{-2}$. Note, however, that it is not transparent why $\tilde\theta_2$ should be unbiased to the third order of accuracy, since $B_2^{*}(\theta)$ is not the $n^{-2}$ term in the expansion of the bias of $\hat\theta$. In what follows we show that the modified estimators $\tilde\theta_1$ and $\tilde\theta_2$ can be obtained in a natural way using bootstrap arguments.

We shall follow the notation in Hall (1992). The parameter of interest is written as the functional $\theta = \theta(F_0)$, and $\hat\theta = \theta(F_1)$ denotes the MLE of $\theta$ from a sample $Y_1, \ldots, Y_n$. The MLE of $\theta$ from a sample generated from $F_0$ with $\theta = \hat\theta$ is denoted by $\theta^* = \theta(F_2)$, and the MLE of $\theta$ from a sample generated from $F_0$ with $\theta = \theta^*$ is denoted by $\theta^{**} = \theta(F_3)$. Expectations conditional on $F_0$, $F_1$ and $F_2$ are taken with respect to the distribution $F_0$ with, respectively, $\theta$ as the true parameter value, $\theta = \hat\theta$, and $\theta = \theta^*$. As argued by Hall (1992) and others, much of statistical inference amounts to describing the relationship between a sample and the population from which it was generated. Formally, given a functional $f_t \in \{f_t : t \in \mathcal{T}\}$, we wish to determine the value of $t$ that solves the 'population equation'

$$E\{f_t(F_0, F_1) \mid F_0\} = 0. \qquad (6)$$

In fact, correcting $\hat\theta$ for bias is equivalent to finding the $t$ that solves Eq. (6) with $f_t(F_0, F_1) = \theta(F_1) - \theta(F_0) + t$ if an additive correction is desired, or $f_t(F_0, F_1) = (1 + t)\theta(F_1) - \theta(F_0)$ if a multiplicative correction is to be used. In order to obtain an approximate solution to the population equation, the bootstrap method replaces $(F_0, F_1)$ in Eq. (6) by $(F_1, F_2)$, thus yielding the 'sample equation'

$$E\{f_t(F_1, F_2) \mid F_1\} = 0.$$
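As a concrete instance of the analytical corrections in Eq. (3), consider an example not from the paper: for the exponential distribution with rate $\lambda$, the MLE is $\hat\lambda = 1/\bar y$ and $E[\hat\lambda] = n\lambda/(n-1) = \lambda(1 + n^{-1} + n^{-2} + \cdots)$, so $B_1(\lambda) = \lambda$ and the corrected estimator $\tilde\lambda_1 = \hat\lambda(1 - 1/n)$ is in this case exactly unbiased. A minimal Monte Carlo check (sample size, rate and seed are our choices):

```python
import numpy as np

rng = np.random.default_rng(42)
lam, n, reps = 2.0, 10, 20000

# `reps` exponential samples of size n with rate lam (numpy parameterizes by the mean 1/lam)
y = rng.exponential(scale=1.0 / lam, size=(reps, n))

mle = 1.0 / y.mean(axis=1)          # hat{lambda} = 1 / ybar
corrected = mle * (1.0 - 1.0 / n)   # tilde{lambda}_1 = hat - B_1(hat)/n, with B_1(lambda) = lambda

print(mle.mean() - lam)        # bias of the MLE, roughly lam/(n-1) ≈ 0.22 here
print(corrected.mean() - lam)  # bias of the corrected estimator, near 0
```

The same Monte Carlo layout, with the correction term swapped, can be used to check any of the estimators in Eqs. (3)–(4) for a model with known $B_1$ and $B_2$.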


This procedure can be repeated iteratively; see Hall (1992, §1.4). It can be shown that, after $j$ iterations, the bias-corrected MLE of $\theta$, say $\hat\theta_j$, is given by

$$\hat\theta_j = \sum_{i=1}^{j+1} \binom{j+1}{i} (-1)^{i+1}\, E\{\theta(F_i) \mid F_1\}$$

for $j \ge 1$, when additive corrections are used in each iteration; see Hall (1992, Theorem 1.3) and Hall and Martin (1988). Then,

$$\hat\theta_1 = 2\hat\theta - E\{\theta^* \mid F_1\} \qquad (7)$$

and

$$\hat\theta_2 = 3\hat\theta - 3E\{\theta^* \mid F_1\} + E\{\theta^{**} \mid F_1\}. \qquad (8)$$

The evaluation of Eq. (7) by simulation requires a number $N$, usually large, of simulated samples used to estimate $E\{\theta^* \mid F_1\}$, where the samples are drawn from the distribution $F_0$ with $\theta = \hat\theta$. $N^2$ simulated samples are necessary to estimate $E\{\theta^{**} \mid F_1\}$ using the two-iteration bootstrap.

We shall now use the first- and second-order iterated bootstrap bias-corrected estimates in Eq. (7) and Eq. (8) to present a natural way of obtaining the bias-corrected estimates $\tilde\theta_1$ and $\tilde\theta_2$ in Eq. (3) and Eq. (4), respectively. Our results will follow by formal expansions without explicit attention to the underlying regularity conditions, these being essentially the same as those required for the expansions needed for maximum likelihood theory in regular problems.

First, consider the one-iteration bootstrap estimate $\hat\theta_1$ in Eq. (7). Since $E\{\theta^* \mid F_1\} = \hat\theta + B(\hat\theta)$, it follows that $\hat\theta_1 = \hat\theta - B(\hat\theta)$. Then, we have from Eq. (1) that, to stochastic order $n^{-1}$,

$$\hat\theta_1 = \hat\theta - \frac{B_1(\hat\theta)}{n},$$

which is the second-order bias-corrected estimate given in Eq. (3). Next, consider the two-iteration bootstrap estimate $\hat\theta_2$ in Eq. (8). We can write
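The simulation evaluation of Eq. (7) just described can be sketched in a few lines. The following is our own illustration, not the paper's code: a parametric bootstrap with $N$ replications applied to the normal-variance MLE, whose downward bias is known exactly, so the effect of the correction can be checked.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 10, 2000

y = rng.normal(loc=0.0, scale=1.0, size=n)

def var_mle(x):
    # MLE of the variance (divides by n, hence biased downward by sigma^2/n)
    return ((x - x.mean()) ** 2).mean()

theta_hat = var_mle(y)

# Parametric bootstrap: N samples from the fitted model F_0 with theta = theta_hat
boot = np.array([var_mle(rng.normal(y.mean(), np.sqrt(theta_hat), size=n))
                 for _ in range(N)])

# One-iteration additive correction, Eq. (7): theta_1 = 2*theta_hat - E{theta* | F_1}
theta_1 = 2.0 * theta_hat - boot.mean()

# For this model E{theta* | F_1} = (n-1)/n * theta_hat,
# so theta_1 comes out close to theta_hat * (n+1)/n
print(theta_hat, theta_1)
```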

$$\begin{aligned} \hat\theta_2 &= 3\hat\theta - 3\{\hat\theta + B(\hat\theta)\} + E\{E\{\theta^{**} \mid F_2\} \mid F_1\} \\ &= -3B(\hat\theta) + E\{\theta^* + B(\theta^*) \mid F_1\} \\ &= -3B(\hat\theta) + \hat\theta + B(\hat\theta) + E\{B(\theta^*) \mid F_1\} \\ &= \hat\theta - 2B(\hat\theta) + E\{B(\theta^*) \mid F_1\}. \end{aligned}$$

Expanding $B(\theta^*)$ around $\hat\theta$ in a Taylor series we have that

$$B(\theta^*) = B(\hat\theta) + (\theta^* - \hat\theta)B'(\hat\theta) + \frac{(\theta^* - \hat\theta)^2 B''(\hat\theta)}{2} + \frac{(\theta^* - \hat\theta)^3 B'''(\hat\theta)}{6} + \cdots,$$

where the terms of this expansion have stochastic orders $n^{-1}$, $n^{-3/2}$, $n^{-2}$, $n^{-5/2}, \ldots\,$, respectively. We have that $E\{(\theta^* - \hat\theta) \mid F_1\} = B(\hat\theta)$ and that $E\{(\theta^* - \hat\theta)^2 \mid F_1\} = B(\hat\theta)^2 + V(\hat\theta)$. It then follows that

$$\hat\theta_2 = \hat\theta - 2B(\hat\theta) + B(\hat\theta) + B(\hat\theta)B'(\hat\theta) + \frac{B''(\hat\theta)\{V(\hat\theta) + B(\hat\theta)^2\}}{2}$$

to stochastic order $n^{-2}$. Now, from Eq. (1) and Eq. (2) we have that, to this order,
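The two-iteration estimate in Eq. (8) can likewise be approximated by a nested (double) parametric bootstrap. Below is a sketch under our own choices of model (normal variance, as above) and replication counts; for this model $E\{\theta^* \mid F_1\} = \frac{n-1}{n}\hat\theta$, so $\hat\theta_2$ should come out near $\hat\theta\,(n^2 + n + 1)/n^2$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, N_outer, N_inner = 10, 1000, 100

y = rng.normal(size=n)

def var_mle(x):
    # Biased variance MLE (divides by n)
    return ((x - x.mean()) ** 2).mean()

def draw(mean, var, size):
    # Sample from the fitted parametric model
    return rng.normal(mean, np.sqrt(var), size=size)

theta_hat = var_mle(y)

# First level: theta* from F_0 with theta = theta_hat
theta_star = np.empty(N_outer)
theta_sstar_mean = np.empty(N_outer)
for i in range(N_outer):
    ys = draw(y.mean(), theta_hat, n)
    theta_star[i] = var_mle(ys)
    # Second level: theta** from F_0 with theta = theta*
    theta_sstar_mean[i] = np.mean(
        [var_mle(draw(ys.mean(), theta_star[i], n)) for _ in range(N_inner)])

# Two-iteration correction, Eq. (8)
theta_2 = 3 * theta_hat - 3 * theta_star.mean() + theta_sstar_mean.mean()
print(theta_2 / theta_hat)
```

The $N \times N$ nesting is exactly why the paper notes that the two-iteration bootstrap is expensive: here each outer replication spawns its own inner resampling loop.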


$$\hat\theta_2 = \hat\theta - \frac{B_1(\hat\theta)}{n} - \frac{B_2(\hat\theta)}{n^2} + \frac{B_1(\hat\theta)B_1'(\hat\theta)}{n^2} + \frac{V_1(\hat\theta)B_1''(\hat\theta)}{2n^2}$$

or

$$\hat\theta_2 = \hat\theta - \frac{B_1(\hat\theta)}{n} - \frac{B_2^{*}(\hat\theta)}{n^2},$$

where $B_2^{*}(\hat\theta)$ is given in Eq. (5); this is the third-order bias-corrected MLE in Eq. (4). It is noteworthy that, by obtaining this bias-corrected estimator using bootstrap arguments, it becomes transparent that the $n^{-2}$ term in the expansion of the bias of $\hat\theta$ should not be used in the correction; a modified version of it, involving the $n^{-1}$ term of the expansion for the variance of $\hat\theta$, is required. The bootstrap therefore provides a natural way of defining bias corrections of maximum likelihood estimators.

The bootstrap-based bias-reduction approach also allows us to define bias-corrected MLEs using a multiplicative correction. After $j \ge 1$ iterations, the (multiplicative) bias-corrected MLE of $\theta$, say $\bar\theta_j$, is given by (Hall, 1992)

$$\bar\theta_j = u_j(F_1) = \frac{\theta(F_1)\,u_{j-1}(F_1)}{E\{u_{j-1}(F_2) \mid F_1\}},$$

where $u_0(F_1) = \theta(F_1) = \hat\theta$ and $u_0(F_2) = \theta(F_2) = \theta^*$. We then have that

$$\bar\theta_1 = \frac{\hat\theta^2}{\hat\theta + B(\hat\theta)}$$

and

$$\bar\theta_2 = \frac{\hat\theta^3}{\{\hat\theta + B(\hat\theta)\}\,E\{\theta^{*2}/(\theta^* + B(\theta^*)) \mid F_1\}}.$$

It can be shown, after some algebra, that, if $\hat\theta \ne 0$ a.s.,

$$\bar\theta_1 = \frac{\hat\theta^2}{\hat\theta + \dfrac{B_1(\hat\theta)}{n}}$$

and

$$\bar\theta_2 = \frac{\hat\theta^2}{\hat\theta + \dfrac{B_1(\hat\theta)}{n} + \dfrac{B_2^{**}(\hat\theta)}{n^2}},$$

where $B_2^{**}(\theta) = B_2(\theta) - B_1(\theta)B_1'(\theta)$, to stochastic orders $n^{-1}$ and $n^{-2}$, respectively. These two corrected estimates are unbiased to orders $n^{-1}$ and $n^{-2}$, respectively. It is noteworthy that the $n^{-2}$ term from the expansion of the bias of $\hat\theta$ had to be modified again, this time to $B_2^{**}(\theta)$, in order to


deliver an estimator unbiased to third order. Again, this is accounted for naturally by the bootstrap approach.
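The one-iteration multiplicative correction has a particularly simple bootstrap form, since $\bar\theta_1 = \hat\theta^2/\{\hat\theta + B(\hat\theta)\} = \hat\theta^2 / E\{\theta^* \mid F_1\}$. Below is a sketch with our own normal-variance example again; the estimand is a variance, hence bounded away from zero, so the near-zero instability of multiplicative corrections does not arise here.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 10, 2000

y = rng.normal(size=n)

def var_mle(x):
    # Biased variance MLE (divides by n)
    return ((x - x.mean()) ** 2).mean()

theta_hat = var_mle(y)

# Estimate E{theta* | F_1} by parametric bootstrap
boot_mean = np.mean([var_mle(rng.normal(y.mean(), np.sqrt(theta_hat), size=n))
                     for _ in range(N)])

# Multiplicative one-iteration correction:
# bar_theta_1 = theta_hat^2 / (theta_hat + B(theta_hat)) = theta_hat^2 / E{theta* | F_1}
bar_theta_1 = theta_hat ** 2 / boot_mean

# Here E{theta* | F_1} = (n-1)/n * theta_hat, so bar_theta_1 ≈ theta_hat * n/(n-1)
print(theta_hat, bar_theta_1)
```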

3. The multiparameter extension

There are a number of published papers that obtain analytical bias corrections for multiparameter models up to order $n^{-1}$. For instance, Cordeiro and McCullagh (1991) have derived analytical bias corrections up to the second order for maximum likelihood estimators in generalized linear models, and Cordeiro and Klein (1994) have obtained similar corrections for ARMA models. The bootstrap approach used in Section 2 to obtain a general specification for second- and third-order bias-corrected maximum likelihood estimators can be generalized to cover the multiparameter case, where $\theta$ is a $p$-vector of parameters, that is, $\theta = (\theta_1, \ldots, \theta_p)^T$. The second-order bias-corrected estimator of the $j$th element of $\theta$ is given by $\hat\theta_1^{(j)} = \hat\theta^{(j)} - n^{-1}B_1^{(j)}(\hat\theta)$, for $j = 1, \ldots, p$. In order to obtain an expression for the maximum likelihood estimator bias-corrected up to the third order, we need to expand $B^{(j)}(\theta^*)$ around $\hat\theta$ as

$$B^{(j)}(\theta^*) = B^{(j)}(\hat\theta) + \sum_{i=1}^{p} (\theta^*_{(i)} - \hat\theta_{(i)}) \frac{\partial B^{(j)}(\hat\theta)}{\partial \theta_{(i)}} + \frac{1}{2} \sum_{i=1}^{p} \sum_{k=1}^{p} (\theta^*_{(i)} - \hat\theta_{(i)})(\theta^*_{(k)} - \hat\theta_{(k)}) \frac{\partial^2 B^{(j)}(\hat\theta)}{\partial \theta_{(i)}\, \partial \theta_{(k)}} + \cdots,$$

where $\hat\theta_{(i)}$ and $\theta^*_{(i)}$ denote the $i$th elements of the vectors $\hat\theta$ and $\theta^*$, respectively, and $B^{(j)}(\theta)$ represents the $j$th element of the bias vector $B(\theta)$. Since Eq. (8) holds for each element of the parameter vector $\theta$, it can be shown that

$$\hat\theta_2^{(j)} = \hat\theta_{(j)} - B^{(j)}(\hat\theta) + \sum_{i=1}^{p} \frac{\partial B^{(j)}(\hat\theta)}{\partial \theta_{(i)}}\, B^{(i)}(\hat\theta) + \frac{1}{2} \sum_{i=1}^{p} \sum_{k=1}^{p} \frac{\partial^2 B^{(j)}(\hat\theta)}{\partial \theta_{(i)}\, \partial \theta_{(k)}}\, C_{ik}(\hat\theta),$$

where $\hat\theta_2^{(j)}$ denotes the $j$th element of $\hat\theta_2$ and $C_{ik}(\hat\theta) = E\{(\theta^*_{(i)} - \hat\theta_{(i)})(\theta^*_{(k)} - \hat\theta_{(k)}) \mid F_1\}$. That is,

$$\begin{aligned} C_{ik}(\hat\theta) &= E\{(\theta^*_{(i)} - E\{\theta^*_{(i)} \mid F_1\})(\theta^*_{(k)} - E\{\theta^*_{(k)} \mid F_1\}) \mid F_1\} + (E\{\theta^*_{(i)} \mid F_1\} - \hat\theta_{(i)})(E\{\theta^*_{(k)} \mid F_1\} - \hat\theta_{(k)}) \\ &= \operatorname{Cov}(\theta^*_{(i)}, \theta^*_{(k)} \mid F_1) + B^{(i)}(\hat\theta)B^{(k)}(\hat\theta) = \operatorname{Cov}(\hat\theta_{(i)}, \hat\theta_{(k)}) + B^{(i)}(\hat\theta)B^{(k)}(\hat\theta), \end{aligned}$$

where $\operatorname{Cov}(\hat\theta_{(i)}, \hat\theta_{(k)})$ denotes the covariance between $\hat\theta_{(i)}$ and $\hat\theta_{(k)}$ given $F_0$ and evaluated at $\hat\theta$. Under mild regularity conditions, the covariance of $\hat\theta$ can be expanded as (Shenton and Bowman, 1977, §3.4)

$$\operatorname{Cov}(\theta) = \frac{\operatorname{Cov}_1(\theta)}{n} + \frac{\operatorname{Cov}_2(\theta)}{n^2} + O(n^{-3}).$$

It then follows that

$$\hat\theta_2^{(j)} = \hat\theta_{(j)} - \frac{B_1^{(j)}(\hat\theta)}{n} - \frac{B_2^{*(j)}(\hat\theta)}{n^2},$$

where

$$B_2^{*(j)}(\hat\theta) = B_2^{(j)}(\hat\theta) - \sum_{i=1}^{p} \frac{\partial B_1^{(j)}(\hat\theta)}{\partial \theta_{(i)}}\, B_1^{(i)}(\hat\theta) - \frac{1}{2} \sum_{i=1}^{p} \sum_{k=1}^{p} \frac{\partial^2 B_1^{(j)}(\hat\theta)}{\partial \theta_{(i)}\, \partial \theta_{(k)}}\, \operatorname{Cov}_1^{(ik)}(\hat\theta).$$

Here, $\operatorname{Cov}_1^{(ik)}(\hat\theta)$ is the $n^{-1}$ term from the expansion of $\operatorname{Cov}(\hat\theta_{(i)}, \hat\theta_{(k)})$ evaluated at $\hat\theta$. In matrix notation,

$$B_2^{*(j)} = B_2^{(j)}(\hat\theta) - d^{(j)}(\hat\theta)B_1(\hat\theta) - \frac{1}{2} \operatorname{tr}\{D^{(j)}(\hat\theta)C(\hat\theta)\},$$

where

$$d^{(j)}(\theta) = \frac{\partial B_1^{(j)}(\theta)}{\partial \theta}, \qquad D^{(j)}(\theta) = \frac{\partial^2 B_1^{(j)}(\theta)}{\partial \theta\, \partial \theta^T}, \qquad C(\theta) = \operatorname{Cov}_1(\theta).$$

We can also write, in vectorized form,

$$\hat\theta_2 = \hat\theta - \frac{B_1(\hat\theta)}{n} - \frac{B_2^{*}(\hat\theta)}{n^2},$$

where

$$B_2^{*}(\hat\theta) = B_2(\hat\theta) - d^{T}(\hat\theta)B_1(\hat\theta) - \frac{1}{2}R(\hat\theta),$$

with $d(\theta) = (d^{(1)}(\theta), \ldots, d^{(p)}(\theta))$ and $R(\theta) = (\operatorname{tr}\{D^{(1)}(\theta)C(\theta)\}, \ldots, \operatorname{tr}\{D^{(p)}(\theta)C(\theta)\})^T$.

Some remarks are in order. First, the bootstrap again provides a natural way of obtaining the general form of the third-order bias-corrected estimator, since this estimator involves $B_2^{*}(\hat\theta)$ instead of $B_2(\hat\theta)$. Second, for specific models one can obtain closed-form expressions for the $n^{-1}$ and $n^{-2}$ terms in the expression given above for $\hat\theta_2$ using an analytical approach. Alternatively, this third-order bias-corrected MLE can be obtained using a two-iteration bootstrap scheme. The two approaches are equivalent up to order $n^{-2}$.
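When closed forms for the derivatives of $B_1$ are unavailable, the matrix-form term $B_2^{*(j)} = B_2^{(j)} - d^{(j)}B_1 - \tfrac12\operatorname{tr}\{D^{(j)}C\}$ can be evaluated numerically. The sketch below uses central finite differences; the functions `B1`, `B2` and `Cov1` are toy inputs invented purely for illustration.

```python
import numpy as np

def d_jth(B1, theta, j, h=1e-5):
    # Gradient of the j-th component of B1 by central differences
    p = len(theta)
    g = np.zeros(p)
    for i in range(p):
        e = np.zeros(p); e[i] = h
        g[i] = (B1(theta + e)[j] - B1(theta - e)[j]) / (2 * h)
    return g

def D_jth(B1, theta, j, h=1e-4):
    # Hessian of the j-th component of B1 by central differences
    p = len(theta)
    H = np.zeros((p, p))
    for i in range(p):
        for k in range(p):
            ei = np.zeros(p); ei[i] = h
            ek = np.zeros(p); ek[k] = h
            H[i, k] = (B1(theta + ei + ek)[j] - B1(theta + ei - ek)[j]
                       - B1(theta - ei + ek)[j] + B1(theta - ei - ek)[j]) / (4 * h * h)
    return H

def B2_star(B1, B2, Cov1, theta):
    # B2*^(j) = B2^(j) - d^(j) . B1 - (1/2) tr{D^(j) C}, for each j
    b1, b2, C = B1(theta), B2(theta), Cov1(theta)
    return np.array([b2[j] - d_jth(B1, theta, j) @ b1
                     - 0.5 * np.trace(D_jth(B1, theta, j) @ C)
                     for j in range(len(theta))])

# Toy inputs (invented): B1(theta) = (theta_1^2, theta_1*theta_2), B2 = 0, Cov_1 = I
B1 = lambda th: np.array([th[0] ** 2, th[0] * th[1]])
B2 = lambda th: np.zeros(2)
Cov1 = lambda th: np.eye(2)

out = B2_star(B1, B2, Cov1, np.array([1.0, 1.0]))
print(out)  # ≈ [-3, -2] for this toy B1
```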

4. Simulation results

This section presents Monte Carlo simulation results comparing the performance of the MLE $\hat\theta$ and of $\tilde\theta_1$, $\tilde\theta_2$, $\bar\theta_1$, $\bar\theta_2$, and the additive and multiplicative bootstrap-based bias-corrected MLEs, say $\check\theta_a$ and $\check\theta_m$, respectively. The bootstrap bias-corrected estimators are based on one iteration. All results are based on 5000 replications, and the bootstrap-based bias-corrected MLEs are computed using $N = 200$ bootstrap replications; each simulation experiment thus entails one million total replications. (We did not consider two-iteration bootstrap estimators since they would require 200 million total replications for the entire simulation.) The bootstrap estimates were computed using the nonparametric bootstrap, with the resampling procedure based on the empirical distribution function, since this form of the bootstrap is more widely used by practitioners. We consider an extreme value distribution with parameters $\theta$ and $\phi$,


Table 1
Simulation results

  θ    n       θ̂             θ̃1            θ̃2            θ̄1             θ̄2             θ̌a            θ̌m
           Mean    MSE   Mean    MSE   Mean    MSE   Mean     MSE   Mean     MSE   Mean    MSE   Mean      MSE
 −5    5   −3.9   23.2   −4.9   22.1   −5.0   22.1   −5.4   729.6   −5.2   221.6   −4.7   22.7   −4.8     48.2
      10   −4.4   11.0   −4.9   10.6   −5.0   10.6   −5.0    11.8   −5.0    14.9   −4.9   10.8   −5.0     16.8
      20   −4.7    5.1   −5.0    5.0   −5.0    5.0   −4.9    18.0   −5.0     6.4   −4.9    5.0   −5.0      5.1
      30   −4.8    3.4   −5.0    3.3   −5.0    3.3   −5.0     3.3   −5.0     3.3   −4.9    3.4   −5.0      3.4
      40   −4.8    2.5   −5.0    2.5   −5.0    2.5   −5.0     2.5   −5.0     2.5   −5.0    2.5   −5.0      2.9
  0    5    1.1   23.2    0.1   22.1    0.0   22.1   −0.1    92.3   −0.2   262.4    0.3   22.7    0.3    261.7
      10    0.6   11.0    0.1   10.6    0.1   10.6   −0.3  2060.8    1.0  1721.6    0.1   10.8   −0.2   1952.4
      20    0.3    5.1    0.0    5.0    0.0    5.0    0.1    28.6    0.2   152.7    0.1    5.0    0.1     11.2
      30    0.2    3.4    0.1    3.3    0.1    3.3    0.0    23.2   −0.0    30.0    0.1    3.4    0.1     54.9
      40    0.2    2.5    0.0    2.5    0.0    2.5    0.0     3.5    0.0     3.3    0.1    2.5   −2.6  36022.4
 10    5   11.1   23.2   10.1   22.1   10.0   22.1   10.2    21.6   10.1    21.5   10.3   22.7   10.4     22.1
      10   10.6   11.0   10.1   10.7   10.1   10.6   10.1    10.6   10.1    10.6   10.1   10.8   10.2     10.7
      20   10.3    5.1   10.0    5.0   10.0    5.0   10.1     5.0   10.1     5.0   10.1    5.0   10.1      5.0
      30   10.2    3.4   10.1    3.3   10.1    3.3   10.1     3.3   10.1     3.3   10.1    3.4   10.1      3.4
      40   10.2    2.5   10.0    2.5   10.0    2.5   10.1     2.5   10.1     2.5   10.1    2.5   10.1      2.5

where $-\infty < \theta < \infty$, the location parameter, is unknown and $\phi > 0$, the scale parameter, is known, in which case $B_1(\theta) = \phi/2$, $B_2(\theta) = \phi/12$, $B_2^{*}(\theta) = \phi/12$ and $B_2^{**}(\theta) = \phi/12$. The simulations were performed using $\theta = -5, 0, 10$ and $\phi = 10$. The mean values and mean squared errors of all estimates are given in Table 1.

The figures in Table 1 show that the small-sample bias of the MLE can be quite pronounced. For example, when $n = 5$ and $\theta = 0$, $\hat\theta$ was on average equal to 1.1. When $\theta = -5$, the mean values of $\hat\theta$ were $-3.9$ and $-4.4$ for $n = 5$ and $n = 10$, respectively. Also, the additive bias correction proves more reliable, since the corrected estimators obtained in a multiplicative way can display quite large mean squared errors when $\theta$ is close to zero. For instance, when $\theta = 0$ and $n = 40$, the mean squared error of $\check\theta_m$ was over 14,000 times greater than that of $\hat\theta$. Overall, the third-order additive analytical bias-corrected estimator seems to be the most reliable of all the estimators considered. Indeed, $\tilde\theta_2$ displayed nearly zero bias in all cases, even with quite small sample sizes.

5. Concluding remarks

As shown by Hall and Martin (1988), the iterated bootstrap leads to approximations of successively higher orders of accuracy. In this paper we have shown that the iterated bootstrap provides a natural way to obtain the general coefficients of analytical bias corrections based on series expansions. We have also used the bootstrap to propose two new corrected estimators obtained in a multiplicative fashion. Our simulation results show that both analytical and bootstrap bias corrections can be quite effective. They also show that multiplicative corrections can in some cases display poor performance when the maximum likelihood estimate is close to zero. Such corrections should be avoided


since their possible gains are small but their possible losses are large. A general form for the third order bias correction of maximum likelihood estimates in the multiparameter case was also obtained. Closed-form expressions for its coefficients can be obtained using an analytical approach. A similar level of accuracy can be achieved by the two-iteration bootstrap.
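The Section 4 experiment is straightforward to reproduce in outline. For the extreme value (Gumbel) distribution with known scale $\phi$, the location MLE has the closed form $\hat\theta = -\phi \log\{n^{-1}\sum_i \exp(-y_i/\phi)\}$ (a standard fact, not derived in the paper). Below is our own minimal sketch, covering only the analytical correction $\tilde\theta_1 = \hat\theta - \phi/(2n)$ and fewer cases than Table 1.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, phi, reps = 0.0, 10.0, 5000

results = {}
for n in (5, 10, 20):
    # Extreme value (Gumbel) samples; with phi known, the location MLE is closed-form:
    # theta_hat = -phi * log( mean_i exp(-y_i / phi) )
    y = rng.gumbel(loc=theta, scale=phi, size=(reps, n))
    theta_hat = -phi * np.log(np.mean(np.exp(-y / phi), axis=1))
    theta_1 = theta_hat - (phi / 2) / n   # tilde_theta_1, using B_1(theta) = phi/2
    results[n] = (theta_hat.mean(), theta_1.mean())
    print(n, results[n])  # MLE bias near phi/(2n); corrected bias near 0
```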

Acknowledgements

We wish to thank Denise Botter and Roger Koenker for comments on a previous version of this paper. The paper was concluded while the second author was visiting the University of Illinois. The first author acknowledges financial support from CNPq/Brazil, and the second author gratefully acknowledges a research grant from the Office of Research Development Administration at Southern Illinois University.

References

Cordeiro, G.M., Klein, R., 1994. Bias correction in ARMA models. Statistics and Probability Letters 19, 169–176.
Cordeiro, G.M., McCullagh, P., 1991. Bias correction in generalized linear models. Journal of the Royal Statistical Society B 53, 629–643.
Efron, B., 1979. Bootstrap methods: another look at the jackknife. Annals of Statistics 7, 1–26.
Efron, B., 1982. The Jackknife, the Bootstrap and Other Resampling Plans. SIAM, Philadelphia.
Efron, B., Tibshirani, R.J., 1986. Bootstrap methods for standard errors, confidence intervals, and other measures of statistical accuracy. Statistical Science 1, 54–96.
Efron, B., Tibshirani, R.J., 1993. An Introduction to the Bootstrap. Chapman and Hall, New York.
Ferrari, S.L.P., Botter, D.A., Cordeiro, G.M., Cribari-Neto, F., 1996. Second and third order bias reduction for one-parameter family models. Statistics and Probability Letters 30, 339–345.
González Manteiga, W., Prada Sánchez, J.M., Romo, J., 1994. The bootstrap: a review. Computational Statistics 9, 165–205.
Hall, P., 1992. The Bootstrap and the Edgeworth Expansion. Springer-Verlag, New York.
Hall, P., 1994. Methodology and theory for the bootstrap. In: Engle, R.F., McFadden, D.L. (Eds.), Handbook of Econometrics, vol. 4. Elsevier Science, Amsterdam.
Hall, P., Martin, M.A., 1988. On bootstrap resampling and iteration. Biometrika 75, 661–671.
Léger, C., Politis, D.N., Romano, J.P., 1992. Bootstrap technology and applications. Technometrics 34, 378–398.
Shao, J., Tu, D., 1995. The Jackknife and Bootstrap. Springer-Verlag, New York.
Shenton, L.R., Bowman, K.O., 1977. Maximum Likelihood Estimation in Small Samples. Macmillan, New York.
Young, G.A., 1994. Bootstrap: more than a stab in the dark? Statistical Science 9, 382–415.