Statistics & Probability Letters 67 (2004) 299 – 306
First-order seasonal autoregressive processes with periodically varying parameters

I.V. Basawa^a, Robert Lund^{a,∗}, Qin Shao^b

^a Department of Statistics, The University of Georgia, Athens, GA 30602-1952, USA
^b Department of Mathematics, The University of Toledo, Toledo, OH 43606-3390, USA

Accepted February 2004
Abstract

A time series model combining a first-order periodic autoregressive structure and the Box–Jenkins multiplicative seasonal autoregressive model is introduced. Stationarity conditions (in the periodic sense) for this so-called SPAR(1,1) process are established and its autocovariances are derived. Least-squares estimates of the model parameters are obtained and their limit distribution is derived. An extension to higher-order SPARMA models is suggested.
© 2004 Elsevier B.V. All rights reserved.

Keywords: Autoregression; Least-squares estimation; Limit distributions; Periodic time series; Seasonality
1. Introduction

Seasonal time series models have been extensively explored in the literature. The classical Box–Jenkins SARMA model (Box et al., 1994) for seasonal time series combines two types of dependency: (a) dependence of consecutive observations within a period and (b) dependence of observations between periods. For instance, in monthly unemployment data, one may be interested in the correlations between successive monthly unemployments, and also in the correlations between unemployments in the same month (say January) of successive years. If $X_{nT+\nu}$ denotes the observation during the $\nu$th month of the $n$th year, correlations between $X_{nT+\nu}$ and $X_{nT+\nu-1}$ (successive months) as well as the correlations between $X_{nT+\nu}$ and $X_{(n-1)T+\nu}$ (same month in successive years) may merit study. In this example, the period is $T = 12$. In the Box–Jenkins model, parameters do not vary with the season $\nu$. Consequently, the correlations in a stationary Box–Jenkins seasonal model do not vary with season.
∗ Corresponding author.
E-mail address: [email protected] (R. Lund).

0167-7152/$ - see front matter © 2004 Elsevier B.V. All rights reserved.
doi:10.1016/j.spl.2004.02.001
On the other hand, there are many applications where model parameters need to vary periodically (i.e. depend on $\nu$) to adequately describe the series. This has led to the study of so-called periodic time series models, which have by now enjoyed applications in climatology (Hannan, 1955; Monin, 1963; Jones and Brelsford, 1967; Bloomfield et al., 1994), economics (Parzen and Pagano, 1979), hydrology (Vecchia, 1985a, b), and electrical engineering (Gardner and Franks, 1975), amongst other disciplines. Both Box–Jenkins seasonal models and periodic time series models lean heavily on the autoregressive moving-average (ARMA) difference equation paradigm. Periodic ARMA (PARMA) models, however, typically do not seek to model between-period (i.e. seasonal) dependencies explicitly. In this paper, we introduce a new model by allowing the classical Box–Jenkins seasonal model parameters to vary periodically. This model class contains SARMA and PARMA models as special cases. To keep the presentation simple, we focus primarily on the first-order autoregressive setting. Cleveland and Tiao (1979) and Tiao and Grupe (1980) are good general references for early related work. Lund and Basawa (2000) and Basawa and Lund (2001) are recent accounts of inference problems in PARMA time series.

The rest of this paper proceeds as follows. Seasonal autoregressive processes with periodically varying parameters (SPAR) are introduced in Section 2. Periodic causality and stationarity conditions and the autocovariances of the SPAR model are derived there. Section 3 is concerned with parameter estimation and derives the limit distribution of the least-squares estimators. An extension to general seasonal periodic autoregressive moving-average (SPARMA) models is suggested in Section 4.

A comment on the viewpoint taken here is worth making a priori.
In the same way that seasonal ARMA (SARMA) models can be viewed as special cases of standard ARMA models, SPARMA models can be viewed as special cases of standard PARMA models. Consequently, the basic properties of SPAR models in this paper could be derived from those of PAR models (Pagano, 1978; Troutman, 1979). However, analysis may be facilitated by proceeding more directly.
2. Seasonal autoregressions with periodic parameters

Consider the time series $\{X_{nT+\nu}\}$ satisfying the seasonal autoregressive difference equation

$X_{nT+\nu} = \phi(\nu) X_{nT+\nu-1} + \theta(\nu) X_{(n-1)T+\nu} - \phi(\nu)\theta(\nu) X_{(n-1)T+\nu-1} + \varepsilon_{nT+\nu},$   (2.1)

where $\{\varepsilon_{nT+\nu}\}$ is uncorrelated periodic white noise with $E(\varepsilon_{nT+\nu}) = 0$ and $\mathrm{Var}(\varepsilon_{nT+\nu}) = \sigma^2(\nu)$. We refer to the model in (2.1) as a first-order seasonal periodic autoregressive process (SPAR(1,1)) and observe that solution(s) can be regarded as satisfying the two equations

$X_{nT+\nu} = \phi(\nu) X_{nT+\nu-1} + Z_{nT+\nu}$  and  $Z_{nT+\nu} = \theta(\nu) Z_{(n-1)T+\nu} + \varepsilon_{nT+\nu}.$   (2.2)

Setting $\phi(\nu) \equiv \phi$, $\theta(\nu) \equiv \theta$ and $\sigma^2(\nu) \equiv \sigma^2$ for $1 \le \nu \le T$, the SPAR(1,1) model reduces to the Box–Jenkins SAR(1,1) model. Taking $\theta(\nu) \equiv 0$ for $1 \le \nu \le T$, we obtain a PAR(1) model. Thus, both SAR(1,1) and PAR(1) models are special cases of SPAR(1,1) models.
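The two-stage representation in (2.2) gives a direct way to simulate a SPAR(1,1) series: generate the seasonal AR part $Z$ first, then filter it through the periodic AR(1). The sketch below (Python/NumPy) uses hypothetical parameter values and a burn-in period to approximate the periodically stationary solution; none of the numbers come from the paper.

```python
import numpy as np

def simulate_spar11(phi, theta, sigma, n_years, burn_in=100, rng=None):
    """Simulate a SPAR(1,1) series via the two-stage form (2.2):
    Z_{nT+v} = theta(v) Z_{(n-1)T+v} + eps_{nT+v},
    X_{nT+v} = phi(v) X_{nT+v-1} + Z_{nT+v}."""
    rng = np.random.default_rng(rng)
    T = len(phi)
    total = (n_years + burn_in) * T
    # one noise value per observation, with season-dependent scale
    eps = rng.normal(0.0, np.tile(sigma, n_years + burn_in))
    Z = np.zeros(total)
    X = np.zeros(total)
    for t in range(total):
        v = t % T                      # 0-based season index
        Z[t] = eps[t] + (theta[v] * Z[t - T] if t >= T else 0.0)
        X[t] = Z[t] + (phi[v] * X[t - 1] if t >= 1 else 0.0)
    return X[burn_in * T:]             # drop burn-in years

# Example with period T = 4 (hypothetical parameters satisfying (2.7))
phi = np.array([0.5, -0.3, 0.2, 0.4])
theta = np.array([0.6, 0.1, -0.4, 0.3])
sigma = np.array([1.0, 2.0, 1.5, 0.8])
X = simulate_spar11(phi, theta, sigma, n_years=500, rng=1)
print(X.shape)  # (2000,)
```

The burn-in is a practical device; the exact periodically stationary solution is the infinite moving average (2.8) derived below.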
The SPAR(1,1) difference equation can be written as a PAR($p$) model with $p = T + 1$:

$X_{nT+\nu} = \sum_{i=1}^{T+1} \phi_i^*(\nu) X_{nT+\nu-i} + \varepsilon_{nT+\nu},$   (2.3)

where $\phi_1^*(\nu) = \phi(\nu)$, $\phi_i^*(\nu) = 0$ for $1 < i < T$, $\phi_T^*(\nu) = \theta(\nu)$ and $\phi_{T+1}^*(\nu) = -\phi(\nu)\theta(\nu)$. To derive periodic stationarity conditions and autocovariances in the SPAR(1,1) model, let $\mathbf{X}_n = (X_{nT+1}, \ldots, X_{nT+T})'$ and $\boldsymbol{\varepsilon}_n = (\varepsilon_{nT+1}, \ldots, \varepsilon_{nT+T})'$. The model in (2.1) can also be written as the $T$-dimensional vector autoregression (VAR)

$\Phi_0 \mathbf{X}_n = \Phi_1 \mathbf{X}_{n-1} + \Phi_2 \mathbf{X}_{n-2} + \boldsymbol{\varepsilon}_n,$   (2.4)

where $\Phi_i$, $i = 0, 1, 2$, are $T \times T$ matrices whose $(r, s)$th elements are

$\Phi_0(r, s) = \begin{cases} 1, & r = s, \\ 0, & r < s, \\ -\phi_{r-s}^*(r), & r > s \end{cases}$   (2.5)

and $\Phi_i(r, s) = \phi_{iT+r-s}^*(r)$ for $i = 1, 2$. The causality and periodic stationarity condition for (2.4) is the familiar

$\det(\Phi_0 - \Phi_1 z - \Phi_2 z^2) = \left[1 - \prod_{\nu=1}^T \phi(\nu) z\right] \prod_{\nu=1}^T [1 - \theta(\nu) z] \ne 0$   (2.6)

for all complex $z$ satisfying $|z| \le 1$. Hence, a unique (in mean square), causal, and periodically stationary solution to (2.1) exists provided

$\prod_{\nu=1}^T |\phi(\nu)| < 1 \quad \text{and} \quad |\theta(\nu)| < 1 \text{ for all seasons } \nu.$   (2.7)
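As a sanity check, the determinant factorization in (2.6) can be verified numerically: build $\Phi_0$, $\Phi_1$, $\Phi_2$ from the coefficients $\phi_i^*(\nu)$ of (2.3) and compare both sides at an arbitrary test point $z$. The sketch below uses hypothetical parameter values; seasons are 0-based.

```python
import numpy as np

def spar_var_matrices(phi, theta):
    """Build Phi_0, Phi_1, Phi_2 of the VAR form (2.4) from the
    PAR(T+1) coefficients phi*_i(v) of (2.3)."""
    T = len(phi)

    def phistar(i, v):  # v is a 0-based season index
        if i == 1:
            return phi[v]
        if i == T:
            return theta[v]
        if i == T + 1:
            return -phi[v] * theta[v]
        return 0.0      # all other lags vanish

    P0 = np.eye(T)
    P1 = np.zeros((T, T))
    P2 = np.zeros((T, T))
    for r in range(T):
        for s in range(T):
            if r > s:
                P0[r, s] = -phistar(r - s, r)
            P1[r, s] = phistar(T + r - s, r)
            P2[r, s] = phistar(2 * T + r - s, r)
    return P0, P1, P2

# Compare det(P0 - P1 z - P2 z^2) with the factored form in (2.6)
phi = np.array([0.5, -0.3, 0.2, 0.4])
theta = np.array([0.6, 0.1, -0.4, 0.3])
P0, P1, P2 = spar_var_matrices(phi, theta)
z = 0.7 + 0.2j
lhs = np.linalg.det(P0 - P1 * z - P2 * z**2)
rhs = (1 - np.prod(phi) * z) * np.prod(1 - theta * z)
print(abs(lhs - rhs) < 1e-10)  # True
```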
The VAR representation in (2.4) suggests that SPAR(1,1) autocovariances can be derived, in principle, from those of the standard multivariate AR process (e.g. Reinsel, 1997, Section 2). However, it is perhaps simpler to proceed directly. The stationary solution to (2.1) has the representation

$X_{nT+\nu} = \sum_{k=0}^\infty \psi_k(\nu) \varepsilon_{nT+\nu-k},$   (2.8)

where the periodic linear filter weights $\{\psi_k(\nu)\}_{k=0}^\infty$ satisfy $\sum_{k=0}^\infty |\psi_k(\nu)| < \infty$ for each season $\nu$. The coefficients $\{\psi_k(\nu)\}_{k=0}^\infty$ can be determined recursively from

$\psi_k(\nu) = \phi(\nu)\, \psi_{k-1}(\nu - 1)\, I_{[k \ge 1]} + \theta(\nu)\, \psi_{k-T}(\nu)\, I_{[k \ge T]} - \phi(\nu)\theta(\nu)\, \psi_{k-T-1}(\nu - 1)\, I_{[k \ge T+1]}$   (2.9)

for $k \ge 1$ and $1 \le \nu \le T$. Here, $\psi_0(\nu) \equiv 1$ and $\psi_k(\nu)$ is interpreted periodically in $\nu$ with period $T$ for each fixed $k$. Let $\gamma_h(\nu) = \mathrm{Cov}(X_{nT+\nu}, X_{nT+\nu-h})$ be the autocovariance of $\{X_{nT+\nu}\}$ at season $\nu$ and lag $h \ge 0$. Taking covariances on both sides of (2.8) leads to

$\gamma_h(\nu) = \sum_{k=0}^\infty \psi_{k+h}(\nu)\, \psi_k(\nu - h)\, \sigma^2(\nu - k - h),$   (2.10)

where $\sigma^2(j)$ is interpreted periodically in $j$ with period $T$.
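Under the causality condition (2.7), the recursions (2.9) and (2.10) are straightforward to implement; truncating the infinite sum in (2.10) at a large $K$ incurs only geometrically small error. A minimal sketch (hypothetical parameters; seasons are 0-based and wrapped modulo $T$, matching the periodic interpretation of $\psi_k(\nu)$ and $\sigma^2(\nu)$):

```python
import numpy as np

def spar_psi_weights(phi, theta, K):
    """psi[k, v] = psi_k(v) from recursion (2.9), with psi_0(v) = 1."""
    T = len(phi)
    psi = np.zeros((K + 1, T))
    psi[0, :] = 1.0
    for k in range(1, K + 1):
        for v in range(T):
            val = phi[v] * psi[k - 1, (v - 1) % T]        # I[k >= 1]
            if k >= T:
                val += theta[v] * psi[k - T, v]           # I[k >= T]
            if k >= T + 1:
                val -= phi[v] * theta[v] * psi[k - T - 1, (v - 1) % T]
            psi[k, v] = val
    return psi

def spar_autocov(phi, theta, sigma2, h, v, K=500):
    """gamma_h(v) via the truncated sum (2.10); sigma2 is per-season."""
    T = len(phi)
    psi = spar_psi_weights(phi, theta, K)
    return sum(psi[k + h, v] * psi[k, (v - h) % T] * sigma2[(v - k - h) % T]
               for k in range(K - h))

phi = np.array([0.5, -0.3, 0.2, 0.4])
theta = np.array([0.6, 0.1, -0.4, 0.3])
sigma2 = np.array([1.0, 4.0, 2.25, 0.64])
g0 = spar_autocov(phi, theta, sigma2, h=0, v=0)
print(g0 > 0)  # gamma_0 is a variance, so positive
```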
Eq. (2.10) can be used to compute $\gamma_h(\nu)$ for all seasons and lags $0 \le h \le T + 1$. For lags $h > T + 1$, a recursion identifying $\gamma_h(\nu)$ is obtained upon multiplying both sides of (2.1) by $X_{nT+\nu-h}$ for $h > T + 1$, taking expectations, and invoking causality:

$\gamma_h(\nu) = \phi(\nu)\gamma_{h-1}(\nu - 1) + \theta(\nu)\gamma_{h-T}(\nu) - \phi(\nu)\theta(\nu)\gamma_{h-T-1}(\nu - 1), \quad h > T + 1.$   (2.11)
In (2.11), $\gamma_h(\nu)$ is interpreted periodically in $\nu$ with period $T$ for each fixed $h \ge 0$.

3. Parameter estimation

We now consider the problem of estimating the parameters $\{\phi(\nu), \theta(\nu)\}_{\nu=1}^T$ in (2.1) with data $\{X_{nT+\nu}\}$ observed over $n = 0, 1, \ldots, N-1$ and $\nu = 1, \ldots, T$. From (2.1), we have

$\varepsilon_{nT+\nu} = X_{nT+\nu} - \phi(\nu) X_{nT+\nu-1} - \theta(\nu) X_{(n-1)T+\nu} + \phi(\nu)\theta(\nu) X_{(n-1)T+\nu-1}.$   (3.1)
Let $\delta(\nu) = (\phi(\nu), \theta(\nu))'$ and $\delta = (\delta(1)', \ldots, \delta(T)')'$. The least-squares estimator $\hat\delta$ of $\delta$ is obtained by minimizing $\sum_{n=0}^{N-1} \sum_{\nu=1}^T \sigma^{-2}(\nu)\, \varepsilon_{nT+\nu}^2(\delta)$. Hence, the least-squares estimating equations for $(\phi(\nu), \theta(\nu))$ for any fixed $\nu$ are given by

$\sum_{n=0}^{N-1} \varepsilon_{nT+\nu} \frac{\partial \varepsilon_{nT+\nu}}{\partial \phi(\nu)} = 0 \quad \text{and} \quad \sum_{n=0}^{N-1} \varepsilon_{nT+\nu} \frac{\partial \varepsilon_{nT+\nu}}{\partial \theta(\nu)} = 0,$   (3.2)

where

$-\frac{\partial \varepsilon_{nT+\nu}}{\partial \phi(\nu)} = X_{nT+\nu-1} - \theta(\nu) X_{(n-1)T+\nu-1},$   (3.3)

$-\frac{\partial \varepsilon_{nT+\nu}}{\partial \theta(\nu)} = X_{(n-1)T+\nu} - \phi(\nu) X_{(n-1)T+\nu-1}.$   (3.4)
Let $(\hat\phi(\nu), \hat\theta(\nu))$ denote a solution to (3.2) for a fixed season $\nu$. Arguments analogous to those in Basawa and Lund (2001) will establish the following result.

Theorem 3.1. Consider a causal SPAR(1,1) model in the sense that $\prod_{\nu=1}^T |\phi(\nu)| < 1$ and $|\theta(\nu)| < 1$ for all $1 \le \nu \le T$. Suppose further that $\sup_t E[\varepsilon_t^4] < \infty$. Then

$\sqrt{N} \begin{pmatrix} \hat\phi(\nu) - \phi(\nu) \\ \hat\theta(\nu) - \theta(\nu) \end{pmatrix} \xrightarrow{d} N\left( \begin{pmatrix} 0 \\ 0 \end{pmatrix},\; \sigma^2(\nu) \Lambda^{-1}(\nu) \right)$   (3.5)

as $N \to \infty$, where

$\Lambda(\nu) = \begin{pmatrix} E\left(\dfrac{\partial \varepsilon_{nT+\nu}}{\partial \phi(\nu)}\right)^2 & E\left(\dfrac{\partial \varepsilon_{nT+\nu}}{\partial \phi(\nu)} \dfrac{\partial \varepsilon_{nT+\nu}}{\partial \theta(\nu)}\right) \\ E\left(\dfrac{\partial \varepsilon_{nT+\nu}}{\partial \phi(\nu)} \dfrac{\partial \varepsilon_{nT+\nu}}{\partial \theta(\nu)}\right) & E\left(\dfrac{\partial \varepsilon_{nT+\nu}}{\partial \theta(\nu)}\right)^2 \end{pmatrix}.$   (3.6)

Moreover, $(\hat\phi(\nu), \hat\theta(\nu))$ are asymptotically independent across different seasons $\nu$.
Proof. We give an outline only; technical detail follows arguments similar to those in Basawa and Lund (2001). Let $\delta(\nu) = (\phi(\nu), \theta(\nu))'$. The least-squares estimating equation for $\delta(\nu)$ in (3.2) is

$S_N(\delta(\nu)) = \sum_{n=0}^{N-1} \varepsilon_{nT+\nu} \frac{\partial \varepsilon_{nT+\nu}}{\partial \delta(\nu)} = \sum_{n=0}^{N-1} Z_{nT+\nu},$   (3.7)

where $Z_{nT+\nu} = \varepsilon_{nT+\nu}\, \partial \varepsilon_{nT+\nu}/\partial \delta(\nu)$. From periodic ergodicity results, one can deduce that

(i) $N^{-1} S_N(\delta(\nu)) \xrightarrow{P} 0$,
(ii) $N^{-1} \sum_{n=0}^{N-1} Z_{nT+\nu} Z_{nT+\nu}' \xrightarrow{P} \sigma^2(\nu) \Lambda(\nu)$ and
(iii) $N^{-1} \sum_{n=0}^{N-1} \partial Z_{nT+\nu}/\partial \delta(\nu)' \xrightarrow{P} \Lambda(\nu)$

as $N \to \infty$, where

$\Lambda(\nu) = E\left[ \frac{\partial \varepsilon_{nT+\nu}}{\partial \delta(\nu)} \left( \frac{\partial \varepsilon_{nT+\nu}}{\partial \delta(\nu)} \right)' \right].$

From (3.3) and (3.4), the elements of $\Lambda(\nu)$ can be computed easily from SPAR(1,1) autocovariances; specifically,

$E\left(\frac{\partial \varepsilon_{nT+\nu}}{\partial \phi(\nu)}\right)^2 = E(X_{nT+\nu-1} - \theta(\nu) X_{(n-1)T+\nu-1})^2 = \gamma_0(\nu-1) - 2\theta(\nu)\gamma_T(\nu-1) + \theta^2(\nu)\gamma_0(\nu-1),$   (3.8)

$E\left(\frac{\partial \varepsilon_{nT+\nu}}{\partial \theta(\nu)}\right)^2 = E(X_{(n-1)T+\nu} - \phi(\nu) X_{(n-1)T+\nu-1})^2 = \gamma_0(\nu) - 2\phi(\nu)\gamma_1(\nu) + \phi^2(\nu)\gamma_0(\nu-1),$   (3.9)

$E\left(\frac{\partial \varepsilon_{nT+\nu}}{\partial \phi(\nu)} \frac{\partial \varepsilon_{nT+\nu}}{\partial \theta(\nu)}\right) = E[(X_{nT+\nu-1} - \theta(\nu) X_{(n-1)T+\nu-1})(X_{(n-1)T+\nu} - \phi(\nu) X_{(n-1)T+\nu-1})]$
$\qquad = \gamma_{T-1}(\nu-1) - \phi(\nu)\gamma_T(\nu-1) - \theta(\nu)\gamma_1(\nu) + \phi(\nu)\theta(\nu)\gamma_0(\nu-1).$   (3.10)
The autocovariances $\gamma_h(\nu)$ needed in (3.8)–(3.10) are identified in (2.10). To obtain our main result, one uses a martingale central limit theorem (cf. Hall and Heyde, 1980, Chapter 3) to deduce that

(iv) $N^{-1/2} S_N(\delta(\nu)) \xrightarrow{d} N(0, \sigma^2(\nu) \Lambda(\nu))$.

A Taylor expansion argument will give the relation

(v) $(N^{-1} \partial S_N(\delta(\nu))/\partial \delta(\nu)')\, N^{1/2} (\hat\delta(\nu) - \delta(\nu)) = -N^{-1/2} S_N(\delta(\nu)) + o_p(1)$.

Results (iii)–(v) and Slutsky's theorem now give

(vi) $N^{1/2} (\hat\delta(\nu) - \delta(\nu)) \xrightarrow{d} N(0, \sigma^2(\nu) \Lambda^{-1}(\nu))$.

The asymptotic independence of $\{N^{1/2} (\hat\delta(\nu) - \delta(\nu))\}$ across varying $\nu$ can be established following arguments similar to those in Basawa and Lund (2001).

A consistent estimate of $\sigma^2(\nu)$ is given by

$\hat\sigma^2(\nu) = N^{-1} \sum_{n=0}^{N-1} \varepsilon_{nT+\nu}^2(\hat\phi(\nu), \hat\theta(\nu)),$   (3.11)

where $\hat\phi(\nu)$ and $\hat\theta(\nu)$ are estimators obtained by solving $S_N(\delta(\nu)) = (0, 0)'$. It is noted that $\hat\delta(\nu)$ can be obtained as solutions to the nonlinear equations in (3.2) via a Fisher scoring algorithm. A convenient preliminary estimator $(\hat\phi_0(\nu), \hat\theta_0(\nu))$ can be calculated by the following step-wise regression method.

Step 1. Minimizing $\sum_{n=0}^{N-1} \sum_{\nu=1}^T Z_{nT+\nu}^2$ in (2.2) with respect to $\phi(\nu)$ gives

$\hat\phi_0(\nu) = \frac{\sum_{n=0}^{N-1} X_{nT+\nu} X_{nT+\nu-1}}{\sum_{n=0}^{N-1} X_{nT+\nu-1}^2},$   (3.12)

where $X_j$ is taken as zero for $j \le 0$.

Step 2. Let $\hat Z_{nT+\nu} = X_{nT+\nu} - \hat\phi_0(\nu) X_{nT+\nu-1}$. Regressing $\hat Z_{nT+\nu}$ on $\hat Z_{(n-1)T+\nu}$ provides

$\hat\theta_0(\nu) = \frac{\sum_{n=1}^{N-1} \hat Z_{nT+\nu} \hat Z_{(n-1)T+\nu}}{\sum_{n=1}^{N-1} \hat Z_{(n-1)T+\nu}^2}.$   (3.13)

These preliminary estimates can be used as starting values in a Fisher scoring algorithm.
4. An extension

The model in (2.1) can be generalized to a seasonal autoregressive moving-average of order $(p_1, p_2) \times (q_1, q_2)$ by considering solutions to the difference equation

$X_{nT+\nu} = \sum_{i=1}^{p_1} \phi_i(\nu) X_{nT+\nu-i} + \sum_{j=1}^{p_2} \theta_j(\nu) X_{(n-j)T+\nu} - \sum_{i=1}^{p_1} \sum_{j=1}^{p_2} \phi_i(\nu)\theta_j(\nu) X_{(n-j)T+\nu-i}$
$\qquad + \sum_{i=1}^{q_1} \vartheta_i(\nu) \varepsilon_{nT+\nu-i} + \sum_{j=1}^{q_2} \eta_j(\nu) \varepsilon_{(n-j)T+\nu} - \sum_{i=1}^{q_1} \sum_{j=1}^{q_2} \vartheta_i(\nu)\eta_j(\nu) \varepsilon_{(n-j)T+\nu-i} + \varepsilon_{nT+\nu}.$   (4.1)

We will refer to the model in (4.1) as SPARMA$(p_1, p_2) \times (q_1, q_2)$. If the seasonal parameters $\theta_j(\nu)$ and $\eta_j(\nu)$ are taken as zero, the model reduces to PARMA$(p_1, q_1)$ form. If all model parameters in (4.1) are constant in season, the model reduces to the standard Box–Jenkins (product) SARMA$(p_1, p_2) \times (q_1, q_2)$ model. Stationarity and parameter estimation problems could be studied for such a structure, thereby extending the results of this paper; this will be pursued elsewhere.
Acknowledgements Robert Lund’s research was partially supported by the National Science Foundation Grant DMS 0304407. We thank the referee for many constructive suggestions.
References

Basawa, I.V., Lund, R.B., 2001. Large sample properties of parameter estimates for periodic ARMA models. J. Time Ser. Anal. 22, 651–663.
Bloomfield, P., Hurd, H.L., Lund, R.B., 1994. Periodic correlation in stratospheric ozone data. J. Time Ser. Anal. 15, 127–150.
Box, G.E.P., Jenkins, G.M., Reinsel, G.C., 1994. Time Series Analysis, Forecasting and Control, 3rd Edition. Prentice-Hall, Englewood Cliffs, NJ.
Cleveland, W.P., Tiao, G.C., 1979. Modeling seasonal time series. Rev. Econom. Appl. 32, 107–129.
Gardner, W., Franks, L.E., 1975. Characterization of cyclostationary random signal processes. IEEE Trans. Inform. Theory 21, 4–14.
Hall, P., Heyde, C.C., 1980. Martingale Limit Theory and Its Applications. Academic Press, New York.
Hannan, E.J., 1955. A test for singularities in Sydney rainfall. Austral. J. Phys. 8, 289–297.
Jones, R.H., Brelsford, W.M., 1967. Time series with periodic structure. Biometrika 54, 403–408.
Lund, R.B., Basawa, I.V., 2000. Recursive prediction and likelihood evaluation for periodic ARMA models. J. Time Ser. Anal. 20, 75–93.
Monin, A.S., 1963. Stationary and periodic time series in the general circulation of the atmosphere. In: Rosenblatt, M. (Ed.), Proc. Symp. on Time Series Analysis. Wiley, New York, pp. 144–151.
Pagano, M., 1978. On periodic and multiple autoregressions. Ann. Statist. 6, 1310–1317.
Parzen, E., Pagano, M., 1979. An approach to modeling seasonally stationary time series. J. Econometrics 9, 137–153.
Reinsel, G.C., 1997. Elements of Multivariate Time Series Analysis, 2nd Edition. Springer, New York.
Tiao, G.C., Grupe, M.R., 1980. Hidden periodic autoregressive-moving average models in time series data. Biometrika 67, 365–373.
Troutman, B.M., 1979. Some results in periodic autoregression. Biometrika 66, 219–228.
Vecchia, A.V., 1985a. Periodic autoregressive-moving average (PARMA) modeling with applications to water resources. Water Resour. Bull. 21, 721–730.
Vecchia, A.V., 1985b. Maximum likelihood estimation for periodic autoregressive moving average models. Technometrics 27, 375–384.