Copyright © 1996 IFAC 13th Triennial World Congress, San Francisco, USA

3d-034

A UNIFIED APPROACH TO OPTIMAL ESTIMATION FOR STOCHASTIC SINGULAR SYSTEMS

Huanshui Zhang, Tianyou Chai and Xiangjie Liu
Research Center of Automation, Northeastern University, Shenyang, 110006, P.R. China

Abstract
1. INTRODUCTION

The study of state estimation for linear singular systems has received much attention in recent years. Dai (1989), Darouach et al. (1993) and Wang (1988) have investigated the solution of optimal filtering problems using the least squares method. Although these solutions are known, a solution to the problem of optimal prediction has not been given. In fact, it is very difficult to solve the optimal prediction problem based on the least squares method, as illustrated in Wang (1988). In this paper, the problem of optimal prediction for the singular system is discussed using innovation theory and the projection method (Deng and Guo, 1989). The optimal predictor is given with the aid of an output predictor. The optimal filter and smoother are presented at the same time. The three cases of filtering, prediction and smoothing are shown to have a unified form.
2. STATEMENT OF THE PROBLEM

Consider the following discrete-time stochastic singular system:

Mx(k+1) = Φx(k) + Γw(k)   (1)
y(k) = Hx(k) + v(k)   (2)

where x(k) ∈ R^n, w(k) ∈ R^r, y(k) ∈ R^m and v(k) ∈ R^m are the state, the system stochastic noise, the measurement output and the measurement noise, respectively, and M, Φ ∈ R^{n×n}, Γ ∈ R^{n×r}, H ∈ R^{m×n} are constant matrices. In this paper, the following assumptions are made for the system (1), (2).

A1: The system (1), (2) is observable, i.e.,

rank [zM − Φ; H] = n for all finite z ∈ C, and rank [M; H] = n.

A2: M is singular and the system (1) is regular, i.e., det(zM − Φ) ≢ 0.

A3: w(k), v(k) are zero-mean white noise sequences with covariance matrices

E{w(k)w'(j)} = Q_w δ_kj,  E{v(k)v'(j)} = Q_v δ_kj,  E{w(k)v'(j)} = 0
where the symbol E denotes mathematical expecta tion.
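The rank conditions in A1 and A2 can be spot-checked numerically for a concrete system. The sketch below is an illustration only: it borrows the matrices of the worked example in Section 5, and it samples a handful of random complex points z, whereas a rigorous check of A1 would treat z symbolically.

```python
import numpy as np

# Matrices of the Section 5 example (n = 2); any (M, Phi, H) could be substituted.
M = np.array([[1.0, 0.0], [0.0, 0.0]])        # singular, as A2 requires
Phi = np.array([[0.8, 0.0], [-1.0, 0.5]])
H = np.array([[0.0, 1.0]])
n = 2

rng = np.random.default_rng(0)
zs = rng.normal(size=5) + 1j * rng.normal(size=5)   # random finite test points

# A2 (regularity): det(zM - Phi) must not vanish identically in z.
regular = any(abs(np.linalg.det(z * M - Phi)) > 1e-9 for z in zs)

# A1 (observability): rank [zM - Phi; H] = n at finite z ...
obs_finite = all(np.linalg.matrix_rank(np.vstack([z * M - Phi, H])) == n
                 for z in zs)
# ... and rank [M; H] = n (the condition "at infinity").
obs_inf = np.linalg.matrix_rank(np.vstack([M, H])) == n
```

For this example all three flags come out true: the pencil zM − Φ is regular and the system is observable, so the constructions of Sections 3 and 4 apply.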
Our aim is to find the linear minimum variance estimator of the state x(k+l), denoted x̂(k+l|k), based on the observations y(k), y(k-1), ..., y(0), which minimizes the mean square error

min E{(x(k+l) - x̂(k+l|k))'(x(k+l) - x̂(k+l|k))}   (3)

For l = 0, l > 0 and l < 0, x̂(k+l|k) is called the optimal filter, predictor and smoother, respectively.

3. THE ESTIMATION PROBLEM

3.1. ARMA innovation model

From (1) and (2), we have

y(k) = H(M - Φq^{-1})^{-1}Γw(k-1) + v(k)   (4)

where q^{-1} is the backward shift operator, i.e., q^{-1}x(k) = x(k-1). According to the standard decomposition of the singular system (Cobb, 1984), (M - Φq^{-1})^{-1} can be written as

(M - Φq^{-1})^{-1} = (F_0 + F_1 q^{-1} + ... + F_{n_a+h-1} q^{-(n_a+h-1)}) q^h / A(q^{-1})   (5)

where A(q^{-1}) = 1 + a_1 q^{-1} + ... + a_{n_a} q^{-n_a} and h is the nilpotent index. Using (5), (4) further gives

A(q^{-1})y(k) = C(q^{-1})w(k+h-1) + A(q^{-1})v(k)   (6)

where C(q^{-1}) = C_0 + C_1 q^{-1} + ... + C_{n_a+h-1} q^{-(n_a+h-1)}, in which C_i = HF_iΓ. Without loss of generality, we can assume that C_0 = 0, ..., C_{t-1} = 0, C_t ≠ 0. Thus (6) becomes

A(q^{-1})y(k) = B(q^{-1})w(k+t_0-1) + A(q^{-1})v(k)   (7)

where t_0 = h - t and

B(q^{-1}) = B_0 + B_1 q^{-1} + ... + B_{n_b} q^{-n_b},  n_b = n_a + t_0 - 1

in which B_i = C_{t+i}, i = 0, 1, ..., n_b.

Noting (7), suppose that the spectral matrix B(e^{-iω})Q_w B'(e^{iω}) + A(e^{-iω})Q_v A'(e^{iω}) is positive definite for -π ≤ ω ≤ π. Then there exists (Kucera, 1979) a spectral factor D(q^{-1}) such that

D(q^{-1})ε(k) = B(q^{-1})w(k+t_0-1) + A(q^{-1})v(k)   (8)

where D(q^{-1}) = I + D_1 q^{-1} + ... + D_{n_d} q^{-n_d} with n_d = max(n_a, n_b - t_0 + 1), D(q^{-1}) is stable, i.e., the roots of det D(z) = 0 lie outside the unit circle, and ε(k) is a zero-mean white noise sequence with covariance matrix Q_ε; ε(k) is the innovation process. From (7) and (8) we obtain the ARMA innovation model

A(q^{-1})y(k) = D(q^{-1})ε(k)   (9)

Using the algorithm for spectral factorization presented by Ježek and Kučera (1985), the coefficient matrices D_i, i = 1, 2, ..., n_d, and the covariance matrix Q_ε may be calculated. By (9), the innovation ε(k) can be calculated recursively by

ε(k) = A(q^{-1})y(k) - D_1ε(k-1) - ... - D_{n_d}ε(k-n_d)   (10)

with the initial values ε(0), ε(1), ..., ε(n_d-1).

Now we discuss the relations among the innovation ε(k), the white noises w(k), v(k) and the output y(k). Solve the following Diophantine equations

B(q^{-1}) = D(q^{-1})F_N(q^{-1}) + q^{-(N+1)}G(q^{-1})   (11)
A(q^{-1}) = D(q^{-1})E_N(q^{-1}) + q^{-(N+1)}H̃(q^{-1})   (12)
D(q^{-1}) = A(q^{-1})Λ_N(q^{-1}) + q^{-(N+1)}J(q^{-1})   (13)

where F_N(q^{-1}) = F_0 + F_1 q^{-1} + ... + F_N q^{-N} (the coefficients F_i are redefined here and are not those of (5)), E_N(q^{-1}) = E_0 + E_1 q^{-1} + ... + E_N q^{-N}, Λ_N(q^{-1}) = Λ_0 + Λ_1 q^{-1} + ... + Λ_N q^{-N}, G(q^{-1}) and H̃(q^{-1}) are polynomials of order n_d - 1, and J(q^{-1}) is of order n_a - 1. Using (11) and (12), from (8), ε(k) has the expression

ε(k) = F_N(q^{-1})w(k+t_0-1) + E_N(q^{-1})v(k) + q^{-(N+1)}D^{-1}(q^{-1})[G(q^{-1})w(k+t_0-1) + H̃(q^{-1})v(k)]   (14)

Similarly, using (13), from (9), y(k) has the expression

y(k) = Λ_N(q^{-1})ε(k) + q^{-(N+1)}A^{-1}(q^{-1})J(q^{-1})ε(k)   (15)

Lemma 1.

E{w(j)ε'(k)} = Q_w F'_{k-j+t_0-1},  E{v(j)ε'(k)} = Q_v E'_{k-j},  E{y(j)ε'(k)} = Λ_{j-k}Q_ε

with F_i = 0, E_i = 0, Λ_i = 0 for i < 0. Noting assumption A3, from (14) and (15) the proof is straightforward; it is omitted.
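For a single-output model the spectral factorization behind (8)-(9) reduces to factoring a scalar MA covariance sequence; the general matrix case requires the Ježek-Kučera algorithm. The following sketch is an illustration of the scalar case only, worked for the first-order model of the Section 5 example, where the right side of (7) is m(k) = w(k-1) + v(k) + a_1 v(k-1).

```python
import numpy as np

# Right-hand side of (7) for the Section 5 example, with A(q^-1) = 1 + a1 q^-1.
a1, Qw, Qv = -0.8, 4.0, 1.0
c0 = Qw + (1.0 + a1**2) * Qv   # lag-0 covariance E{m(k) m(k)}
c1 = a1 * Qv                   # lag-1 covariance E{m(k) m(k-1)}

# Spectral factorization (8): match (1 + d q^-1) eps(k), eps white with variance Qe:
#   Qe (1 + d^2) = c0,  Qe d = c1   =>   c1 d^2 - c0 d + c1 = 0.
d = (c0 - np.sqrt(c0**2 - 4.0 * c1**2)) / (2.0 * c1)   # the root with |d| < 1
Qe = c1 / d
```

The stable root reproduces the innovation model quoted in Section 5, (1 - 0.8q^{-1})y(k) = (1 + dq^{-1})ε(k) with d ≈ -0.1448 and Q_ε ≈ 5.524.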
3.2. Steady-state optimal white noise estimation and output recursive prediction

We denote by ŵ(k+i|k) and v̂(k+i|k) the steady-state optimal estimators of w(k+i) and v(k+i) based on the observations y(k), y(k-1), .... Projecting onto the innovation sequence and using Lemma 1,

ŵ(k+i|k) = Σ_{j=i-t_0+1}^{0} E{w(k+i)ε'(k+j)}Q_ε^{-1}ε(k+j) = Σ_{j=i-t_0+1}^{0} Q_w F'_{j-i+t_0-1}Q_ε^{-1}ε(k+j)   (16)

Similarly, we can obtain the steady-state optimal white noise estimator v̂(k+i|k) as

v̂(k+i|k) = Σ_{j=i}^{0} Q_v E'_{j-i}Q_ε^{-1}ε(k+j)   (17)

In particular, ŵ(k+i|k) = 0 for i > t_0 - 1 and v̂(k+i|k) = 0 for i > 0.

Based on the ARMA innovation model (9), the optimal recursive output predictors are given as (Deng and Guo, 1989)

A(q^{-1})ŷ(k+i|k) = Σ_{j=i}^{n_d} D_jε(k+i-j),  i ≥ 1   (18)

where the operator q^{-1} works only on the time argument k+i, which means that

A(q^{-1})ŷ(k+i|k) = ŷ(k+i|k) + a_1ŷ(k+i-1|k) + ... + a_{n_a}ŷ(k+i-n_a|k)   (19)

with ŷ(k+j|k) = y(k+j) for j ≤ 0.

3.3. The state estimation problem

According to assumption A1, rank [M; H] = n, so there exists a matrix K_f ∈ R^{n×m} such that det(M + K_fH) ≠ 0. Note that (1) may be written as

(M + K_fH)x(k) = Φx(k-1) + K_fy(k) + Γw(k-1) - K_fv(k)   (20)

Thus (1) and (2) are equivalent to

x(k) = Ax(k-1) + Ky(k) + η(k-1)   (21)
y(k) = Hx(k) + v(k)   (22)

where A = SΦ, K = SK_f, η(k-1) = SΓw(k-1) - Kv(k) and S = (M + K_fH)^{-1}.

Lemma 2. Under assumption A1, O = (H', A'H', ..., (A')^{n-1}H')' is of full column rank.

Proof: The proof is omitted here.

Theorem 1. Assume A1-A3 hold, and assume a stable spectral factor D(q^{-1}) in (8) exists. The steady-state optimal estimator of x(k+l) is given by

x̂(k+l|k) = Ax̂(k+l-1|k-1) + K_x(l)ε(k) + Kŷ(k+l|k-1) + η̂(k+l-1|k-1)   (23)

with the initial value x̂(l|0), where the steady-state gain is

K_x(l) = O^+ [r_0; r_1; ...; r_{n-1}],  O^+ = (O'O)^{-1}O'

with rows

r_0 = Λ_l - Q_vE'_{-l}Q_ε^{-1}
r_j = Λ_{l+j} - Σ_{i=0}^{j-1} HA^{j-1-i}(KΛ_{l+i+1} + SΓQ_wF'_{t_0-l-i-1}Q_ε^{-1} - KQ_vE'_{-l-i-1}Q_ε^{-1}) - Q_vE'_{-l-j}Q_ε^{-1},  j = 1, ..., n-1

and F_i = 0, E_i = 0, Λ_i = 0 for i < 0; ŷ(k+l|k-1) is calculated, according to (18), by

A(q^{-1})ŷ(k+l|k-1) = Σ_{j=l+1}^{n_d} D_jε(k+l-j)   (24)

and η̂(k+l-1|k-1) is calculated as

η̂(k+l-1|k-1) = SΓŵ(k+l-1|k-1) - Kv̂(k+l|k-1)   (25)

in which ŵ(k+l-1|k-1) and v̂(k+l|k-1) are calculated by (16) and (17), respectively.
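The construction in (20)-(22) and the rank claim of Lemma 2 can be checked numerically for a concrete case. The sketch below uses the example system of Section 5; K_f = [0; 1] is one admissible choice making M + K_fH nonsingular, not the only one.

```python
import numpy as np

M = np.array([[1.0, 0.0], [0.0, 0.0]])
Phi = np.array([[0.8, 0.0], [-1.0, 0.5]])
H = np.array([[0.0, 1.0]])
Kf = np.array([[0.0], [1.0]])     # one choice with det(M + Kf H) != 0

T = M + Kf @ H                    # here T = I, so S = I
S = np.linalg.inv(T)
A = S @ Phi                       # state matrix of the equivalent model (21)
K = S @ Kf                        # output-injection gain in (21)

# Lemma 2: O = [H; HA] (n = 2) should have full column rank.
O = np.vstack([H, H @ A])
rank_O = np.linalg.matrix_rank(O)
eig_mags = np.abs(np.linalg.eigvals(A))
```

For this system A = Φ with eigenvalues 0.8 and 0.5, so A is stable and Theorem 2 below applies directly, without the output-injection modification (35).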
Proof: By the projection theorem, from (21) we have

x̂(k+l|k-1) = Ax̂(k+l-1|k-1) + Kŷ(k+l|k-1) + η̂(k+l-1|k-1)   (26)
x̂(k+l|k) = x̂(k+l|k-1) + K_x(l)ε(k)   (27)

where K_x(l) = E{x(k+l)ε'(k)}Q_ε^{-1}. Substituting (26) into (27) yields (23). Noting that η(k+l-1) = SΓw(k+l-1) - Kv(k+l), (25) follows. From (21) and (22), we have

Ox(k+l) = [ y(k+l) - v(k+l);
            y(k+l+1) - HKy(k+l+1) - Hη(k+l) - v(k+l+1);
            ...;
            y(k+l+n-1) - Σ_{i=0}^{n-2} HA^{n-2-i}(Ky(k+l+i+1) + η(k+l+i)) - v(k+l+n-1) ]   (28)

Since O is of full column rank (Lemma 2), we further have

x(k+l) = O^+ × [right-hand side of (28)]   (29)

where O^+ = (O'O)^{-1}O'. Evaluating each entry of (29) against ε'(k) by Lemma 1, the steady-state gain matrix is

K_x(l) = E{x(k+l)ε'(k)}Q_ε^{-1} = O^+ [r_0; r_1; ...; r_{n-1}]   (30)

with the rows r_j as given in the theorem statement. The theorem is now established.

Remarks: 1) (23) shows that the optimal filtering, prediction and smoothing estimation for the singular system have the same form. Depending on the sign of l, the estimator x̂(k+l|k) is a filter (l = 0), a predictor (l > 0) or a smoother (l < 0). 2) The result of Theorem 1 can be applied to normal systems as a special case. In this case M is nonsingular, so K_f can be chosen as K_f = 0, and (23) immediately becomes

x̂(k+l|k) = M^{-1}Φx̂(k+l-1|k-1) + K_x(l)ε(k) + M^{-1}Γŵ(k+l-1|k-1)

Since M is nonsingular, h = 0 in (5) and hence t_0 = 0 in (7).

4. ASYMPTOTIC STABILITY

As shown in Theorem 1, to calculate the optimal state estimator, the innovation initial values ε(0), ε(1), ..., ε(n_d-1) and the state estimator initial value x̂(l|0) must be given. In this section we consider whether or not the state estimator depends on these initial values as k → ∞, i.e., its asymptotic stability with respect to the initial values. This problem is important in real cases, where the true initial values are not known.

Theorem 2. If A is stable, the optimal state estimator is asymptotically stable with respect to the innovation initial values ε(0), ε(1), ..., ε(n_d-1) and the state estimator initial value x̂(l|0).

Proof: Suppose that ε^{(1)}(j), x̂^{(1)}(l|0) and ε^{(2)}(j), x̂^{(2)}(l|0), j = 0, 1, ..., n_d-1, are two arbitrary sets of initial values. Based on these two sets of initial values, and using (10), (25), (18) and (23), two sets of estimates of the innovation, white noises, output and state are calculated, denoted ε^{(1)}(k), η̂^{(1)}(k+l-1|k-1), ŷ^{(1)}(k+l|k-1), x̂^{(1)}(k+l|k) and ε^{(2)}(k), η̂^{(2)}(k+l-1|k-1), ŷ^{(2)}(k+l|k-1), x̂^{(2)}(k+l|k).

1) Since ε^{(1)}(k) and ε^{(2)}(k) are calculated recursively by (10) from the two sets of innovation initial values ε^{(1)}(j), ε^{(2)}(j), j = 0, 1, ..., n_d-1, their difference satisfies D(q^{-1})(ε^{(1)}(k) - ε^{(2)}(k)) = 0 for k ≥ n_d.
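This contraction of the innovation difference can be illustrated numerically. The sketch below runs the recursion (10) for the scalar model of the Section 5 example from two different initial values; since D(q^{-1}) = 1 + dq^{-1} is stable, the difference shrinks by |d| at every step. (An illustration under the Section 5 parameter values, not part of the proof.)

```python
import numpy as np

d = -0.1448                      # D(q^-1) = 1 + d q^-1 from the Section 5 example
rng = np.random.default_rng(1)
y = rng.normal(size=200)         # any bounded output record will do here

def innovations(y, e0):
    """Recursion (10): e(k) = y(k) - 0.8 y(k-1) - d e(k-1)."""
    e = np.empty(len(y))
    e[0] = e0
    for k in range(1, len(y)):
        e[k] = y[k] - 0.8 * y[k - 1] - d * e[k - 1]
    return e

e_a = innovations(y, e0=5.0)     # two arbitrary initial values
e_b = innovations(y, e0=-5.0)
diff = np.abs(e_a - e_b)         # theory: diff[k] = |d|**k * 10
```

The difference decays geometrically at rate |d| ≈ 0.145, so within a few dozen steps the two innovation sequences are numerically indistinguishable, exactly as step 1) asserts.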
Since D(q^{-1}) is stable, it follows that ε^{(1)}(k) - ε^{(2)}(k) → 0 as k → ∞. From (16), (17) and (25), it then follows easily that η̂^{(1)}(k+l-1|k-1) - η̂^{(2)}(k+l-1|k-1) → 0 as k → ∞.

2) Using (18), there results

ŷ^{(1)}(k+1|k) = -a_1y(k) - a_2y(k-1) - ... - a_{n_a}y(k-n_a+1) + D_1ε^{(1)}(k) + ... + D_{n_d}ε^{(1)}(k-n_d+1)   (31)
ŷ^{(2)}(k+1|k) = -a_1y(k) - a_2y(k-1) - ... - a_{n_a}y(k-n_a+1) + D_1ε^{(2)}(k) + ... + D_{n_d}ε^{(2)}(k-n_d+1)   (32)

By 1), it is obvious that ŷ^{(1)}(k+1|k) - ŷ^{(2)}(k+1|k) → 0 as k → ∞. Similarly, we can prove that ŷ^{(1)}(k+i|k) - ŷ^{(2)}(k+i|k) → 0 for i > 1 as k → ∞.

3) From (23), we have

(I - Aq^{-1})(x̂^{(1)}(k+l|k) - x̂^{(2)}(k+l|k)) = δ(k,l)   (33)

where

δ(k,l) = K_x(l)(ε^{(1)}(k) - ε^{(2)}(k)) + K(ŷ^{(1)}(k+l|k-1) - ŷ^{(2)}(k+l|k-1)) + η̂^{(1)}(k+l-1|k-1) - η̂^{(2)}(k+l-1|k-1)

Using 1) and 2), we have δ(k,l) → 0 as k → ∞. Since A is stable, it follows (Zhang and Deng, 1994) that x̂^{(1)}(k+l|k) - x̂^{(2)}(k+l|k) → 0 as k → ∞. This completes the proof.

Remark: In Theorem 2 we restrict attention to a matrix A having eigenvalues strictly inside the unit circle. This is necessary so that the effect of arbitrary initial conditions on the estimator decays. Now we consider the more general case in which A may have roots on or outside the unit circle. From Lemma 2, rank (H', A'H', ..., (A')^{n-1}H')' = n, so there exists K_a ∈ R^{n×m} such that A_a = A + K_aH is stable. From (22) we have

K_aŷ(k+l-1|k-1) = K_aHx̂(k+l-1|k-1) + K_av̂(k+l-1|k-1)   (34)

Combining (23) and (34), we obtain a new optimal estimator

x̂(k+l|k) = A_ax̂(k+l-1|k-1) + K_x(l)ε(k) + Kŷ(k+l|k-1) - K_aŷ(k+l-1|k-1) + K_av̂(k+l-1|k-1) + η̂(k+l-1|k-1)   (35)

where ŷ(k+l-1|k-1) and v̂(k+l-1|k-1) can be calculated by (18) and (17) respectively, and A_a = A + K_aH is stable. Similarly, we can prove that the optimal recursive estimator (35) is asymptotically stable with respect to the initial values ε(0), ε(1), ..., ε(n_d-1) and x̂(l|0). Therefore, when A is unstable, we replace the optimal estimator (23) by (35).

5. ESTIMATION EXAMPLE

As an example, we consider the singular discrete-time system described by the following equations

Mx(k+1) = Φx(k) + Γw(k)   (36)
y(k) = Hx(k) + v(k)   (37)

where

M = [1 0; 0 0],  Φ = [0.8 0; -1 0.5],  Γ = [0.5; 0],  H = [0 1]

and x(k), w(k), y(k) and v(k) have the same definitions as in equations (1) and (2) (n = 2, m = 1, r = 1). The variances of the noises w(k) and v(k) are Q_w = 4 and Q_v = 1. By (9) and (8), the ARMA innovation model is given by

(1 - 0.8q^{-1})y(k) = (1 + dq^{-1})ε(k)   (38)

where d and Q_ε can be calculated as d = -0.1448 and Q_ε = 5.5241. Setting K_f = [0; 1], from Theorem 1, the optimal filter, predictor and smoother for (36) and (37) are given by

x̂(k|k) = [0.8 0; -1 0.5]x̂(k-1|k-1) + K_x(0)ε(k) + [0; 1]ŷ(k|k-1)   (39)
x̂(k+1|k) = [0.8 0; -1 0.5]x̂(k|k-1) + K_x(1)ε(k) + [0; 1]ŷ(k+1|k-1)   (40)
x̂(k-1|k) = [0.8 0; -1 0.5]x̂(k-2|k-1) + K_x(-1)ε(k) + [0; 1]y(k-1) + η̂(k-2|k-1)   (41)

(for l = 0 and l = 1 the term η̂(k+l-1|k-1) vanishes since t_0 = 0), where

K_x(0) = [0.5(1 - Q_ε^{-1}); 1 - Q_ε^{-1}]
K_x(1) = [0.5(0.8 + d); 0.8 + d]
K_x(-1) = [0.5(0.8 + d)Q_ε^{-1}; (0.8 + d)Q_ε^{-1}]
η̂(k-2|k-1) = [2Q_ε^{-1}; -Q_ε^{-1}]ε(k-1)
ŷ(k|k-1) = 0.8y(k-1) + dε(k-1),  ŷ(k+1|k-1) = 0.8ŷ(k|k-1)
ε(k) = -dε(k-1) + y(k) - 0.8y(k-1)

The true and predicted values are shown in Figs. 1 and 2; Figs. 3 and 4 show the true and smoothed values.

Fig. 1. True and prediction value x̂_1(k+1|k)
Fig. 2. True and prediction value x̂_2(k+1|k)
Fig. 3. True and smoothing value x̂_1(k-1|k)
Fig. 4. True and smoothing value x̂_2(k-1|k)

6. CONCLUSION

A new approach to optimal estimation for singular systems has been presented. The optimal estimators are derived by using projection theory and innovation analysis in the time domain. The filtering, prediction and smoothing estimators have been shown to have a unified form. Also, the optimal estimators of the white noises have been given at the same time (eqns. 16 and 17).

In Theorem 1, the initial values ε(0), ε(1), ..., ε(n_d-1) and x̂(l|0) are assumed to be available, which is, however, not true in most real cases. Even when no knowledge of ε(0), ε(1), ..., ε(n_d-1) and x̂(l|0) is available, the optimal estimation problem can still be handled by this method: the optimal estimator (eqn. 23 when A is stable, or eqn. 35 otherwise) is asymptotically stable with respect to the innovation initial values ε(0), ε(1), ..., ε(n_d-1) and the estimator initial value x̂(l|0). Thus the initial values may be taken arbitrarily.
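The example can be reproduced end to end with a short simulation. The sketch below hard-codes the quoted values d = -0.1448 and Q_ε = 5.5241, simulates the descriptor system (36)-(37), runs the innovation recursion (10) and the steady-state filter (39), and checks two predictions of the theory: the sample innovation variance should be near Q_ε, and the filter's error variance for x_2 should be near 1 - 1/Q_ε ≈ 0.82, below the raw measurement noise variance Q_v = 1.

```python
import numpy as np

Qw, Qv, Qe, d = 4.0, 1.0, 5.5241, -0.1448
A = np.array([[0.8, 0.0], [-1.0, 0.5]])
K = np.array([0.0, 1.0])
Kx0 = np.array([0.5 * (1.0 - 1.0 / Qe), 1.0 - 1.0 / Qe])   # K_x(0)

rng = np.random.default_rng(0)
N = 50_000
w = rng.normal(0.0, np.sqrt(Qw), N)
v = rng.normal(0.0, np.sqrt(Qv), N)

# Descriptor dynamics: x1(k+1) = 0.8 x1(k) + 0.5 w(k); constraint x2(k) = 2 x1(k).
x1 = np.zeros(N)
for k in range(N - 1):
    x1[k + 1] = 0.8 * x1[k] + 0.5 * w[k]
x2 = 2.0 * x1
y = x2 + v

# Innovation recursion (10) and steady-state filter (39).
e = np.zeros(N)
xh = np.zeros((N, 2))
e[0] = y[0]
for k in range(1, N):
    yh = 0.8 * y[k - 1] + d * e[k - 1]        # yhat(k|k-1)
    e[k] = y[k] - yh                          # same as (10) for this model
    xh[k] = A @ xh[k - 1] + Kx0 * e[k] + K * yh

burn = 100
var_e = e[burn:].var()                              # theory: Qe = 5.5241
mse2 = np.mean((xh[burn:, 1] - x2[burn:]) ** 2)     # theory: 1 - 1/Qe
```

With these parameters the simulated innovation variance settles near Q_ε and the filtered error variance for x_2 comes out close to 1 - 1/Q_ε, confirming that the filter improves on using the raw measurement y(k) directly.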
REFERENCES

Anderson, B.D.O. and J.B. Moore (1979). Optimal Filtering. Prentice-Hall, Englewood Cliffs, NJ.
Cobb, J.D. (1984). Controllability, observability and duality in singular systems. IEEE Trans. Automat. Control, 29, 1076-1082.
Dai, L. (1989). Filtering and LQG problems for discrete-time stochastic singular systems. IEEE Trans. Automat. Control, 34, 1105-1108.
Darouach, M., M. Zasadzinski and D. Mehdi (1993). State estimation of stochastic singular linear systems. Int. J. Systems Sci., 24, 345-354.
Deng, Z.L. and Y.X. Guo (1989). Modern Time Series Analysis and Its Applications: Modeling, Filtering, Deconvolution, Prediction and Control. Knowledge Press, Beijing. (in Chinese)
Ježek, J. and V. Kučera (1985). Efficient algorithm for matrix spectral factorization. Automatica, 21, 663-669.
Kučera, V. (1979). Discrete Linear Control: The Polynomial Equation Approach. Wiley, New York.
Wang, E.P. and C.Z. Wang (1988). Optimum recurrence filtering method for singular discrete stochastic linear systems. Acta Automatica Sinica, 14, 409-415. (in Chinese)
Zhang, H.S. and Z.L. Deng (1994). A new algorithm for self-tuning state estimation. Control Theory and Appl., 11, 444-449. (in Chinese)