Discrete Linear Estimation for Previous Stage Noise Correlation*

Automatica, Vol. 7, pp. 389–391. Pergamon Press, 1971. Printed in Great Britain.

Correspondence Item

W. J. STEINWAY† and J. L. MELSA†

Summary—A minimum variance linear sequential estimation algorithm, developed by maximum a posteriori likelihood techniques, is given for the case where the current measurement noise, v(k), is correlated with the plant noise of the previous stage, w(k−1).

Introduction

DISCRETE minimum-variance linear estimation algorithms have been developed for the case of same-stage correlation [1]. Considered here is the case where the measurement noise at stage k is correlated with the plant noise at stage k−1. When these noises are Gaussian, the minimum variance estimate is the maximum a posteriori likelihood estimate [2]. The transition mechanism of interest is described in state-variable form by the first-order vector difference equation

    x(k+1) = Φ(k+1, k)x(k) + Γ(k)w(k).    (1)

The observation process is given by

    z(k) = H(k)x(k) + v(k)    (2)

where w(k) and v(k) are stochastic processes with moments

    E{w(k)} = 0,   E{v(k)} = 0
    cov{w(j), w(k)} = V_w(j) δ_K(j − k)
    cov{v(j), v(k)} = V_v(j) δ_K(j − k)
    cov{v(j), w(k)} = V_vw(j) δ_K(j − k − 1).

Note that the plant noise w(k−1) is correlated with the measurement noise v(k). This estimation problem arises naturally in the development of discrete models for continuous systems. Consider for example the continuous system

    ẋ(t) = F(t)x(t) + G(t)w(t)
    z(t) = H(t)x(t) + v(t)

where w(t) and v(t) are independent white noise processes with

    cov{w(t), w(τ)} = Ψ_w(t) δ(t − τ)
    cov{v(t), v(τ)} = Ψ_v(t) δ(t − τ).

It is desired to form a discrete model for the system so that, for example, a digital computer could be directly applied to the filter algorithm. Using the standard convolution integral,* x(k+1) can be written in terms of x(k) as

    x(k+1) = Φ(k+1, k)x(k) + ∫_k^{k+1} Φ(k+1, τ)G(τ)w(τ) dτ

or equivalently

    x(k+1) = Φ(k+1, k)x(k) + w*(k)

where

    w*(k) = ∫_k^{k+1} Φ(k+1, τ)G(τ)w(τ) dτ.

The covariance of the discrete white noise w*(k) is given by

    cov{w*(j), w*(k)} = [∫_j^{j+1} Φ(j+1, τ)G(τ)Ψ_w(τ)G^T(τ)Φ^T(j+1, τ) dτ] δ_K(j − k).

A realistic discrete observation model can be obtained by assuming that the continuous observation is averaged over a finite time interval Δ, so that

    z(k) = (1/Δ) ∫_{k−Δ}^{k} z(t) dt = (1/Δ) ∫_{k−Δ}^{k} [H(t)x(t) + v(t)] dt.

* This work was supported by the Air Force Office of Scientific Research, Office of Aerospace Research, United States Air Force, under Contract F44620-68-C-0023. Received 30 November 1970; revised 11 January 1971. This article was approved for publication in revised form by associate editor L. Meier.
† Information and Control Sciences Center, Southern Methodist University, Dallas, Texas 75222, U.S.A.
* For simplicity the sample interval T is assumed to be unity.
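As a concrete illustration of the discretization step above, the following is a minimal numerical sketch for a scalar system ẋ = f·x + w (the function name `discretize_scalar` and all parameter values are illustrative assumptions, not from the paper). It computes Φ(k+1, k) = e^{fT} and evaluates the covariance of w*(k) by midpoint quadrature, which can be checked against the scalar closed form Ψ_w(e^{2fT} − 1)/(2f).

```python
import math

def discretize_scalar(f, psi_w, T=1.0, n=10000):
    """Discretize the scalar system xdot = f*x + w, cov{w(t), w(tau)} = psi_w*delta(t-tau).

    Returns (phi, v_wstar): the transition scalar Phi(k+1, k) = exp(f*T) and the
    covariance of the discrete noise w*(k) = int_k^{k+1} Phi(k+1, tau) w(tau) dtau,
    evaluated by midpoint quadrature of  int_0^T exp(2 f (T - tau)) psi_w dtau.
    """
    phi = math.exp(f * T)
    h = T / n
    v_wstar = sum(math.exp(2.0 * f * (T - (i + 0.5) * h)) * psi_w * h
                  for i in range(n))
    return phi, v_wstar

phi, v = discretize_scalar(f=-0.5, psi_w=2.0)
# scalar closed form: psi_w * (e^{2fT} - 1) / (2f)
exact = 2.0 * (math.exp(2 * -0.5) - 1.0) / (2 * -0.5)
assert abs(v - exact) < 1e-3
```

For a stable pole (f < 0) the discrete noise covariance is finite for any T; the quadrature form is the one that generalizes to the matrix case above.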

This would represent the practical case of non-impulse sampling. This expression for z(k) may be written as

    z(k) = (1/Δ) ∫_{k−Δ}^{k} H(t)Φ(t, k) dt x(k)
           + (1/Δ) ∫_{k−Δ}^{k} H(t) ∫_k^t Φ(t, τ)G(τ)w(τ) dτ dt
           + (1/Δ) ∫_{k−Δ}^{k} v(t) dt

by the use of the convolution form for x(t) in terms of x(k). By reversing the order of integration on the double integral and defining

    Ξ(τ) = ∫_{k−Δ}^{τ} H(t)Φ(t, τ) dt

the observation z(k) may be written as

    z(k) = H*(k)x(k) + v*(k)

where

    H*(k) = (1/Δ) ∫_{k−Δ}^{k} H(t)Φ(t, k) dt

and

    v*(k) = (1/Δ) ∫_{k−Δ}^{k} [Ξ(τ)G(τ)w(τ) + v(τ)] dτ.

Now the covariance of v*(k) is given by

    cov{v*(j), v*(k)} = (1/Δ²) ∫_{j−Δ}^{j} [Ξ(τ)G(τ)Ψ_w(τ)G^T(τ)Ξ^T(τ) + Ψ_v(τ)] dτ δ_K(j − k).

The interesting fact is that the discrete random processes v*(j) and w*(k) are correlated even though v(t) and w(t) are independent. It is easy to show that

    cov{v*(j), w*(k)} = (1/Δ) ∫_{j−Δ}^{j} Ξ(τ)G(τ)Ψ_w(τ)G^T(τ)Φ^T(j, τ) dτ δ_K(j − k − 1)
                      = V_vw(j) δ_K(j − k − 1).

Note that v*(k) and w*(k−1) are correlated as postulated in the above model, rather than v*(k) and w*(k) as has been previously treated in the literature. The essential points in the development of a filtering algorithm for this correlated noise relation are given in Table 1.

Algorithm

A linear estimator is obtained by assuming Gaussian distributions for w(k), v(k) and x(0). Under this assumption it is easy to show that x(k) and z(k) are also Gaussian for all k ≥ 0. To obtain the maximum a posteriori likelihood (MAP) estimator, it is necessary to determine the conditional probability density of x(k) given all observations up to and including z(k). The probability density functions p_x(α) and p_{x|z}(α|β) will be written simply as p[x] and p[x|Z], and Z(k) = {z(k), z(k−1), z(k−2), ...} will denote the sequence of observation vectors representing all observations up to and including z(k). The MAP estimate, designated by x̂(k), is determined by solving the equation

    ∂/∂x(k) ln p[x(k)|Z(k)] |_{x(k) = x̂(k)} = 0.    (3)

The theorem of joint probability is used to simplify equation (3), along with the standard method of maximizing the logarithm of the conditional density. The MAP estimate is then given by the expression

    ∂/∂x(k) ln{p[z(k)|x(k), Z(k−1)] p[x(k)|Z(k−1)]} |_{x(k) = x̂(k)} = 0.    (4)

The indicated probabilities are Gaussian; thus, once the first and second moments of the required densities are specified, with proper substitution the MAP estimate can be found. The final form of the discrete sequential algorithm is summarized for easy reference in Table 1.

TABLE 1. FILTER ALGORITHM

Message model:       x(k+1) = Φ(k+1, k)x(k) + Γ(k)w(k)
Observation model:   z(k) = H(k)x(k) + v(k)
Prior statistics:    E{w(k)} = 0,  E{v(k)} = 0,  E{x(0)} = μ_x(0)
                     cov{w(j), w(k)} = V_w(j) δ_K(j − k)
                     cov{v(j), v(k)} = V_v(j) δ_K(j − k)
                     cov{v(j), w(k)} = V_vw(j) δ_K(j − k − 1)
                     cov{x(0)} = V_x(0)
Filter algorithm:    x̂(k) = Φ(k, k−1)x̂(k−1) + K(k)[z(k) − H(k)Φ(k, k−1)x̂(k−1)]
Gain equation:       K(k) = [V_x̃(k|k−1)H^T(k) + Γ(k−1)V_vw^T(k)]
                            × {H(k)V_x̃(k|k−1)H^T(k) + H(k)Γ(k−1)V_vw^T(k)
                               + V_vw(k)Γ^T(k−1)H^T(k) + V_v(k)}^{−1}
Variance algorithms: V_x̃(k|k−1) = Φ(k, k−1)V_x̃(k−1)Φ^T(k, k−1) + Γ(k−1)V_w(k−1)Γ^T(k−1)
                     V_x̃(k) = [I − K(k)H(k)]V_x̃(k|k−1) − K(k)V_vw(k)Γ^T(k−1)
Initial conditions:  x̂(0|0) = x̂(0) = μ_x(0),  V_x̃(0|0) = V_x(0)
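The sequential algorithm of Table 1 can be sketched in code. The following is a minimal NumPy implementation of one filter step; the function and variable names are my own (the paper gives no code), and the cross-covariance argument `V_vw` stands for cov{v(k), w(k−1)}.

```python
import numpy as np

def correlated_filter_step(x_hat, V_x, z, Phi, Gamma, H, V_w, V_v, V_vw):
    """One step of the sequential filter of Table 1, where the current
    measurement noise v(k) is correlated with the previous-stage plant
    noise w(k-1) through V_vw = cov{v(k), w(k-1)}.

    Returns the updated estimate x_hat(k) and error covariance V_x(k).
    """
    # One-stage prediction covariance: Phi V Phi^T + Gamma V_w Gamma^T
    V_pred = Phi @ V_x @ Phi.T + Gamma @ V_w @ Gamma.T
    # Gain equation; the Gamma V_vw^T terms are the correlation correction
    S = (H @ V_pred @ H.T + H @ Gamma @ V_vw.T
         + V_vw @ Gamma.T @ H.T + V_v)
    K = (V_pred @ H.T + Gamma @ V_vw.T) @ np.linalg.inv(S)
    # Filter update on the measurement residual
    x_pred = Phi @ x_hat
    x_new = x_pred + K @ (z - H @ x_pred)
    # Variance update, including the -K V_vw Gamma^T correlation term
    V_new = (np.eye(len(x_hat)) - K @ H) @ V_pred - K @ V_vw @ Gamma.T
    return x_new, V_new
```

With V_vw = 0 the extra terms vanish and the gain and variance updates reduce to the standard discrete Kalman filter equations, which is a convenient sanity check.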

Conclusion

Discrete minimum variance linear estimation for the case where the current measurement noise, v(k), is correlated with the plant noise of the previous stage, w(k−1), has been accomplished. The sequential estimation algorithm was developed by deriving the maximum a posteriori likelihood estimate, which is the minimum variance estimate for linear systems with Gaussian noises. The direct extension of this case would consider the current measurement noise, v(k), correlated with previous/future plant noise, w(k±n), n > 1. Multiple correlation may also be considered; for example, v(k) correlated with w(k−10), w(k−5) and w(k+3). The main application for this type of algorithm is in systems where random sampling occurs and a periodic discrete-time filter is desired [3]. The random sampling may be due to a Poisson sampler, skips, jitter, etc., where the mean sample rate is approximately equal to the discrete-time rate of processing the observed data. For this case, noise correlation of the type considered here occurs.
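The cross-correlation produced by averaged (non-impulse) sampling can be checked by simulation. The following Monte Carlo sketch assumes a scalar system with F = 0, G = H = 1, Ψ_w = Ψ_v = 1 and Δ = T = 1; all of these choices, and the resulting value −Ψ_w/2 for cov{v*(k), w*(k−1)} under the convention v*(k) = z(k) − H*(k)x(k), are illustrative workings rather than results quoted from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N = 20000, 100          # Monte Carlo trials, grid points per unit interval
dt = 1.0 / N
# Brownian paths W(t) = int w dtau over one interval (Psi_w = 1), one per trial
dW = rng.normal(0.0, np.sqrt(dt), size=(M, N))
W = np.cumsum(dW, axis=1)
# w*(k-1) = W(k) - W(k-1); with F = 0, x(k) = x(k-1) + w*(k-1)
w_star = W[:, -1]
# v*(k) = (1/Delta) int [x(t) - x(k)] dt + (1/Delta) int v dt, Delta = 1
avg_dev = (W - W[:, -1:]).mean(axis=1)
v_meas = rng.normal(0.0, np.sqrt(dt), size=(M, N)).sum(axis=1)  # int v dt, Psi_v = 1
v_star = avg_dev + v_meas
# empirical cov{v*(k), w*(k-1)}; theory here gives -int_0^1 (1 - t) dt = -1/2
c = float(np.mean(v_star * w_star))
assert abs(c + 0.5) < 0.1
```

The nonzero estimate (near −0.5 in this setup) confirms that averaged sampling correlates the discrete measurement noise with the plant noise of the previous stage, even though the underlying continuous noises are independent.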

References

[1] A. P. SAGE and J. L. MELSA: Estimation Theory with Applications to Communications and Control, p. 282. McGraw-Hill, New York (1971).
[2] H. L. VAN TREES: Detection, Estimation and Modulation Theory, Part I, p. 62. John Wiley, New York (1968).
[3] W. J. STEINWAY and J. L. MELSA: Discrete estimation algorithms for randomly sampled stochastic signals. IEEE Trans. Aut. Control AC-15, 335–339 (1970); (also JACC 1970).
