
Mechanical Systems and Signal Processing (1990) 4(5), 355-365

NON-STATIONARY MODELLING OF VIBRATION SIGNALS FOR MONITORING THE CONDITION OF MACHINERY

QI ZHUGE, YONGXIANG LU AND SHICHAO YANG

Institute for Fluid Power Transmission and Control, Zhejiang University, Hangzhou, 310027, The People’s Republic of China (Received 7 March 1989, accepted 28 September 1989)

Most previously developed algorithms used for machinery diagnostics and condition monitoring are based on the assumption of signal stationarity. Little attention has been paid to the time-dependent details of vibration signals. In this paper, non-stationary modelling of vibration signals is proposed. An auto-regressive model and the corresponding processes of feature extraction and condition monitoring are investigated in detail. Orthogonal transformation is then introduced for both feature compression and the establishment of a statistical model of the template representing the normal condition. Lastly, the monitoring measure is determined by using similarity analysis of pattern recognition theory. An example for the vibration monitoring of a hydraulic piston pump shows how the proposed method can be used.

1. INTRODUCTION

The measurement and analysis of structural vibration signals are effective in machinery working condition monitoring and diagnostics problems. Vibration signals are sensitive to mechanical faults such as surface wear, structural deformation, imbalance, misalignment, etc. Volin and Hundal have offered three excellent reviews concerning recent developments in this field [1-3]. Many signal processing algorithms have been developed for various faults of different components and systems to extract specific features or discriminants that describe specific working conditions or mechanical faults. The monitoring features proposed decades ago, such as rms value, crest factor and autopower spectrum, are now widely used. With the advances in computers and the advent of the signal processing microchip, more sophisticated algorithms have been developed, including ZOOM-FFT, cepstrum, bispectrum, comb-filter, adaptive noise cancelling, envelope analysis, auto-regressive modelling, auto-regressive moving average modelling, time synchro-averaging, inverse filter, orthogonal filter, Hilbert transform etc. Most algorithms, including those listed above, are based on the assumption of stationarity of vibration signals. Less attention has been paid to the time-dependent details of the signal. Though the phase spectrum of a Fourier transform contains time information, no useful discriminants can be extracted from a ragged phase spectrum due to noise contamination. Parametric models (AR, ARMA) are powerful for modelling short time series and non-stationary processes. Unfortunately, not many researchers have taken advantage of this approach. Kitagawa and Gersch [4] have made a thorough investigation into time-variant AR modelling and have concluded that the non-stationary modelling approach could be very powerful if used in condition monitoring and diagnostics. This paper reports an application of this idea. Firstly, non-stationary modelling approaches are discussed. The AR model and the corresponding processes of feature extraction and condition monitoring are investigated in detail. The orthogonal transform is then introduced for both feature compression and the establishment of a statistical model of the template representing the normal condition of the machine. Finally, a monitoring measure is determined by means of similarity analysis using pattern recognition theory.

2. NON-STATIONARY MODELLING OF MECHANICAL VIBRATION SIGNALS

2.1. PRODUCTION OF A VIBRATION SIGNAL

In general, vibration signals can be expressed as in Fig. 1. In condition monitoring, the forced excitation of vibration sources can be classified into three types:

Figure 1. Production of a vibration signal: source x(s), transmission path H(s), response y(s).

(1) Stationary periodic excitation. The excitation force produced in rotational systems belongs to this type. Stationary periodic forces are produced by mass imbalance, geometric imbalance, misalignment, oil whirl, etc. The excitation and response are essentially suitable for harmonic factorisation. Spectrum analysis based on the FFT, including cepstrum and ZOOM-FFT, is effective.

(2) Stationary random excitation. This is mainly produced in the processes of surface wear, pneumatic noise, turbulence etc.; structural resonant vibration may be excited by such wideband excitation. Considering the stationarity of the signal, auto-spectral analysis based on the FFT is also effective. Various filtering techniques make it possible to extract specific features within certain frequency bands.

(3) Impact excitation (periodic). This is related to the main topic of this paper. The excitation forces produced in reciprocating machinery and by many components, such as bearings and gears, may be classified as impact excitation. The vibration responses of typical reciprocating machines (piston pump, compressor, engine etc.) are closely related to the resonant vibration of the structure excited by the movements of the pistons. Strong non-stationary events appear in local time spans within one period. A great deal of information on the working condition exists in time-dependent models. Thus, to investigate the non-stationarity of vibration responses, the time-variant properties of data models are necessary.

2.2. SEVERAL TIME VARIANT MODELS

2.2.1. Short time Fourier transform (STFT) [5, 6]
The time series x(k), with a window w(k), is transformed by the STFT

X_n(e^{j\omega}) = \sum_{m=-\infty}^{+\infty} w(n-m)\, x(m)\, e^{-j\omega m}                    (1)

into the frequency domain. X_n(e^{j\omega}) is a function of both time and frequency. No averaging can be applied to the STFT. The STFT is widely used for the non-stationary modelling of speech signals. Eight years ago, the B&K company developed a method similar to the STFT for monitoring combustion engines. No further investigation has been reported recently.
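As a minimal illustration of equation (1) (not part of the original paper), the following Python sketch computes a short-time spectrum of a toy impact-type signal; the Hanning window, window length, hop size and test signal are arbitrary assumptions.

```python
import numpy as np

def stft(x, win_len=64, hop=16):
    """Short-time Fourier transform of equation (1): slide a window w over the
    signal and Fourier-transform each windowed segment."""
    w = np.hanning(win_len)                      # window w(k); an assumed choice
    frames = []
    for n in range(0, len(x) - win_len + 1, hop):
        frames.append(np.fft.rfft(w * x[n:n + win_len]))
    return np.array(frames)                      # rows: time frames, columns: frequency bins

# toy quasi-periodic impact response, for illustration only
t = np.arange(4096)
x = np.exp(-0.02 * (t % 512)) * np.sin(0.3 * t)
X = stft(x)
print(X.shape)                                   # (number of frames, win_len // 2 + 1)
```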

2.2.2. Wigner spectrum [7, 8]
The Wigner spectrum of a continuous time function f(t) is

W_f(t, \omega) = \int_{-\infty}^{+\infty} f\!\left(t+\frac{\tau}{2}\right) f^{*}\!\left(t-\frac{\tau}{2}\right) e^{-j\omega\tau}\, d\tau                    (2)

where * indicates complex conjugation. Instantaneous power is defined by the Wigner spectrum. The Wigner spectrum is also a two-dimensional function of time and frequency.

2.2.3. AR (auto-regressive) model
The AR model is expressed as:

x(k) = \sum_{j=1}^{p} a_j(k)\, x(k-j) + e(k), \qquad e(k) \sim (0, \sigma_e^2(k)).                    (3a)

The local properties of the time signal are described by the time-dependent AR coefficients a_j(k) and the residual variance σ_e²(k). These parameters can be used as a set of monitoring features. The AR model can also be expressed in the frequency domain as:

P(k, \omega) = \frac{\Delta T\, \sigma_e^2(k)}{\left| 1 - \sum_{j=1}^{p} a_j(k)\, e^{-j\omega j \Delta T} \right|^{2}}                    (3b)

where ΔT is the sampling interval. {P(k, ω_i)} can also be used as a set of monitoring features.
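As a hedged sketch (not taken from the paper), equation (3b) can be evaluated for a given coefficient set and residual variance as follows; the coefficients, sampling interval and frequencies below are illustrative assumptions only.

```python
import numpy as np

def ar_spectrum(a, sigma2, dt, freqs):
    """AR power spectrum of equation (3b):
    P(w) = dt * sigma2 / |1 - sum_j a_j * exp(-j * w * j * dt)|**2."""
    j = np.arange(1, len(a) + 1)
    w = 2.0 * np.pi * np.asarray(freqs, dtype=float)
    denom = 1.0 - np.exp(-1j * np.outer(w * dt, j)) @ np.asarray(a, dtype=float)
    return dt * sigma2 / np.abs(denom) ** 2

# illustrative values only (not from the paper)
P = ar_spectrum(a=[1.2, -0.8], sigma2=0.5, dt=1e-5, freqs=[100.0, 2000.0, 6000.0])
print(P)
```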

2.2.4. ARMA (auto-regressive moving average) model
Similar to the AR model, the time variant ARMA model is expressed as follows:

x(k) = \sum_{j=1}^{p} a_j(k)\, x(k-j) + \sum_{j=1}^{q} b_j(k)\, e(k-j)                    (4)

and the ARMA model can be factorised into the frequency domain as in equation (3b).

2.3. TIME VARIANT AR MODEL

The powerful ability of the AR model for short-time series modelling is guaranteed by its strong extrapolation of the auto-correlation function. The time variant AR model is investigated in detail for the non-stationary process of a quasiperiodic vibration signal. We first study a simple AR model: a response signal excited by two impacts and transmitted through a second-order transfer function is shown in Fig. 2. Figure 3 is its local variance, which proved very useful when the authors applied it to condition monitoring problems. The expression of the local variance is

\sigma^2(k) = \frac{1}{p} \sum_{j=1}^{p} x^2(k-j)                    (5)

which in fact is equivalent to the residue variance of an AR model with an order 0. Although the local variance function that represents the instantaneous power of the time signal is very simple, it is a powerful monitoring feature, especially when combined with band-pass filtering techniques.


Figure 2. Response excited by two impacts and transmitted through a second-order transfer function.

Figure 3. The local variance of the time signal, p = 10.
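A minimal sketch of equation (5) (assuming a window of the previous p samples; not part of the original paper) is given below. Applying it to a band-pass filtered signal gives the combined feature mentioned above.

```python
import numpy as np

def local_variance(x, p=10):
    """Local variance of equation (5): mean of x^2 over the previous p samples,
    i.e. the residual variance of a zero-order AR model."""
    x = np.asarray(x, dtype=float)
    out = np.zeros(len(x))
    for k in range(p, len(x)):
        out[k] = np.mean(x[k - p:k] ** 2)
    return out
```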

Further work is concentrated on the algorithms for time variant AR modelling. Many approaches have been proposed for obtaining time-dependent AR coefficients, such as adaptive algorithms [9, 10], the Kalman filter method [4] and the algorithms proposed by Segen and Sanderson [11], Charbonnier et al. [12] etc. Two approaches are adopted here for machinery condition monitoring. One is the time variant algorithm based on Kalman filter theory; it is used for the establishment of the template model which represents the normal working condition, and a lot of computing time has to be used to obtain a very accurate time variant model. The other is a time invariant algorithm applied to the signal whose working condition is to be evaluated. This method is based on the assumption of local stationarity of the time signal, and the computing time is relatively short.

2.3.1. Time invariant algorithm
The time signal x(k) is divided into overlapping short spans and every span is assumed to be locally stationary so that a time invariant algorithm can be used. The final computed results are several series of time-dependent model coefficients. The AR predictor is described as

\hat{x}(k) = -a_1 x(k-1) - \cdots - a_p x(k-p).                    (6)

The parameter estimation is achieved by minimising the following loss function:

L(a_1, a_2, \ldots, a_p) = \sum_{k=p+1}^{N} \left[ x(k) - \hat{x}(k) \right]^2.                    (7)


For convenience, suppose the parameter vector θ = (a_1, a_2, ..., a_p)^T. Given the order p, θ can be obtained from equation (8):

R\theta = -r                    (8)

where R = (r_{ij}) is the covariance matrix and r = (r_{01}, r_{02}, \ldots, r_{0p})^T, with

r_{ij} = \sum_{k=p+1}^{N} x(k-i)\, x(k-j).                    (9)
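The following sketch (an assumption-laden illustration, not the authors' implementation; as noted in the next paragraph, the paper itself uses the Marple algorithm) solves the normal equations (8)-(9) directly by least squares for one locally stationary span, using the sign convention of equation (6).

```python
import numpy as np

def ar_fit_covariance(x, p=4):
    """Solve R theta = -r (equations (8)-(9)) for one span.
    With theta = (a_1, ..., a_p), the predictor of equation (6) is
    x_hat(k) = -a_1 x(k-1) - ... - a_p x(k-p)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    R = np.empty((p, p))
    r = np.empty(p)
    for i in range(1, p + 1):
        r[i - 1] = np.sum(x[p:N] * x[p - i:N - i])                     # r_0i
        for j in range(1, p + 1):
            R[i - 1, j - 1] = np.sum(x[p - i:N - i] * x[p - j:N - j])  # r_ij
    theta = np.linalg.solve(R, -r)
    past = np.stack([x[p - j:N - j] for j in range(1, p + 1)], axis=1)  # x(k-1) ... x(k-p)
    resid = x[p:N] + past @ theta                                       # x(k) - x_hat(k)
    return theta, float(np.mean(resid ** 2))                            # coefficients, residual variance
```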

Various algorithms can be found in the textbooks for the estimation of θ (see the review paper [13]). The Marple algorithm [14] is used in our research work, with the order p fixed. This model is not as accurate as the time variant model given below, but the computing time is very short.

2.3.2. Time variant AR model
The essence of this algorithm is that the AR model changes little with the input of each sampling point, so the model always follows the local properties of the non-stationary signal. Bohlin [15] and Isaksson et al. [16] have applied this method to the analysis of EEG signals. Kitagawa and Gersch have also made a thorough investigation of this field [4]. First suppose the data vector X(k) is

X(k) = \left( -x(k-1), -x(k-2), \ldots, -x(k-p) \right)^T.

Equation (8) can be written as:

\left[ \sum_{i=1}^{k} X(i) X^T(i) \right] \hat{\theta}(k) = \sum_{i=1}^{k} X(i)\, x(i).                    (10)

\hat{\theta}(k) indicates that the parameters are time variant. The inverse matrix is then

P(k) = \left[ \sum_{i=1}^{k} X(i) X^T(i) \right]^{-1}.                    (11)

\hat{\theta}(k) can be estimated iteratively in a Kalman filter process [17] as follows:

\hat{\theta}(k) = \hat{\theta}(k-1) + P(k) X(k) \left( x(k) - X^T(k)\, \hat{\theta}(k-1) \right)                    (12)

P(k) = P(k-1) - P(k-1) X(k) \left( 1 + X^T(k) P(k-1) X(k) \right)^{-1} X^T(k) P(k-1).                    (13)

Equation (12) means that \hat{\theta}(k-1) will be updated to \hat{\theta}(k) as soon as one new sampling point enters the estimation. If the parameter set θ(k) is constant, equations (12) and (13) guarantee that the estimate \hat{\theta}(k) converges to θ as k → ∞. For a changing θ(k), suppose θ(k) is constrained by a first-order Markov process:

\theta(k) = \theta(k-1) + w_{\theta}(k)                    (14)

where μ is the average change rate and the covariance matrix of the vector w_θ(k) is μ²I. The AR process described with this vector is

x(k) = X^T(k)\, \theta(k) + e(k).                    (15)

Under the constraints of equation (14), the recursive equation (12) reappears, while μ²I is added to the right-hand side of equation (13). The data produced in the above Kalman filter process is p times as much as the input data, so it is impossible to use the AR coefficients corresponding to every time k as primary monitoring features. Primary feature reduction is obviously necessary and this is discussed in the next section.
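A hedged sketch of the recursion (12)-(14) follows (not the authors' code; the initial values, the order and the value of μ² are assumptions). At each step μ²I is added to P(k-1), corresponding to the random-walk prior of equation (14), so that the estimate can track a slowly changing θ(k).

```python
import numpy as np

def time_variant_ar(x, p=4, mu2=1e-4):
    """Kalman-filter-type recursion of equations (12)-(13) for time-dependent
    AR coefficients, with the data vector X(k) = (-x(k-1), ..., -x(k-p))^T."""
    x = np.asarray(x, dtype=float)
    theta = np.zeros(p)
    P = 1e3 * np.eye(p)                       # large initial uncertainty (assumed)
    history = np.zeros((len(x), p))
    for k in range(p, len(x)):
        X = -x[k - p:k][::-1]                 # (-x(k-1), ..., -x(k-p))
        P = P + mu2 * np.eye(p)               # random-walk prior, equation (14)
        err = x[k] - X @ theta                # one-step prediction residual e(k)
        gain = P @ X / (1.0 + X @ P @ X)
        theta = theta + gain * err            # equation (12)
        P = P - np.outer(gain, X @ P)         # equation (13)
        history[k] = theta
    return history                            # theta_hat(k) for every time k
```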


3. FEATURE COMPRESSION OF QUASI-PERIODIC SIGNALS VIA ORTHOGONAL TRANSFORM

3.1. DEFINITION
Generally, suppose a time variant model transformation H( ) of the time series x(k) is

\theta(k) = H[x(k)]                    (16)

where θ(k) = [θ_1(k), θ_2(k), ..., θ_m(k)], and the time series x(k) is

x(k) = x_0(k) + n(k).                    (17)

x_0(k) is a periodic signal with a period N; n(k) is noise, which makes x(k) quasiperiodic. Suppose the parameter set θ(k) is also quasiperiodic. Then we have

b^{j}(k) = \theta(jN + k), \qquad k = 0, 1, \ldots, N-1.                    (18)

b^{j}(k) is the feature vector corresponding to time k of the jth period. b^{j}(k) is written as

b^{j}(k) = \left[ b^{j}_1(k), b^{j}_2(k), \ldots, b^{j}_m(k) \right]^T.                    (19)

The feature vector that consists of all primary feature components of one period of the signal is y^{j}:

y^{j} = \left[ b^{jT}(0), b^{jT}(1), \ldots, b^{jT}(N-1) \right]^T                    (20)

y^{j} \sim p_y(y^{j}).                    (21)

3.2. ORTHOGONAL TRANSFORM

The number of feature components of y^{j} is m·N. For example, if N = 200 and m = 10, 2000 primary features would be needed to describe one period of the signal. For the compression of the primary features, an orthogonal transform is introduced by the authors to express the first and the second moments of the probability distribution p_y(y^{j}). Let

M(k) = E_j\left[ b^{j}(k) \right]                    (22)

C(k) = E_j\left[ \left( b^{j}(k) - M(k) \right) \left( b^{j}(k) - M(k) \right)^T \right]                    (23)

so M(k) and C(k) are the mean value vector and covariance matrix respectively. Suppose the function set {φ_i(k)} is orthogonal within the interval k = 0, 1, ..., N−1:

\frac{1}{N} \sum_{k=0}^{N-1} \phi_i(k)\, \phi_j(k) = \delta_{ij}.                    (24)

The orthogonal transform is applied to express the mean value vector and the covariance matrix:

M(k) = \sum_{i} M_i\, \phi_i(k)                    (25a)

C(k) = \sum_{i} C_i\, \phi_i(k).                    (25b)

M_i and C_i are a constant column vector and a constant square matrix respectively. The optimum approximation in the sense of least mean square error can be obtained with the first q terms, according to generalised Fourier analysis theory [18]:

M(k) = \sum_{i=1}^{q} M_i\, \phi_i(k)                    (26a)

C(k) = \sum_{i=1}^{q} C_i\, \phi_i(k), \qquad k = 0, 1, \ldots, N-1                    (26b)


where

M_i = \frac{1}{N} \sum_{k=0}^{N-1} M(k)\, \phi_i(k)                    (27a)

C_i = \frac{1}{N} \sum_{k=0}^{N-1} C(k)\, \phi_i(k), \qquad i = 1, 2, \ldots, q.                    (27b)

We now discuss the realisation of the algorithm.

The averaging operation is realised iteratively. When a new sample of the feature vector b^{j+1}(k) is added to the estimation, and the mean vector and covariance matrix estimates based on the previous j samples are M^{j}(k) and C^{j}(k), then

M^{j+1}(k) = \frac{1}{j+1} \left[ j M^{j}(k) + b^{j+1}(k) \right]                    (28a)

C^{j+1}(k) = \frac{1}{j+1} \left[ j C^{j}(k) + j M^{j}(k) M^{j}(k)^T + b^{j+1}(k)\, b^{j+1}(k)^T \right] - \frac{1}{(j+1)^2} \left[ j M^{j}(k) + b^{j+1}(k) \right] \left[ j M^{j}(k) + b^{j+1}(k) \right]^T,
\qquad k = 0, 1, \ldots, N_{j+1} - 1                    (28b)

which is initialised with M^{1}(k) = b^{1}(k), C^{1}(k) = [0]. Notice that time k runs from 0 to N_{j+1} − 1, which means that there are N_{j+1} rather than a constant N column feature vectors in the set {b^{j+1}(k)}, because the process is only quasiperiodic. A re-sampling process is therefore necessary to meet the needs of equations (28a) and (28b). Then let

M^{j+1}(k) = M^{j}(k) + \sum_{i=1}^{q} \Delta M_i^{j}\, \phi_i(k)                    (29a)

C^{j+1}(k) = C^{j}(k) + \sum_{i=1}^{q} \Delta C_i^{j}\, \phi_i(k)                    (29b)

in which M^{j}(k) and C^{j}(k) are obtained from the re-sampling process as:

M^{j}(k) = \sum_{i=1}^{q} M_i^{j}\, \phi_i(k)                    (30a)

C^{j}(k) = \sum_{i=1}^{q} C_i^{j}\, \phi_i(k), \qquad k = 0, 1, \ldots, N_{j+1} - 1.                    (30b)

Similar to equation (27), we have

\Delta M_i^{j} = \frac{1}{N_{j+1}} \sum_{k=0}^{N_{j+1}-1} \left( M^{j+1}(k) - M^{j}(k) \right) \phi_i(k)                    (31a)

\Delta C_i^{j} = \frac{1}{N_{j+1}} \sum_{k=0}^{N_{j+1}-1} \left( C^{j+1}(k) - C^{j}(k) \right) \phi_i(k), \qquad i = 1, 2, \ldots, q.                    (31b)


Therefore, in each step of the iteration, the orthogonal transform is applied to (M^{j+1}(k) − M^{j}(k)) and (C^{j+1}(k) − C^{j}(k)) instead of to M^{j+1}(k) and C^{j+1}(k). Finally we have

M_i^{j+1} = M_i^{j} + \Delta M_i^{j}                    (32a)

C_i^{j+1} = C_i^{j} + \Delta C_i^{j}, \qquad i = 1, 2, \ldots, q.                    (32b)

The advantages of the approach proposed above are:
(1) The number of features is greatly reduced; the stored data amount to only q/N of the primary features.
(2) The orthogonal transform with a limited order q is equivalent to a low-pass filter: a smooth approximation curve is obtained when the orthogonal transform is applied.
(3) The method is particularly suitable for quasiperiodic signals. The period intervals N_j cannot be equal to each other, owing to manufacturing errors in mechanical components, the instability of the rotating speed and the influence of the noise n(k). The regular averaging method therefore cannot be used in the computation of M(k) and C(k). If the orthogonal transform is applied, M^{j}(k) and C^{j}(k) are re-sampled in each step of the iteration, which makes it possible to realise the ensemble average.

There are many orthogonal functions that may be used as {φ_i(k)}. The sinusoid orthogonal function series {sin(nkπ/N)} (n = 1, 2, ..., q; k = 0, 1, ..., N−1) is used in our study.
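As an illustration of the compression step (equations (26a) and (27a)) for the mean-value curves only — the covariance matrices C(k) are handled in the same way — the following sketch projects M(k) onto the sinusoidal basis. It is an assumption-laden example, not the authors' code: the normalisation is taken as the energy of each basis function rather than the paper's 1/N convention, and the array shapes are chosen for illustration.

```python
import numpy as np

def sinusoid_basis(N, q):
    """Orthogonal function set used in the paper: phi_n(k) = sin(n*pi*k/N),
    n = 1..q, k = 0..N-1; returned as a (q, N) array."""
    k = np.arange(N)
    return np.array([np.sin(n * np.pi * k / N) for n in range(1, q + 1)])

def compress_mean(M, q):
    """Project the mean curves M(k) (shape (N, m)) onto the first q basis
    functions (cf. equation (27a)) and return the coefficients M_i together
    with the smooth reconstruction of equation (26a)."""
    N = M.shape[0]
    phi = sinusoid_basis(N, q)
    norms = np.sum(phi ** 2, axis=1)             # energy of each basis function
    coeffs = (phi @ M) / norms[:, None]          # M_i, shape (q, m)
    M_smooth = phi.T @ coeffs                    # smooth approximation over k = 0..N-1
    return coeffs, M_smooth
```

Storing the q coefficient vectors instead of all N samples of M(k) gives the q/N data reduction of advantage (1), and truncating the series at order q acts as the low-pass smoothing of advantage (2).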

4. SIMILARITY ANALYSIS

Similarity analysis from pattern recognition is introduced here for estimating the deviation of a response signal from its template. Suppose a time signal is input and is transformed by the time variant model H( ), and a set of primary features {g(k)}, k = 0, 1, ..., N−1, is obtained. The point-to-set distance between {g(k)} and the template {b^{j}(k)} ~ (M(k), C(k)) may then be determined by the Mahalanobis distance [19]:

d = \frac{1}{N} \sum_{k=0}^{N-1} \left( g(k) - M(k) \right)^T C^{-1}(k) \left( g(k) - M(k) \right)                    (33)

where d is a scalar measure. The larger the value of d, the further {g(k)} deviates from its template. Equation (33) provides a powerful measure for condition monitoring: it not only monitors the feature variation in a statistical sense but also takes the correlation between the feature components into consideration.
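A minimal sketch of equation (33) (not from the paper; the array shapes are assumptions) is given below. A large value of d indicates that the monitored signal has drifted away from the normal-condition template, as in the example of the next section.

```python
import numpy as np

def monitoring_distance(g, M, C):
    """Point-to-set Mahalanobis distance of equation (33).
    g, M: arrays of shape (N, m) with the test features and the template mean;
    C: array of shape (N, m, m) with the template covariance matrix at each k."""
    N = g.shape[0]
    d = 0.0
    for k in range(N):
        diff = g[k] - M[k]
        d += diff @ np.linalg.solve(C[k], diff)   # (g - M)^T C^{-1} (g - M)
    return d / N
```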

5. EXAMPLE

A typical reciprocating machine, a hydraulic piston pump, is investigated as an example using the method proposed above. Figure 4 shows a time record of acceleration measured on the housing of a piston pump. The impulses excited by the movement of each of the seven pistons can be easily distinguished. First, the statistical model of the template which represents the normal working condition was established as follows:
(1) The transform H( ) was realised with the time variant AR model based on the Kalman filter; the sampling speed was 100 kHz, the AR order p was fixed at 4 and the data length for the AR input was 60 samples.
(2) A search was made for the positions of the local maxima of the residue variance; the interval between two such maxima acted as the quasiperiod N.


Figure 4. Time record of the acceleration response.

(3) The primary feature components were composed of the AR power spectrum amplitudes at the following frequencies: 100, 2000, 3750, 6000, 8000, 10 000 and 12 000 Hz. The selection of these frequencies was based on an in-depth investigation into the vibration sources and transmission properties of the pump; detailed discussion is omitted for reasons of space.
(4) A large number of samples (more than 500) were recorded from ten pumps of the same type, and M(k) and C(k) were estimated.

A time-frequency spectrum array around one period of the time signal is shown in Fig. 5. It is impossible to obtain such a detailed description using traditional spectrum analysis, e.g. the FFT power spectrum. Curves of several components of the mean vector M(k) are given in Fig. 6. Obviously, the curves of feature components that correspond to different frequency points are different.

The time invariant algorithm was then applied to the signal whose working condition was to be evaluated. The AR order p was 4 and the input length was 60. The overlapping length of neighbouring time spans was 40.

I ’ 25 kHz Figure

5. Time-frequency

spectrum

array around

one period

of vibration

signal.


Figure 6. The changes in the three components of M(k) over time.

Five tests were carried out for different conditions:
{g_1(k)}  normal condition
{g_2(k)}  normal condition
{g_3(k)}  with a fault of slipper looseness, looseness gap 0.29 mm
{g_4(k)}  with a fault of slipper looseness, looseness gap 1.33 mm
{g_5(k)}  normal condition, equipped with another type of valve plate.

Mahalanobis distances were calculated according to equation (33) for the above feature sets: d_{g1} = 3.35, d_{g2} = 7.62, d_{g3} = 35.5, d_{g4} = 212.1 and d_{g5} = 87.2. Clearly, the distance measure is effective for judging the similarity between the input signal and its template. A subtle change in the vibration signal, for example that of {g_5(k)}, could not be detected using FFT spectrum analysis, but this measure proved to be very sensitive.

6. CONCLUSIONS

Three problems are discussed in detail in this paper. The first is establishing a time variant model for vibration signals. It is shown that impulse-type responses are produced in many reciprocating machinery systems and by many commonly used mechanical components. The time-dependent information that describes the working condition can be extracted with a time variant model. The AR model is a powerful means for non-stationary signal modelling. Two approaches are proposed for AR coefficient estimation. One, based on Kalman filtering theory, is used for establishing a statistical model of the template; its computing time is lengthy. The other, a time invariant algorithm based on the assumption of local stationarity of the signal, is used for modelling the input signal and is very fast to realise. The second problem is feature compression: an orthogonal transform is introduced to describe the first and the second moments of the template. An ensemble averaging operation is realised for a quasiperiodic signal with the help of the orthogonal transform.


For brevity, no further discussion of the choice of orthogonal functions or of the order q is given. The third problem discussed is the introduction of pattern recognition theory. Similarity analysis is used to judge the deviation between the input signal and the template, so that the monitoring problem is solved. We believe that diagnostic problems could be solved with a pattern classification technique; a method of pattern feature extraction and compression is presented in this paper. The distance d has a clear physical explanation, but the relation between the value of d and the severity of the machinery condition has not been investigated in this paper.

ACKNOWLEDGEMENT

This research work was funded by the China Education Committee and the Open Laboratory of Fluid Power Transmission and Control of Zhejiang University. We are grateful to Professor Kitagawa for allowing us to see his research papers and reports.

REFERENCES

1. M. S. HUNDAL 1986 Shock and Vibration Digest 18(12), 3-10. Mechanical signature analysis.
2. M. S. HUNDAL 1983 Shock and Vibration Digest 15(6), 19-26. Mechanical signature analysis.
3. R. H. VOLIN 1979 Shock and Vibration Digest 11(9), 17-33. Techniques and applications of mechanical signature analysis.
4. G. KITAGAWA and W. GERSCH 1985 IEEE Transactions on Automatic Control AC-30, 48-56. A smoothness priors time-varying AR coefficient modeling of nonstationary covariance time series.
5. M. R. PORTNOFF 1980 IEEE Transactions ASSP-28. Representation of digital signals and systems based on short-time Fourier analysis.
6. S. H. NAWAB 1983 IEEE Transactions ASSP-31, 986. Signal reconstruction from short-time Fourier transform magnitude.
7. D. S. K. CHAN 1982 Proceedings IEEE ICASSP, 1333-1336. A non-aliased discrete-time Wigner distribution for time-frequency signal analysis.
8. W. MARTIN and P. FLANDRIN 1985 IEEE Transactions ASSP-33, 1461-1470. Wigner-Ville spectral analysis of non-stationary processes.
9. S. THOMAS ALEXANDER 1986 Adaptive Signal Processing: Theory and Applications. New York: Springer-Verlag.
10. J. I. MAKHOUL and L. K. COSELL 1981 IEEE Transactions ASSP-29, 654. Adaptive lattice analysis of speech.
11. J. SEGEN and A. C. SANDERSON 1980 IEEE Transactions IT-26, 249-255. Detection of change in a time series.
12. R. CHARBONNIER, M. BARLAUD, G. ALENGRIN and J. MENEZ 1987 Signal Processing 12, 143-151. Results on AR-modelling of nonstationary signals.
13. S. M. KAY and S. L. MARPLE 1981 Proceedings of the IEEE 69, 1380. Spectrum analysis - a modern perspective.
14. S. L. MARPLE 1980 IEEE Transactions ASSP-28, 441-454. A new autoregressive spectrum analysis algorithm.
15. T. BOHLIN 1977 Mathematical Biosciences 35, 221-259. Analysis of EEG signals with changing spectra using a short-word Kalman estimator.
16. A. ISAKSSON, A. WENNBERG and L. H. ZETTERBERG 1981 Proceedings of the IEEE 69, 451-461. Computer analysis of EEG signals with parametric models.
17. K. J. ÅSTRÖM 1970 Introduction to Stochastic Control Theory. London: Academic Press.
18. R. COURANT and D. HILBERT 1983 Methods of Mathematical Physics, Vol. 1.
19. J. T. TOU and R. C. GONZALEZ 1974 Pattern Recognition Principles. Reading, MA: Addison-Wesley.