Multi-dimensional intelligent sensing system using sensor array



Sensors and Actuators A 35 (1992) 1-8

Hiro Yamasaki
Faculty of Engineering, University of Tokyo, Hongo, Tokyo 113 (Japan)

Yukio Hiranaka
Yamagata University, Yonezawa (Japan)

(Received February 21, 1992; accepted March 20, 1992)

Abstract

Two different kinds of multi-dimensional intelligent sensing system are described. One uses a two-dimensional sensor array to visualize the spatial distribution of a gas, and the other visualizes the global structure of a sound field. Point-type multi-sensors, combined with signal processing based on a model of the phenomenon, visualize the reality of the invisible state. These systems help us to understand a multi-dimensional dynamic state.

Introduction

Present sensor technology covers a wide range of physical and chemical quantities. Most sensor devices can sense a certain quantity at the single point where the sensor is installed, but they cannot sense the whole quantity distributed over a wider space. A single sensor device cannot identify the difference of the quantity in the x-, y- or z-coordinate in the space. Sometimes the spatial distribution of a certain quantity is the problem to be investigated; the temporal change in the distribution may also be of interest. For instance, in wave-propagation phenomena, we are usually interested not only in the wave intensity at a single point but also in the position of the wave source or the direction of wave propagation. We obtain detailed information on the wave intensity by observing the waveforms at a certain point; on the other hand, we can obtain the spatial distribution of the wave intensity by observing wavefronts. For visible objects, image sensor devices are very common for describing the spatial distribution and


helpful to our understanding of the object, and we do not discuss them further here. When the object is invisible, its state cannot be grasped easily. Thus a signal-processing system for visualization of the object state is useful. Intelligence for the signal processing plays an important role in the man-machine interface. We usually describe chemical information in terms of the molecules or species involved and their concentrations, which are functions of space and time. Most chemical sensors or analytical instruments transmit only the concentration signal of a certain spatial point at a sampled instant. Figure 1 shows the concentration of a chemical species as a three-dimensional state instead of a concentration-time trend.

*Paper presented at the 6th International Conference on Solid-State Sensors and Actuators (Transducers '91), San Francisco, CA, USA, June 24-28, 1991.

Fig. 1. The concentration of a chemical species can be described by a multi-dimensional state.

0924-4247/92/$5.00 © 1992 Elsevier Sequoia. All rights reserved.

In the Figure, the x-axis corresponds to the position in space, the y-axis represents the time axis and the z-axis represents the concentration of the chemical species. A curved surface in the three-dimensional space, F(x, y, z) = F(p, t, c) = 0, represents all the information about the concentration of the chemical species, where p, t and c denote position, time and concentration, respectively. An intersection that is perpendicular to the x-axis and parallel to the y- and z-axes (hatched in the Figure) makes a curved line [G(t, c) = 0 at p = p0]. The curve G(t, c) = 0 describes the temporal trend of the concentration at a fixed point p0. Another curved line, H(p, c) = 0 at t = t0, represents the spatial distribution of the species at time t0. Neither of these curves alone provides all the information about the species. We can obtain complete objective information on the spatial dependency and temporal trend of the material of interest if multi-dimensional measurement techniques are available.

We describe a chemical sensing system for visualization. The system collects the outputs of gas sensors installed at each nodal point of a lattice array and calculates the spatial distribution by an interpolation based on the diffusion model of the gas. The results are displayed in real time on a CRT, overlapping with a video image of the space being studied. The diffusion process of a gas can thus be visualized in real time, and we can localize the source of an odour by this visual distribution of the gas concentration.

An acoustic field is also multi-dimensional. Visualization of a sound wavefront is likewise achieved by computer interpolation based on a wave-propagation model and a two-dimensional microphone array.

The intelligence of these systems gives us the global structure of multi-dimensional phenomena by a visualization technique that organizes local information from point-like multi-sensors.
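The surface-and-slices picture above can be sketched numerically. A minimal illustration follows; the concentration field c(p, t) below is invented for the example, not taken from the paper:

```python
import numpy as np

# Invented concentration field c(p, t): a Gaussian plume that decays in time.
p = np.linspace(0.0, 1.0, 6)      # positions along the x-axis
t = np.linspace(0.0, 10.0, 11)    # sampling instants along the y-axis
P, T = np.meshgrid(p, t, indexing="ij")
c = np.exp(-((P - 0.5) ** 2) / 0.1) * np.exp(-0.1 * T)

# G(t, c) = 0 at p = p0: the temporal trend at one fixed point.
g = c[2, :]
# H(p, c) = 0 at t = t0: the spatial distribution at one instant.
h = c[:, 4]

# Each cut is a single curve; neither alone carries the full surface F = 0.
print(g.shape, h.shape)
```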

Sensing of a gas-diffusion process

Multi-dimensional sensing is necessary for quick detection and identification of diffusing chemical materials in a complex environment. For example, when we want to localize the correct position of a domestic gas leak, we have to measure the spatial

distribution of the gas and its temporal change. The reason for this is that the leaked gas will diffuse into the air and its spatial distribution will be influenced by the local air flow. In such a situation the leaked gas significantly changes its distribution spatially and temporally, so that spatial scanning by a single point sensor cannot catch the correct source of the leakage. Therefore, a multi-dimensional sensing system that takes broad, simultaneous and continuous gas-density measurements is necessary. Our research is aimed at developing such a multi-dimensional visualization system for various gas distributions in an environmental atmosphere. Visualization is a prompt and robust way of understanding the global situation of a gas-diffusion field, which is invisible. As the gas distribution changes its shape due to various factors, such as diffusion, convection and air flow, our visualization system has intelligent functions that use various techniques adaptively to fit the actual environment. A two-dimensional array of metal oxide semiconductor gas sensors is used in our experimental system. The outputs of the sensors are processed by a digital computer to form a visualized gas-distribution image that is suitable for human understanding.
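As a rough sketch of how an array of point measurements can be densified into a display image, the following standalone bilinear upsampler is illustrative only (the interpolation schemes actually used in the experimental system are described later in the paper):

```python
import numpy as np

def bilinear_upsample(grid, factor):
    """Densify a small sensor grid by bilinear interpolation (sketch only)."""
    n, m = grid.shape
    ys = np.linspace(0, n - 1, (n - 1) * factor + 1)
    xs = np.linspace(0, m - 1, (m - 1) * factor + 1)
    out = np.empty((ys.size, xs.size))
    for a, y in enumerate(ys):
        i = min(int(y), n - 2)          # lower grid row of the current cell
        fy = y - i
        for b, x in enumerate(xs):
            j = min(int(x), m - 2)      # lower grid column of the current cell
            fx = x - j
            out[a, b] = ((1 - fy) * (1 - fx) * grid[i, j]
                         + (1 - fy) * fx * grid[i, j + 1]
                         + fy * (1 - fx) * grid[i + 1, j]
                         + fy * fx * grid[i + 1, j + 1])
    return out

sensors = np.random.rand(8, 8)           # stand-in for one 8 x 8 array snapshot
image = bilinear_upsample(sensors, 10)   # 71 x 71 display image
```

At the sensor positions themselves the interpolated image reproduces the measured values exactly, which is the property that makes the overlaid display trustworthy at the nodal points.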

Experimental visualization system with gas-sensor array [1, 2, 4]

A visualization system is built using a two-dimensional semiconductor gas-sensor array of 64 sensors. The sensor devices are of the Figaro TGS705 type. Figure 2 shows the system with the 8 × 8 sensor array. The outputs of the sensors are digitally processed to form images of the spatial gas distribution on a CRT. The image of the scene is taken by a video camera. These two images are overlaid and displayed on a CRT for real-time recognition and easy understanding of the gas field. Figures 3 and 4 show the temporal and spatial frequency spectra, respectively, which were measured in our laboratory room. According to these Figures, a sampling time interval of 0.3 s is appropriate and 0.2 m is enough for the spatial sampling interval. The gas-distribution image is formed by linear or bilinear interpolation, which is




described later. It takes about 0.2 s to draw one image plane in our current system. It also uses a pseudo-gray-level display scheme, which somewhat obscures the unnaturalness of the interpolation. Figure 5(a) shows sequential images of ethanol vapour distribution; it displays the spatial and temporal changes in the vapour concentration every second. Figure 5(b) shows an overlaid image of ethanol vapour. The vertical and horizontal spacing of the sensors is 0.2 m. The central part that looks like a white cloud indicates the position of the vapour source. Figure 5(c) shows the distribution of ethyl ether vapour, which is diffused in the vertical direction. These images are expected to be of great help for localizing the source of a gas leakage when they are overlaid on the visual image of the scene taken by a video camera. Figure 5(d) shows a distribution of odour from the human body; the odour seems to come from the socks. The original pictures are all coloured, and the objective states are much more clearly grasped than in black and white ones.

Fig. 2. A multi-dimensional sensing and visualization system using an 8 × 8 gas-sensor array.

Fig. 3. Temporal frequency spectrum of the gas-diffusion process measured at a gas sensor.

Fig. 4. Spatial frequency spectrum of the gas-diffusion process.

Sensing and visualization of a sound field [3, 4]

In conventional sound-propagation experiments, the sound pressure is measured point by point with microphone scanning. If we could visualize sound wavefronts and see how they propagate, the results would be useful for acoustic research and development. A real-time sensing and visualizing system is realized using a two-dimensional microphone array, so that the requirement for microphone scanning is completely removed. Two major problems should be solved. The first regards the response of the system: it should be fast enough to respond to sound propagation in real time, and at the same time the moving speed of the visualized image should be slow enough to match the visual perception rate of man. The other problem is the limitation on the number of array elements; the number is determined by considering the computation time and the disturbance on the measured sound field. These problems can be overcome by the intelligence of the system. The speed of wave propagation can be shifted to a rate that man can recognize by special hardware. One example of such hardware is a phase-locked loop (PLL) circuit that traces the sound frequency and provides a slower trigger signal for stroboscopic display of the wavefronts. Computer interpolation is used to make a natural image from the sparse data of the spatially sampled array sensor elements. Figure 6 shows the configuration of the system. The object sound field is sensed by an 8 × 8 two-dimensional microphone array (1.4 m × 1.4 m). The spacing between two microphones is 0.2 m;

this gives the upper limit of the sound frequency as the Nyquist frequency, 1700 Hz. The microphones are of the FET built-in electret condenser type (Matsushita Tsushin WM-063Y, 6 mm diameter, 5 mm long). The variation in unit characteristics is less than 0.5 dB in amplitude and 15° in phase at 850 Hz. The input signal need not be a pure tone, but should be periodic. A PLL circuit is used to track the signal frequency and then to adjust the sampling rate of the sound pressure (Fig. 7). The sound images can be renewed in a synchronized way to show the advance of the wavefronts. Adjusting the sampling period Ts to be

Ts = (n + 1/m)T0    (1)

where n and m are integers and T0 is the period of the input signal, the sound image advances 1/m of a wavelength for each Ts. Parameters n and m can be determined by considering the computation time, the frequency of the sound field and the perception rate of human eyes. For example, with n = 200, m = 10, we can visualize a 400 Hz signal at a rate of two frames per second; the wavefronts advance 1/10 of a wavelength every frame. The images of the calculated wavefronts are overlaid on the scene taken by a video camera. The display helps our understanding of the sound field.

Fig. 5. (a) Sequential images of ethanol vapour distribution at 1 s time intervals (t = 1 to t = 10). (b) Overlaid image of ethanol vapour; a vapour source is located in the centre of the array. (c) Overlaid image of ethyl ether vapour; a source is located in the centre of the array. (d) Odour from a human body.

Fig. 6. Multi-dimensional sound sensing and visualization system.

Fig. 7. The PLL circuit tracks the signal frequency and divides the frequency by a suitable fraction; thus it adjusts the sampling rate of the sound.

Interpolation procedures

The interpolation procedures for obtaining a contour map of the wavefront are discussed here. Three different ways of interpolation are tested: linear, bilinear and sinc interpolation.

Linear interpolation is a simple method that adopts a linear approximation along each of the connecting lines between adjacent array elements. It connects the points of equal phase value on those lines with straight lines. If there exist only two points with the specified value on an element rectangle, the line is uniquely defined. If the number is one or three, it must include a vertex point, and the vertex need not be connected in that element rectangle. If the number is four, there are three choices, as shown in Fig. 8(i)-(iii). Crossings are less acceptable visually; in this case, we decided to draw the pair of lines that are separated the farthest (Fig. 8(iii)).

Bilinear interpolation approximates the values within each element rectangle by a bilinear equation

f(ε, η) = a + (b − a)ε + (c − a)η + (a + d − b − c)εη,  0 ≤ ε ≤ 1, 0 ≤ η ≤ 1    (2)

where ε and η are the normalized x, y coordinates in each element rectangle and a, b, c and d are the values at the vertices (Fig. 9).
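The example values given for eqn (1) can be checked by direct arithmetic; the variable names below simply mirror the quantities named in the text:

```python
# Quick arithmetic check of eqn (1) with the example values from the text
# (400 Hz signal, n = 200, m = 10); illustrative, not code from the paper.
f0 = 400.0                 # signal frequency (Hz)
T0 = 1.0 / f0              # signal period (s)
n, m = 200, 10
Ts = (n + 1.0 / m) * T0    # stroboscopic sampling (frame) period (s)

frames_per_second = 1.0 / Ts
wavelength_fraction_per_frame = 1.0 / m

print(frames_per_second, wavelength_fraction_per_frame)
```

The frame rate comes out at very nearly two frames per second, with each frame advancing the wavefronts by 1/10 of a wavelength, as stated.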

Fig. 8. Linear interpolation connects points of the same value with straight lines. There are three possible ways of connection if four points are to be connected; the system selects the pair of lines that are furthest apart.
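The four-point ambiguity can be resolved programmatically. The paper only states that the pair of lines "separated the farthest" is drawn; maximizing the distance between the two segment midpoints is our assumed way of operationalizing that rule in this sketch:

```python
import math

def pick_connection(points):
    """Choose how to join four equal-value edge points into two contour
    segments.  'Separated the farthest' is operationalized here (an
    assumption) as maximizing the distance between segment midpoints,
    which also rejects the visually unacceptable crossing pairing."""
    assert len(points) == 4
    pairings = [((0, 1), (2, 3)), ((0, 2), (1, 3)), ((0, 3), (1, 2))]

    def midpoint(i, j):
        (x1, y1), (x2, y2) = points[i], points[j]
        return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)

    def separation(pairing):
        (a, b), (c, d) = pairing
        (mx1, my1), (mx2, my2) = midpoint(a, b), midpoint(c, d)
        return math.hypot(mx1 - mx2, my1 - my2)

    best = max(pairings, key=separation)
    return [(points[i], points[j]) for i, j in best]

# Four hypothetical equal-phase points, one on each edge of a unit element
# rectangle; the crossing pairing loses because its midpoints nearly coincide.
segs = pick_connection([(0.3, 0.0), (1.0, 0.4), (0.6, 1.0), (0.0, 0.6)])
```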



Fig. 9. Bilinear interpolation based on eqn (2) for each element rectangle. The values a, b, c and d are obtained at each microphone unit.
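Eqn (2) can be checked directly at the four corners of the element rectangle; with the vertex layout of Fig. 9, f reproduces each vertex value:

```python
def f(eps, eta, a, b, c, d):
    # eqn (2): bilinear interpolation over one element rectangle,
    # 0 <= eps <= 1, 0 <= eta <= 1, with vertex values a, b, c, d
    return a + (b - a) * eps + (c - a) * eta + (a + d - b - c) * eps * eta

a, b, c, d = 1.0, 2.0, 3.0, 4.0   # arbitrary example vertex values
corners = [f(0, 0, a, b, c, d), f(1, 0, a, b, c, d),
           f(0, 1, a, b, c, d), f(1, 1, a, b, c, d)]
print(corners)
```

The four corner evaluations return a, b, c and d respectively, confirming that the interpolant matches the microphone readings at the vertices.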

Contour lines with a certain constant value can be determined from eqn (2) by substituting the value for the left-hand side. If we take f as 0, the lines have a similar meaning to those in Fig. 8. In the bilinear interpolation, lines connecting points of equal value are generally curved. Sinc interpolation uses a sin x/x function, interpolating the value at (x, y) as

f(x, y) = Σi Σj d(i, j) [sin π(x − i)/π(x − i)][sin π(y − j)/π(y − j)],  0 ≤ x ≤ 7, 0 ≤ y ≤ 7    (3)

where d(i, j), i = 0, ..., 7, j = 0, ..., 7, is the value at each microphone element. This calculation can be done by using a fast Fourier transform (FFT) algorithm, because the sinc interpolation is equivalent to an expansion of the frequency band, adding zeros as higher-frequency components. Wavefronts of zero pressure can be drawn by marking the points where the sign of the interpolated value f(x, y) changes. The sinc interpolation is the best interpolation if the number of array elements is sufficiently large and the frequency band of the visualized sound is limited below the Nyquist frequency, which is determined by the spacing of the array. When the size of the array is finite, most errors will appear near the edges of the array.

Fig. 10. Comparison of the interpolation procedures: (i) original, (ii) linear, (iii) bilinear and (iv) sinc. (a) Ideal spherical pattern. (b) Calculated diffraction pattern caused by a wave source at the bottom left corner and an absorbing half wall at the centre.

Figure 10 shows two sets of comparisons using the three interpolation methods, (ii) linear, (iii) bilinear and (iv) sinc, with panel (i) showing the original field images. Figure 10(a) shows an ideal spherical pattern, while Fig. 10(b) shows a calculated diffraction pattern caused by a wave source at the bottom left corner and an absorbing half wall at the centre. The linear interpolations (ii) are not natural at every connecting

point. However, their overall impressions are natural and near to the original fields (i). Using the bilinear interpolation, the lines are smoothly curved in each rectangle (iii); nevertheless, the connections at the edges of the element rectangles are not smooth. The sinc interpolation (iv) gives smooth and natural images. However, it requires a much longer computation time than linear interpolation, so that sinc interpolation cannot be used in real-time visualization without a high-speed FFT processor. Computation time is an important factor for real-time systems. The typical interpolating ratio (display points/data points) of our system is (197 × 176)/(8 × 8). A calculation can be done in about 140 ms per frame. The total processing time for one frame is 167 ms, including data acquisition and data transfer, in the case of linear interpolation. A frame rate of about six per second can be achieved.
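The equivalence between sinc interpolation and band expansion by zero padding can be sketched as follows. Strictly, the FFT version interpolates with a periodic (Dirichlet-kernel) variant of the pure sin x/x of eqn (3), so results differ near the array edges, consistent with the edge errors noted in the text; the sketch below only illustrates the computational idea:

```python
import numpy as np

def fft_upsample(d, factor):
    """Band expansion by zero-padding the 2-D spectrum.  This realizes a
    periodic approximation of sinc interpolation; finite-array edge
    effects make it differ from eqn (3) near the borders."""
    n, m = d.shape
    D = np.fft.fftshift(np.fft.fft2(d))      # centre the zero frequency
    N, M = n * factor, m * factor
    Z = np.zeros((N, M), dtype=complex)
    r0, c0 = (N - n) // 2, (M - m) // 2
    Z[r0:r0 + n, c0:c0 + m] = D              # add zeros as higher frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(Z))) * factor ** 2

samples = np.random.rand(8, 8)    # stand-in for the 8 x 8 microphone pressures
fine = fft_upsample(samples, 8)   # 64 x 64 interpolated field
```

At the original microphone positions the upsampled field reproduces the measured samples exactly, which is the defining property of this interpolation.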

Experimental results of visualization of sound wavefronts

Figures 11 and 12 are photos of the CRT image. In Fig. 11(a), a sinusoidal wave of 720 Hz is


Fig. 12. Image of sound propagation from a man, overlaid on the video picture of the scene. Wavefronts are calculated by sinc interpolation.

emitted from a loudspeaker at the bottom left corner. The measurement was done in a room lightly damped by thick fabrics on the walls; therefore, the field is different from an ideal field. The CRT display shows a clear image of the field: the spacing of the lines is approximately 23 cm, which corresponds to half of the wavelength. The pattern generally shows wavefronts spreading from the source at the bottom left corner. The disorder in the pattern at the upper right corner indicates the influence of an echo from the wall and ceiling of the room. Figure 11(b) shows another example of a similar situation with a 500 Hz sound, in which the positive sound-pressure areas are painted. Figure 11(c) shows sound diffraction due to an absorbing half wall in the centre. Figure 12 shows the case of a human voice, with circular wavefronts originating at the mouth. As the human voice can be assumed to be periodic for a short time, at least for several seconds, we can see the movement of the speech sound on the CRT. In Fig. 12 the lines come from the sinc interpolation. This image was actually calculated off-line (about 10 min by our computer). Such smooth lines will be preferred in examinations with still pictures.

Fig. 11. (a) Image of wavefronts of 720 Hz emitted from the loudspeaker at the bottom left corner. (b) Image by a different display mode; positive pressure areas are painted. (c) Image of sound diffraction caused by an absorbing half wall in the centre.

Future development

In the future, a fully multi-dimensional sensor system for gas components will be constructed using sensors that can identify gas species. Distribution patterns of the different species will be displayed in different colours. Another sophisticated sensor system is also expected, in which a gas source outside the sensor plane can be determined by using an adaptive diffusion model and temporal information from the sensors.

Conclusions

Multi-dimensional intelligent sensing systems have been described. A sensing and visualization system for spatial gas distribution using semiconductor gas sensors has been discussed. The gas distribution can be effectively visualized by using a two-dimensional sensor array system. An acoustic field is also multi-dimensional. Visualization of a moving sound wavefront has also been achieved, utilizing a computer interpolation based on the wave-propagation model and a two-dimensional microphone array; the moving speed can be artificially adjusted. The intelligence of these systems and the human visual system give us the global structure of multi-dimensional phenomena by organizing local information from point-like multi-sensors with the help of video images. Both sensing systems may be useful for understanding a multi-dimensional dynamic state and may act as a new intelligent man-machine interface.

Acknowledgements

The authors would like to express their thanks to Mr M. Some, Mr M. Mishima, Mr O. Nishu and Mr T. Genma for their help in this work.

References

1 Y. Hiranaka, M. Some, M. Mishima and H. Yamasaki, Visualizing system of spatial gas distribution, Proc. 26th SICE Ann. Conf., SICE '87, Hiroshima, Japan, July 1987, pp. 617-618 (in Japanese).
2 H. Yamasaki and Y. Hiranaka, Multi-dimensional intelligent sensing system, presented at the 1989 Int. Chemical Congr. of Pacific Basin Societies, Honolulu, HI, USA, Dec. 17-22, 1989.
3 Y. Hiranaka, O. Nishu, T. Genma and H. Yamasaki, Real-time visualization of acoustic wavefronts by using a two-dimensional microphone array, J. Acoust. Soc. Am., 84 (1988) 1373-1377.
4 Y. Hiranaka and H. Yamasaki, Field visualization using a sensor array, Trans. SICE, 28 (1) (1992) 166-167 (in Japanese).

Biographies

Hiro Yamasaki was born in Tokyo in 1932. He received a B.E. and his doctorate from the University of Tokyo in 1956 and 1972, respectively. He worked for Yokogawa Electric Corp. as a research physicist and R&D manager from 1956 to 1974, and was a lecturer at the University of Tokyo from 1972 to 1974. Since 1975 he has been a professor in the Department of Mathematical Engineering and Information Physics, Faculty of Engineering, University of Tokyo. He was president of the Society of Instrument and Control Engineers of Japan (SICE) from 1989 to 1990. He is a member of the IEEE, SICE and IEEJ. Dr Yamasaki was a program co-chairman for Transducers '89. His research interests are intelligent sensors and sensing systems.

Yukio Hiranaka received B.E., M.E. and D.Eng. degrees, all in instrumentation physics, from the University of Tokyo, in 1976, 1978 and 1988, respectively. Since 1989 he has been at Yamagata University. His research interests include field visualization, intensive information extraction from sensors and computer network applications.