COMPUTER VISION, GRAPHICS, AND IMAGE PROCESSING 21, 239-261 (1983)

An Approach to the Segmentation of Textured Dynamic Scenes

S. N. JAYARAMAMURTHY AND RAMESH JAIN*

Intelligent Systems Laboratory, Department of Computer Science, Wayne State University, Detroit, Michigan 48202

Received March 8, 1982

An approach to the segmentation of dynamic scenes containing textured objects moving against a textured background is presented. This multistage approach first uses differencing to obtain active regions in the frame which contain moving objects. In the next stage, a Hough transform technique is used to determine the motion parameters associated with each active region. Finally, the intensity changes and the motion parameters are combined to obtain the masks of the moving objects. Our experiments illustrate the efficacy of the approach for moving textured objects even in the presence of occlusion. An indicator that signals the presence of the rotational component of the object motion can also be extracted.

1. INTRODUCTION

Fast and reliable motion detection is an important function of any vision system. In a biological vision system, it is vital to the survival of both the individual organism and the species [8]. Computer vision systems having this capability can find innumerable applications in industrial, medical, and military environments. In recent years, many prominent researchers [1, 5, 7, 9-11, 14-17, 19-21, 23] have directed their efforts to developing schemes for motion analysis for implementation in computer vision systems. Because of their efforts, a fast-growing discipline, dynamic scene analysis, has emerged in the field of computer vision. Texture is ubiquitous in real world scenes. To date, however, it has been ignored by many methods developed for motion analysis. Some motion analysis techniques require prior segmentation of images [1], which in the presence of texture becomes expensive, difficult, and even impossible in some cases. Sharp edges commonly found in textured scenes may upset velocity estimation schemes [7]. Techniques using edge features for matching [21] may find too many edge points, while difference picture based approaches may not find a compact region either to initiate the growth of a region [10] or to classify it [11]. In this paper, we present an approach to recovering the masks of objects moving in textured environments. Our method, referred to as the "shift-match method," does not require prior segmentation of frames. It comprises three stages. The first stage is based on difference picture analysis and is primarily used for motion detection. We extract active regions containing moving objects. In the second stage, the direction of planar motion associated with each active region is extracted using a more global technique such as a Hough transform. The third stage is the segmentation process, which combines the information obtained from the earlier stages to extract the masks of moving objects.

*Present Address: Department of Electrical and Computer Engineering, University of Michigan, Ann Arbor, Michigan 48109.


Our experiments demonstrate the success of this method even in cases where several textured objects are moving against a textured background in different directions, often occluding each other. In some cases static segmentation of individual frames is impossible even for the human visual system. In the following sections, we describe our shift-match method and demonstrate the technique with a number of examples. The strengths and weaknesses of the approach are discussed in the last section.

2. PROBLEM DESCRIPTION

2.1. Input

A dynamic scene is presented to the system in the form of a registered frame sequence. Each frame is a two-dimensional array of integers representing the digital version of the continuous images obtained by the sensor. Such a frame sequence is represented as a function F(x, y, t), which gives the intensity at point (x, y) in the frame sampled at time t. Since we are considering a sequence of digitized images, x, y, and t are integers representing the column, row, and frame numbers, respectively.

2.2. Output

The system analyzes this input. Using motion as a cue, it segments each individual frame into a stationary background and masks of moving objects. Any component of the scene that does not exhibit motion will be classified as background. Further segmentation of the stationary component is beyond the scope of this method. For the moving objects, the method gives the translational displacement between two consecutive frames and some idea about the presence of a rotational component.

2.3. Assumptions

(1) A single, stationary camera is used as the sensor, and the lighting conditions are held fairly constant. These assumptions ensure that any significant change in intensity values will be due to the motion of objects only.

(2) Definition of a moving object: As the form and the size of the moving object are usually unknown, we adopt the following notion of an object, which is due to Nagel [16]. A group of connected regions which are jointly displaced in a systematic way from frame to frame without changing their relative positions is considered to represent the image of a moving object.

(3) The maximum interframe displacement of any object is limited. This assumption rests on the fact that the approach is based on models for short-range motion detection [2, 3]. By controlling the sampling time interval one can easily satisfy this condition.

(4) The objects in motion are rigid and the motion component is predominantly translational. It is assumed that when rigid objects are moving in 3-D space, the effects of rotation and z-axis translation on the object images are much less than that of planar translation. The interpixel relationships on a rigid body remain invariant during translational motion. This property is exploited by the system. Later, this condition is relaxed to include planar rotation also.

Assumptions 3 and 4 are required to constrain the parameter space. It is to be noted that many researchers in dynamic scene analysis have either explicitly or implicitly used some or all of the assumptions stated above [1, 7, 9-11, 16, 19, 20, 22, 23].


3. DESCRIPTION OF THE SHIFT-MATCH METHOD

We operate in the following three stages on a pair of consecutive frames. For the sake of convenience, we shall refer to the frames as Fc, the current frame at time t = k, [F(x, y, k)], and Fp, the previous frame, [F(x, y, k - 1)]. Every frame in the sequence, except the first and last, will be processed first as a current frame and later as a previous frame.

3.1. Extraction of Active Regions

This stage is concerned with motion detection and is similar to the one used in [11]. We obtain a binary difference picture, D(x, y, k), by comparing the current frame F(x, y, k) with the previous frame F(x, y, k - 1):

D(x, y, k) = 1,  if and only if |F(x, y, k - 1) - F(x, y, k)| > THR1,
           = 0,  otherwise,

where THR1 is a preset threshold value, usually 10% of the peak intensity value found in the frames. As per our original assumption of constant illumination, 1-entries in the difference picture are due to object motion. Connected regions of 1-pixels contain partial images of moving objects. There may be small holes present in the connected regions due to noise. We fill these holes by expanding and contracting the difference picture by a unit distance. Next, we filter out small regions which may be due to either noise or very slow motion [11]. This task is effected by eliminating all 4-connected components of 1-pixels of size less than another preset threshold (THR2). The surviving 4-connected components are called active regions (AR) and the binary picture containing only active regions is called the AR picture. This stage is implemented by the sequence of operations depicted in Fig. 1. This stage may be considered as accomplishing a crude segmentation, with the active regions representing the nonstationary components of the scene. The only difference between this stage and the motion detection phase in [11] is in the filling of holes. The motivation for filling holes so early in our approach becomes clear after we discuss subsequent stages.
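For concreteness, a minimal sketch of this stage in NumPy/SciPy notation is given below. The function name, the 8-bit integer frames, and the default THR2 value are our own choices rather than details fixed by the original system; the 10% rule for THR1 follows the description above.

```python
import numpy as np
from scipy import ndimage

def active_regions(f_prev, f_curr, thr1=None, thr2=10):
    """Stage 1 sketch: difference picture -> fill holes -> region filter."""
    f_prev = f_prev.astype(np.int32)
    f_curr = f_curr.astype(np.int32)
    if thr1 is None:
        # THR1 is "usually 10% of the peak intensity value found in the frames".
        thr1 = 0.10 * max(f_prev.max(), f_curr.max())
    d = np.abs(f_prev - f_curr) > thr1          # binary difference picture D

    # Fill small holes by expanding and contracting by a unit distance.
    d = ndimage.binary_erosion(ndimage.binary_dilation(d))

    # Keep only 4-connected components of 1-pixels with at least thr2 members.
    four_conn = ndimage.generate_binary_structure(2, 1)
    labels, n = ndimage.label(d, structure=four_conn)
    sizes = ndimage.sum(d, labels, index=range(1, n + 1))
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = sizes >= thr2
    return keep[labels]                         # AR picture (binary)
```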

3.2. Stage 2: Estimation of Direction of Motion

The main difference between motion and other forms of spatio-temporal modulation is the presence of directional components. In this stage, we estimate the directional component associated with each active region by using a Hough transform technique.

FIG. 1. Extraction of active region picture [Fp, Fc; difference picture with THR1; fill hole; region filter with THR2].


The feasibility of such an approach has been demonstrated in our earlier papers [12, 13]. Jain et al. [10] do not attempt to compute the direction of motion. It is at this point that we significantly differ from their difference picture based approach for dynamic scene segmentation. Hough transform techniques have been used by several authors to estimate various parameters of interest in scene analysis [4, 6, 17]. Fennema and Thompson have used it to compute velocity information [7]. They have extended Limb and Murphy's method [14] in which the velocity of a point (ds/dt) was related to the time variation of intensity (di/dt) and the spatial gradient (di/ds) at that point:

di/dt = (di/ds) * (ds/dt).

The time variation of intensity (di/dt) for each point can be measured by comparing consecutive frames, while the spatial gradient (di/ds) can be estimated with the Sobel operator in each frame. If, for any point, the time variation of intensity is significant (i.e., |di/dt| > a1), while the spatial gradient remains fairly constant over the time duration between frames (i.e., | |di/ds| at (k - 1) - |di/ds| at (k) | < a2), the above relation places constraints over the possible values of velocities, i.e., ds/dt, that can be assumed by the point. For each such observation point, all possible velocity values are computed and used to increment an array of accumulators (initially set at 0) that are indexed by discrete values of parameters Vx and Vy. Well-defined peaks in the accumulator array correspond to velocities of moving objects. This method works well provided there is a sufficient number of observation points satisfying the above criteria. Usually, the image function is smoothed using blurring to increase the number of such points [7]. This operation reduces noise and tends to smooth rough edges. Blurring causes loss of resolution, however, and velocities of individual points are no longer computed. This may cause problems when a textured object is moving against a textured background. Also, to be effective, the amount of blurring should depend on the contents of the frames and the maximum expected velocity. We eliminate these restrictions by approaching the problem from a different point of view. Let (x, y) be an observation point on a surface that is moving with a velocity (Vx, Vy). Assuming that the frames are sampled at every unit time interval, we obtain

F(x, y, k) = F(x - Vx, y - Vy, k - 1).    (1)

To state it another way, if a point (x, y) in frame k is to be traveling with velocity (Vx, Vy) between frames, it is necessary for it to satisfy the condition

|F(x, y, k) - F(x - Vx, y - Vy, k - 1)| < ε,    (2)

where ε is a preset threshold and is required to account for quantization and other types of noise. Using this condition, we can estimate the most likely velocity of a moving surface. We compute possible velocity components by creating an array of accumulators which are indexed by discrete values of Vx and Vy.


The accumulator cells are initialized to zero. For each observation point (x, y) in frame k, we find all possible velocities (Vx, Vy) which satisfy Eq. (2). The corresponding accumulator cells are incremented. If several points are moving with the same velocity, they contribute to a prominent peak in the accumulator array. We have to exercise some caution before using this approach. Although condition (2) is necessary, it is not sufficient. If the observation point were to be in the middle of a sufficiently large and uniform surface, it could have entries over the entire parameter space. For this reason, we consider as observation points only those points which have exhibited some motion. Points from active regions satisfy this condition. This approach may be extended to deal with multiple moving surfaces by using the multipass procedure suggested by Fennema and Thompson [7]. It is, however, a time consuming process to restart the counting process after each pass. We avoid this problem by creating a separate accumulator array for each active region. Since an active region has usually resulted from the motion of a single object, the position of the peak in the corresponding accumulator array gives the velocity estimate of that object. If a single active region is due to multiple moving objects, there will be multiple peaks in the accumulator array. This situation will require multiple passes for the extraction of masks for individual objects. It is possible that more than one active region may result from the motion of a single object [11, 23]. In such a case we shall set up as many accumulator arrays as the number of active regions that resulted from the motion of that object. The rigidity assumption ensures that velocity estimates from each one of the arrays for an object will be identical, and this fact will be taken into consideration while developing the mask for the object.

Implementation of Stage 2

The discrete values of Vx and Vy are represented by the integer parameters (i, j). The parameter space is restricted by choosing a maximum value for velocity. For a given set of (i, j) parameters, the previous frame is shifted by that amount to obtain Fp_ij, the shifted previous frame [F(x - i, y - j, k - 1)]. A difference picture is obtained by comparing Fp_ij with the current frame Fc. This binary picture is complemented; it contains 1-entries at all points which satisfy Eq. (2). We now filter out 4-connected components of 1-pixels with membership less than THR2 (10 in our case) to eliminate regions that are too small for consideration as images of objects. The resulting picture is called the shift-match picture for the (i, j) parameters, SM_ij (Fig. 2).

FIG. 2. Extraction of shift-match picture [(i, j) shift parameters; Fp, Fc; difference picture with THR1; inverter; fill hole; region filter with THR2; output SM_ij].
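As a concrete illustration, a minimal sketch of the shift-match picture for a single (i, j) shift is given below, in the same NumPy/SciPy style as before. The function name and the default value of the Eq. (2) threshold eps are our own assumptions, not values taken from the paper.

```python
import numpy as np
from scipy import ndimage

def shift_match_picture(f_prev, f_curr, i, j, eps=5, thr2=10):
    """Sketch of SM_ij: shift Fp by (i, j), keep points satisfying Eq. (2),
    then drop 4-connected components smaller than thr2."""
    h, w = f_curr.shape
    shifted = np.zeros_like(f_prev)
    # shifted[y, x] = F(x - i, y - j, k - 1) wherever that index is valid.
    shifted[max(j, 0):h + min(j, 0), max(i, 0):w + min(i, 0)] = \
        f_prev[max(-j, 0):h - max(j, 0), max(-i, 0):w - max(i, 0)]
    sm = np.abs(f_curr.astype(np.int32) - shifted.astype(np.int32)) < eps

    # Region filter: keep 4-connected components with >= thr2 members.
    four_conn = ndimage.generate_binary_structure(2, 1)
    labels, n = ndimage.label(sm, structure=four_conn)
    sizes = ndimage.sum(sm, labels, index=range(1, n + 1))
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = sizes >= thr2
    return keep[labels]
```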


It can be seen that any reasonably large (> THR2) set of points which are 4-connected, and are jointly displaced by (i, j) units from their positions in the previous frame, would contribute 1-entries to SM_ij in locations corresponding to their new positions in the current frame. A minor problem is that a stationary region may also contribute 1-entries to SM_ij if it is sufficiently large. It is possible to filter out such contributions by using a method discussed in the next section.

Match-Index Matrices

A separate two-dimensional array is established for each active region in the AR picture. We compute the entries for the (i, j)th cell in each array by processing SM_ij together with the AR picture. This operation filters out the contribution of stationary regions to SM_ij before filling the arrays. Let R_l be an active region in AR and n_l be the number of points in R_l. We count the number of 1-entries in SM_ij which occur at all points corresponding to those in region R_l. We normalize this count by expressing it as a percentage of n_l in order to make it independent of the size of R_l. This normalized count is entered in the (i, j)th cell of the lth array. This entry is called a match index and the arrays are called match-index matrices (MIM):

MIM_l(i, j) = (100/n_l) * Σ_{(x, y) ∈ R_l} SM_ij(x, y).

If the entire active region R_l is the result of the motion of a single object, the maximum value of the match index will be close to 100 and will occur at a displacement equal to that of the object. For various reasons mentioned in the next section, the maximum value of the match index may be much less than 100. The parameter values (i, j) are varied over the entire parameter space and all the match-index matrices are computed.

Direction of Motion

After all MIMs are computed, we search for the maximum value of the match index and its location in each MIM. The location gives the estimate of the directional component and the maximum value of the match index gives the relative confidence factor associated with the directional component for the selected active region. For example, if the maximum value of the match index found in MIM_l is fairly high (say > 90), the corresponding (i, j) value marking the location of the maximum gives estimates of the translational components of the velocity, Vx and Vy, respectively, of the moving surface that caused the lth active region. The presence of a well-defined maximum indicates an unequivocal determination of Vx and Vy. The maximum value of the match index in an MIM may not be very high in many cases. A low value may be attributed to one or more of the following conditions: (1) inadequate parameter space, (2) occlusion, (3) rotation, (4) motion in depth, noise, etc.
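A sketch of how the match-index matrix for one active region might be filled and searched is given below. It reuses the shift_match_picture sketch above; the bound vmax on the interframe displacement is an assumed parameter (following assumption 3) rather than a value from the paper.

```python
import numpy as np

def direction_of_motion(f_prev, f_curr, region_mask, vmax=5, eps=5, thr2=10):
    """Fill MIM_l for one active region R_l and return the location of its
    peak (the estimated (Vx, Vy)) together with the peak match index."""
    n_l = region_mask.sum()                     # number of points in R_l
    shifts = list(range(-vmax, vmax + 1))
    mim = np.zeros((len(shifts), len(shifts)))
    for a, i in enumerate(shifts):              # i ~ Vx (column shift)
        for b, j in enumerate(shifts):          # j ~ Vy (row shift)
            sm = shift_match_picture(f_prev, f_curr, i, j, eps, thr2)
            # Match index: percentage of R_l points that are 1 in SM_ij.
            mim[a, b] = 100.0 * sm[region_mask].sum() / n_l
    a, b = np.unravel_index(np.argmax(mim), mim.shape)
    return (shifts[a], shifts[b]), mim[a, b], mim
```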


Inadequate Parameter Space

If the maximum occurs near the rim of the restricted space and if it is not high, our estimate of the maximum value of velocity of any moving surface may be in error and we need to choose a higher value for it, thereby enlarging the parameter space. It appears that this problem may be solved if we use adaptive methods [17] in a relatively large space. We have not explored this possibility.

Occlusion

It is possible that an active region has resulted from the motion of at least one moving surface which may be occluding one or more moving surfaces. In such cases, the position of the maximum gives the directional component of the surface that is contributing the largest area to the active region, and the maximum value of the match index corresponds to the percentage of such contribution. In such cases, we split the active region by returning to the image space and isolating all the points that contribute to the dominant peak in the parameter plane. If the remaining points in the active region constitute a large enough connected region for further consideration, we create a new accumulator array for each chunk and repeat this process. In some cases, secondary peaks may be easily detected in the original MIM, provided they are not masked by the dominant peaks. If so, by returning to the image space, we can identify the subregions of the active region that contributed to each secondary peak. The position of each secondary peak gives the directional component of the subregions. The results in Section 4 will illustrate the efficacy of our approach in the presence of occlusion.

Rotation

Thus far we have considered translational motion only. If the motion contains rotational components (planar as well as nonplanar), we may not be able to obtain a high degree of match with a shifted previous frame, as indicated by the modest values of the maximum. Further, in general, the peak will not be sharp since we expect near-equal matches for the nearby shifts. In some cases, we might even obtain a plateau or a ridge over which the maximum value is nearly invariant, rather than a pinnacle. The exact shape of the plateau around the maximum value depends on the shape of the object, the amount of rotation, and the gray level characteristics on the object surface. We can obtain some idea about the presence of a rotational component in object motion by determining τ, the degree of translational mismatch. The percentage of points in the active region which do not move with the estimated translational component is measured by τ and is easily obtained by subtracting the maximum value of the match index from 100. Using Fp as the reference frame, we compare it with a few consecutive frames beyond the current frame Fc and compute τ for the corresponding active regions. If we assume smooth and continuous motion, the τ value remains fairly constant over a short time for purely translational motion. But it gradually increases with time if the motion contains a rotational component. We obtain a τ plot when we suspect rotation, and the gradual increase of τ with time signals the presence of a rotational component. We present some results in Section 4 illustrating this effect.
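A small sketch of this indicator, under the same assumptions and helper names as above (frames, ref_index, and region_masks are hypothetical inputs of our own choosing):

```python
def tau_plot(frames, ref_index, region_masks, vmax=5):
    """Degree of translational mismatch tau = 100 - (max match index),
    computed with the fixed reference frame Fp against a few later frames.
    A roughly constant curve suggests pure translation; a steadily rising
    curve signals a rotational component."""
    f_ref = frames[ref_index]
    taus = []
    for f_k, mask in zip(frames[ref_index + 1:], region_masks):
        _, best_match, _ = direction_of_motion(f_ref, f_k, mask, vmax=vmax)
        taus.append(100.0 - best_match)
    return taus
```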


The position of the maximum (if it is well defined) or the position of the center of gravity of the area over which the match index remains nearly maximum may be used as an estimate of the translational component of the motion. If we isolate this component, we can determine the rotational component using a rotate-match procedure which works along the same lines as the shift-match technique. The previous frame is shifted by the estimated amount of the translational component and rotated in discrete steps of θ to obtain a match with the current frame. The peak in the parameter space gives an estimate of the rotational component. We are currently working to implement this scheme.
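Since the rotate-match procedure is described here as not yet implemented, the following is purely an illustrative sketch of the idea rather than the authors' method: shift the previous frame by the estimated translation, rotate it in discrete steps, and score the match inside the active region. Rotation is taken about the image center for simplicity, which is an assumption of the sketch.

```python
import numpy as np
from scipy import ndimage

def rotate_match(f_prev, f_curr, region_mask, trans, thetas, eps=5):
    """Illustrative rotate-match sketch: shift Fp by the estimated translation,
    rotate it in discrete steps of theta, and score the match inside R_l."""
    i, j = trans                                          # estimated (Vx, Vy)
    shifted = ndimage.shift(f_prev.astype(float), (j, i), order=0)  # (rows, cols)
    scores = []
    for theta in thetas:                                  # angles in degrees
        rotated = ndimage.rotate(shifted, theta, reshape=False, order=1)
        match = np.abs(f_curr.astype(float) - rotated) < eps
        scores.append(100.0 * match[region_mask].sum() / region_mask.sum())
    best = int(np.argmax(scores))
    return thetas[best], scores[best]
```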

Motion in Depth, Noise, etc.

Sometimes the largest value in the MIM is too low. This condition can occur in the presence of noise. It may also occur if some or all of the assumptions are violated, e.g., there is a change in the lighting conditions, movement by the camera, or motion in depth. In some cases, if the object surface contains texture, for certain values of rotation we may not obtain any entries in the SM_ij picture for any shifts. In all such cases, the largest value in the MIM may be too low, and we shall not be able to determine the directional component for the active region.

3.3. Stage 3: Segmentation

As mentioned in the preceding section, the AR picture obtained in Stage 1 using the differencing method gives a crude segmentation. In Stage 2, we obtained the directional component associated with each active region using the shift-match method. In the third stage, we combine the information provided by both stages to obtain a more refined segmentation. Let R_l be the lth active region in AR, (il, jl) the directional component associated with R_l, and SM_{il,jl} the shift-match picture for the (il, jl) shift. We construct an image S_l, which will be used as a guide to develop a mask for the lth object whose motion has contributed to R_l. We assign to all points in S_l which have 1-entries in the corresponding location in SM_{il,jl}: (a) label 3 if they are inside region R_l and (b) label 1 if they are outside of R_l. Among the remaining points, we assign a label * to those which have 1-entries in corresponding locations in the AR picture. The remaining points are given label 0.

S_l(x, y) = 3,  if SM_{il,jl}(x, y) = 1 and (x, y) ∈ R_l,
          = 1,  if SM_{il,jl}(x, y) = 1 and (x, y) ∉ R_l,
          = *,  if SM_{il,jl}(x, y) = 0 and AR(x, y) = 1,
          = 0,  otherwise.

We iteratively propagate the label 2 as follows:

S_l(x, y) = 2,  if (S_l(x, y) = 1) and (S_l(x + il, y + jl) = 3 or 2).

Any 1-pixel in S_l having a 3-pixel or a 2-pixel as its (il, jl)th neighbor is converted to a 2-pixel.
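A sketch of this labeling and propagation step is given below; the integer label codes and names are chosen by us, and the value STAR stands in for the paper's '*' label.

```python
import numpy as np

STAR = 4   # stands in for the paper's '*' label

def build_s_image(sm_il_jl, ar, region_mask, i_l, j_l):
    """Assign labels 3/1/*/0, then iteratively convert 1-pixels to 2-pixels
    whenever their (i_l, j_l)th neighbour already carries label 3 or 2."""
    s = np.zeros(sm_il_jl.shape, dtype=np.int8)
    s[(sm_il_jl == 1) & region_mask] = 3
    s[(sm_il_jl == 1) & ~region_mask] = 1
    s[(sm_il_jl == 0) & (ar == 1)] = STAR

    h, w = s.shape
    changed = True
    while changed:
        changed = False
        for y, x in zip(*np.nonzero(s == 1)):
            ny, nx = y + j_l, x + i_l           # the (i_l, j_l)th neighbour
            if 0 <= ny < h and 0 <= nx < w and s[ny, nx] in (3, 2):
                s[y, x] = 2
                changed = True
    return s
```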


The 3-pixels in S_l satisfy the following conditions:

(a) They form a connected component.

(b) They provide evidence that they might have been jointly displaced by (il, jl) units while retaining their interspatial relationships intact (because they appeared in SM_{il,jl}).

(c) They exhibited motion (because they appeared in R_l).

Because of our definition of moving object, a 3-pixel cluster can be considered as a strong candidate for the mask of a moving object. Although 2-pixels satisfy conditions (a) and (b), they fail to satisfy (c). In addition, we convert to 2-pixels all those 1-pixels which are thought to be in the "path of motion" of the 3-pixels (assuming uniform velocity for a moving surface). The 2-pixels are likely to occur in the overlapping region of a moving surface. The surface gray level characteristics are assumed to be such that they were cancelled in the difference picture in the overlapping region. 1-Pixels also satisfy conditions (a) and (b), but not (c). According to our definition of a moving object, 1-pixels also qualify to be considered as an image of the object; however, the evidence is weaker than that for 2-pixels. The *-pixels usually correspond to points from other active regions. The 0-pixels belong to the stationary background. We place 0-entries in these locations when we develop a mask for the moving object. From the varying degrees of evidence we have gathered so far, we can say that the image of the object can be no smaller than the 3-pixel cluster and no larger than the nonzero pixel cluster in S. Note, however, that this conclusion is reached strictly using the motion information obtained in Stages 1 and 2. We have not used any knowledge about the gray level characteristics of the surface of the moving object, its size or shape, etc. If we are provided with additional information about the object, we can use the S image as a guide to develop the object mask. For example, suppose we are given that the moving object surface contains either a uniform gray level or uniform texture. We can then dynamically extract these characteristics in the current frame in the locations corresponding to the 3-pixel clusters in S. We can grow the object image along the 2-pixel region and/or the 1-pixel region provided they contain the same gray level characteristics as those in the 3-pixel region.

4. RESULTS

The following sequences of dynamic scenes have been used to illustrate the shift-match method:

(1) RAND sequence: In this sequence, there are two objects in the foreground which are in motion. The objects as well as the background contain random texture. The rectangular object is closer to the observer and the circular object is behind it. They move towards each other and the circular object gets occluded. It is difficult even for a human observer to detect the boundaries of the objects just from the static analysis of individual frames because the distribution of gray values on the object surfaces and on the background is very similar (Fig. 3a).

(2) TEX sequence: The background in this sequence also contains random texture. A rectangular object with a regular surface texture (grid) is moving in the foreground.


FIG. 3. Test sequences (only 2 frames are shown): (a) RAND sequence, (b) PIC sequence, and (c) TAXI sequence.


(3) PIC sequence: A connecting rod hanging on a hook and moving to the right over some stationary clutter is depicted in this sequence. The background is mostly empty and homogeneous. The rod is darker than the background and is also homogeneous. The motion is translational in a plane (Fig. 3b).

(4) ROT sequence: A rectangular object is moving in the foreground. The object motion depicted here contains translational as well as rotational components. The amount of rotation gradually increases with time.

(5) TAXI sequence: A traffic scene from downtown Hamburg is shown in this sequence. We have selected a window around a moving taxi which is turning into a traffic intersection. The motion contains rotational components (Fig. 3c).

4.1. Translation

The shift-match method is first tested with sequences 1-3, which depict purely translational motion. The results of the method when applied to pairs of consecutive frames from the RAND sequence are shown in Fig. 4. The active region (AR) picture shows good segmentation when the objects are well separated (Fig. 4a(i)). The match-index matrix (MIM) for each active region is shown in the form of a 3-D graph (Fig. 4b). In subsequent frames the objects move closer together and the corresponding active regions merge into one. Two distinct peaks can be seen in the MIM graphs. We can notice the gradual change in the value of the match index at the position of the peaks as the smaller object gets more and more occluded. The results of segmentation for frame pair 3 and 4 are shown in Fig. 5. The S images (Fig. 5a) are obtained using the directional information provided by the position of each distinct peak in the MIM (Fig. 4b(iii)). The cluster of 3-pixels in the corresponding S image forms the core of the mask for each object. The boundary points of the masks are marked in frame 4 (Fig. 5b). The segmentation results are quite reasonable considering they are obtained under adverse conditions such as occlusion in a textured environment. The image of the "circular" object appears oval in Figs. 4 and 5 because of a printer artifact. A very important point to be noticed in this example is that the moving object masks are extracted without the help of any other knowledge source. This is possible when the objects and the background are textured or at least nonhomogeneous. For objects having surfaces of uniform gray level or moving against a uniform background (as is the case in the following examples) the extraction of masks for moving objects requires additional knowledge sources. The results of the analysis of the TEX sequence are shown in Fig. 6. The MIM graph (Fig. 6c) displays several sharp peaks, and their spatial periodicity gives us a clue as to the nature of the surface texture. The match-index values at the peaks add up to a number larger than 100, ruling out occlusion as the cause of the distinct peaks. The position of the maximum peak gives the directional component of motion, which is used in developing the S image (Fig. 6d). The object mask is developed from the 3-pixel cluster and its edges are shown as dark points in the current frame (Fig. 6e). When the motion parameters match integral multiples of the spatial periodicity of the regular surface texture, the overlapping region of the object image in Fp and Fc disappears from the difference picture and the AR picture contains two regions, R1 and R2 (Fig. 7a). This situation also occurs at all values of the motion parameters when the object surface is homogeneous.

FIG. 4. Results with RAND sequence: (a) active region (AR) pictures, (b) match-index matrices (MIM) for frame pairs (i) 1 and 2, (ii) 2 and 3, (iii) 3 and 4, (iv) 4 and 5, (v) 5 and 6.


The MIM graphs for both regions are shown in Fig. 7b. The region R2 has resulted when the moving object uncovered the background. No matches were found in this region for any shift, as is evident from its MIM, because the background contains a random texture. We cannot obtain the directional component for this region. The position of the maximum peak in the corresponding MIM graph gives the directional component for R1, which is used in developing the S image (Fig. 7c). The mask developed from the 3-pixel cluster does not contain the object image in the overlapping region (Fig. 7d). The mask that is obtained from the 3- and 2-pixel clusters, however, is able to represent the complete object image (Fig. 7e). The difference picture for frame pair 1 and 2 from the TEX sequence is shown in Fig. 6a. It can be seen that the difference picture does not contain a compact region to initiate the growth of the image of the object by the methods suggested in [10]. The need for filling the difference picture to obtain the AR picture when dealing with textured scenes is evident from this example.

FIG. 5. Segmentation results for frame pair 3 and 4 from RAND sequence: (a) S_1 and S_2 images, (b) frame 4 with object boundaries marked.

FIG. 6. Results with TEX sequence (for frame pair 1 and 2): (a) difference picture, (b) AR picture, (c) MIM graph, (d) S image, and (e) frame 2 with object boundary marked.

The AR picture obtained after filling the difference picture is shown in Fig. 6b. Next, a pair of frames from the PIC sequence is selected for analysis and the results are shown in Fig. 8. A single dominant peak is observed in the MIM and the corresponding S image is computed (Figs. 8a and b). It is possible to come up with several candidates for the image of a "moving object" [18], all of which can be developed from the S image. Any mask that is developed with the 3-pixel cluster as a core, and to which any number of 2- or 1-pixels are added while preserving connectedness, may be considered as a candidate for the image of the "moving object." We show two masks extracted from the S image which used as a core (a) the 3-pixel cluster and (b) the 3- and 2-pixel clusters (Fig. 8d). The first mask interprets the entire object image including the "hole," along with the part of the background occupied by the object in the previous frame, as a "moving object."


The first mask interprets the entire object image, including the "hole," along with the part of the background occupied by the object in the previous frame, as a "moving object." The second mask adds more background to the image of the "moving object." If the core includes the 1-pixel cluster also, then the entire frame may be seen as being jointly displaced, except for the static regions indicated by the 0-pixels in the S image. We can justify the multitude of candidates in this case as follows: When an object with uniform gray level is moving against a homogeneous background (as is the case in the PIC sequence), it is hard to determine the presence or absence of relative motion between the object and any part of the background just by observing a pair of frames. That is to say, it is difficult to determine whether or not any part of the background is moving with the object at the same velocity. We readily agree that the "hole" in the connecting rod is to be considered a part of the moving object and should be included in the mask. Using motion as a cue, we cannot, however, say with confidence whether the "hole" is moving with the object, whether the background seen through the "hole" is in motion, or whether the object surface contains a region with gray level characteristics similar to those of the background. This ambiguity cannot be resolved without additional information, and this is precisely the situation reflected in our results. We require additional knowledge that helps us choose one mask as a more likely candidate than the others. Otherwise, we retain the complete information obtained from this analysis and proceed to consider subsequent frames. We hope that the number of choices may be reduced with further analysis.
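The growth rule for the candidate masks can be summarized procedurally. The sketch below (in Python, which is not used in the paper) grows one candidate mask outward from a chosen core label set, adding only connected pixels whose labels belong to an allowed set; the coding of the S-image labels (0 through 3) follows the convention described earlier in the paper, and the routine is only an illustration of the growth rule, not the authors' implementation.

```python
import numpy as np
from collections import deque

def grow_candidate_mask(s_image, core_labels=(3,), grow_labels=(2, 1)):
    """Grow one candidate object mask from an S image.

    Illustrative sketch only: seed the mask with all pixels carrying a
    core label, then repeatedly add 4-connected neighbours whose labels
    lie in core_labels or grow_labels, so that connectedness to the core
    is preserved.  Returns a boolean mask.
    """
    s = np.asarray(s_image)
    allowed = set(core_labels) | set(grow_labels)
    mask = np.isin(s, list(core_labels))
    frontier = deque(zip(*np.nonzero(mask)))        # start from the core pixels
    while frontier:
        r, c = frontier.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < s.shape[0] and 0 <= cc < s.shape[1]
                    and not mask[rr, cc] and int(s[rr, cc]) in allowed):
                mask[rr, cc] = True
                frontier.append((rr, cc))
    return mask

# Candidate (a): core = 3-pixel cluster; candidate (b): core = 3- and 2-pixel clusters.
# mask_a = grow_candidate_mask(S, core_labels=(3,), grow_labels=(2, 1))
# mask_b = grow_candidate_mask(S, core_labels=(3, 2), grow_labels=(1,))
```

The growth step only enumerates the candidates; choosing among them is exactly the ambiguity discussed above.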



FIG. 7. Results with TBX sequence (for frame pair 3 and 4): (a) AR picture, (b) MIM graphs, (c) S image, (d) frame 4 with object boundary marked (object mask developed from 3-pixel cluster), and (e) frame 4 with object boundary marked (object mask developed from 3- and 2-pixel clusters).

[Gray-level printouts omitted.]

FIG. 7.--Continued.

4.2. Rotation

Frames 1 and 2 of the ROT sequence are used as an Fp, Fc pair and the MIM obtained is shown in Fig. 9. There are no sharp peaks in the MIM, but the maximum is well defined. The position of the maximum is taken as the estimate of the translational component of motion and the corresponding S image is developed. Using frame 1 as a reference, we compute the degree of translational mismatch τ for a few successive frames. The τ plot obtained is shown as plot 1 in Fig. 10d. We obtain another τ plot from a sequence which is identical to the ROT sequence except that it depicts object motion with no rotational component (plot 2 in Fig. 10d). These two plots can be compared to see the effects of rotation on the τ plot. Figure 10 shows the results obtained with frames 1 and 2 of the TAXI sequence. The moving object in this sequence contains several regions with uniform gray levels. The overlapping homogeneous object regions disappear from the difference picture, and this fact contributes to the larger value of τ. The τ plot obtained for the TAXI sequence (plot 3 in Fig. 10d) shows a gradual increase of τ with time, indicating the presence of rotation. This plot is compared with the one (plot 4 in Fig. 10d) obtained from a sequence depicting the translational motion of a homogeneous object; the consecutive frames in this sequence contain overlapping object regions. The S image developed from the estimated translational component of motion for frames 1 and 2 of the TAXI sequence is shown in Fig. 10c. It contains 2- and 1-pixels in the overlapped regions of the object, which have disappeared in the AR picture.
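As an illustration of how such a τ plot can be produced, the sketch below approximates the degree of translational mismatch as the fraction of pixels in the overlap region whose gray levels still differ significantly after the estimated translation has been compensated. The threshold, the normalization, and the name estimate_translation in the usage comment are assumptions for the example, not the definitions used in the paper.

```python
import numpy as np

def translational_mismatch(ref, frame, shift, threshold=8):
    """Approximate degree of translational mismatch (tau) between a
    reference frame and a later frame, after compensating the estimated
    translation `shift` = (dy, dx).  Sketch only: tau is taken here as
    the fraction of overlap pixels whose gray levels still differ by
    more than `threshold`.
    """
    ref = np.asarray(ref, dtype=int)
    frame = np.asarray(frame, dtype=int)
    dy, dx = shift
    h, w = ref.shape
    # windows of the two frames that overlap under the shift (dy, dx)
    r0, r1 = max(dy, 0), min(h + dy, h)
    c0, c1 = max(dx, 0), min(w + dx, w)
    moved = frame[r0:r1, c0:c1]
    fixed = ref[r0 - dy:r1 - dy, c0 - dx:c1 - dx]
    return float((np.abs(moved - fixed) > threshold).mean())

# tau plot against frame 1, assuming some shift estimator is available:
# taus = [translational_mismatch(frames[0], f, estimate_translation(frames[0], f))
#         for f in frames[1:]]
# A tau that grows steadily with the frame index signals a rotational component.
```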



FIG. 8. Results with PIC sequence (for frame pair 2 and 3): (a) MIM graph, (b) S image, (c) frame 3 with object boundary marked (object mask developed from 3-pixel cluster), and (d) frame 3 with object boundary marked (object mask developed from 3- and 2-pixel clusters).

Because the rotational component is neglected, we can obtain only approximate object masks from the available S images. To obtain better results we need a method capable of handling rotation, such as the rotate-match procedure suggested in Section 3. Here again, we may be facing a multitude of candidates for object masks and need additional knowledge sources to assist in selection. We are currently working in this direction.
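One simple way a rotate-match step could be realized is to extend the shift search with a search over candidate rotation angles. The sketch below is such a brute-force variant, scoring mismatch over the full frame after the reference has been rotated and shifted; the angle and shift grids, the threshold, and the nearest-neighbour interpolation are illustrative choices and are not the procedure of Section 3.

```python
import numpy as np
from scipy.ndimage import rotate, shift as nd_shift

def rotate_match(ref, frame, angles, shifts, threshold=8):
    """Brute-force rotate-match sketch: try every (angle, shift) pair and
    keep the one minimizing the fraction of pixels whose gray levels
    still differ by more than `threshold`."""
    ref = np.asarray(ref, dtype=float)
    frame = np.asarray(frame, dtype=float)
    best = (np.inf, None, None)
    for ang in angles:
        turned = rotate(ref, ang, reshape=False, order=0, mode='nearest')
        for dy, dx in shifts:
            moved = nd_shift(turned, (dy, dx), order=0, mode='nearest')
            tau = float((np.abs(moved - frame) > threshold).mean())
            if tau < best[0]:
                best = (tau, ang, (dy, dx))
    return best  # (mismatch, angle, (dy, dx))

# Example search grid (illustrative):
# angles = range(-10, 11, 2)
# shifts = [(dy, dx) for dy in range(-4, 5) for dx in range(-4, 5)]
# tau, ang, shift = rotate_match(frame1, frame2, angles, shifts)
```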



FIG. 9. Results with ROT sequence: (a) MIM graph for frame pair 1 and 2; (b) S images for frame pairs (i) 1 and 2, (ii) 1 and 3, (iii) 1 and 4, and (iv) 1 and 5.

5. DISCUSSION

In this paper we have presented an approach to the segmentation of dynamic scenes which combines the strengths of local and global methods. We isolated the nonstationary regions using a differencing operation and, by applying a more global method such as the Hough transform, extracted the motion parameters for each region. In the segmentation phase, we resolved the local ambiguities on the basis of global consistency and developed masks for the moving objects. It has been shown [11] that the results obtainable by differencing and local analysis are in general approximate and are suitable for the peripheral phase. On the other hand, applying a global method such as the Hough transform to a scene containing several moving objects may not yield useful results because of the potential interference, in the parameter space, of the peaks contributed by different moving objects.

[Printouts and plots for Fig. 10 omitted; panel (d) shows plots 1-4 of τ against the frame pair.]

FIG. 10. Results with TAXI sequence: (a) MIM graph for frame pair 1 and 2, (b) AR picture for frame pair 1 and 2, (c) S image for frame pair 1 and 2, and (d) τ plots.

We resolved this difficulty by successive refinement. A crude segmentation based on differencing is used to determine the motion parameters of individual segments, which in turn are used to obtain a more refined segmentation. Here again we assign various confidence factors without overcommitting ourselves; with the availability of external knowledge sources, it is possible to obtain an even more refined segmentation. In our quest to combine the strengths of both local and global approaches, we have employed several heuristics which have yielded successful results when applied to many test scenes. We have not, however, completed rigorous studies on the behavior of these heuristics.
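The refinement step can be pictured as restricting the vote accumulation to one active region at a time. The sketch below, under the same illustrative assumptions as the earlier examples (a simple gray-level matching rule and an exhaustive grid of candidate shifts), accumulates votes for planar shifts using only the pixels of a single active region, so that peaks from other moving objects cannot interfere. It is a stand-in for the Hough-style stage, not the authors' implementation.

```python
import numpy as np

def region_shift_votes(prev_frame, curr_frame, region_mask, max_shift=5, threshold=8):
    """Vote accumulator over candidate planar shifts for one active region.

    Sketch only: each region pixel votes for a shift (dy, dx) if the gray
    level at its shifted position in the current frame matches its own
    gray level in the previous frame.  The peak of the accumulator is the
    region's translational motion estimate.
    """
    prev = np.asarray(prev_frame, dtype=int)
    curr = np.asarray(curr_frame, dtype=int)
    h, w = prev.shape
    ys, xs = np.nonzero(region_mask)
    votes = np.zeros((2 * max_shift + 1, 2 * max_shift + 1), dtype=int)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ty, tx = ys + dy, xs + dx
            ok = (ty >= 0) & (ty < h) & (tx >= 0) & (tx < w)
            matched = np.abs(prev[ys[ok], xs[ok]] - curr[ty[ok], tx[ok]]) <= threshold
            votes[dy + max_shift, dx + max_shift] = int(matched.sum())
    peak = np.unravel_index(int(votes.argmax()), votes.shape)
    return votes, (peak[0] - max_shift, peak[1] - max_shift)

# Running this once per active region keeps the peaks of different moving
# objects in separate accumulators, which is the essence of the refinement.
```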


We wish to emphasize that these results are obtained without requiring the prior segmentation of frames. Correlation-based and segment-matching approaches to dynamic scene analysis require the image of the moving object in at least one frame [16, 20] and perform poorly in the presence of occlusion. The presence of texture in the scene may make segmentation of the frames imperfect and may mislead the subsequent processes. We depended solely on motion to obtain the segmentation, which is why this approach performed well in a textured environment even in the presence of occlusion.

ACKNOWLEDGMENTS

The TAXI sequence was obtained from Professor H.-H. Nagel. The photographs of the RAND sequence were obtained with the assistance of Professor Ed Delp. The authors are thankful to both of them. The authors also thank Susan Haynes for several constructive comments.

REFERENCES

1. J. K. Aggarwal and R. O. Duda, Computer analysis of moving polygonal images, IEEE Trans. Comput. C-24, (10) 1975.
2. S. M. Anstis, Phi movement as a subtraction process, Vision Res. 10, 1975.
3. S. M. Anstis, The perception of apparent motion, Philos. Trans. Roy. Soc. London, Ser. B 290, 1980.
4. D. H. Ballard, Generalizing the Hough transform to detect arbitrary shapes, Pattern Recognition 13, (2) 1981.
5. S. T. Barnard and W. B. Thompson, Disparity analysis of images, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-2, 1980.
6. R. O. Duda and P. E. Hart, Use of the Hough transform to detect lines and curves in pictures, Commun. ACM 15, (1) 1972.
7. C. L. Fennema and W. B. Thompson, Velocity determination in scenes containing several moving objects, Computer Graphics and Image Processing 9, 1979.
8. J. J. Gibson, The Ecological Approach to Visual Perception, Houghton Mifflin, Boston, 1979.
9. R. Jain and H.-H. Nagel, On the analysis of accumulative difference pictures from image sequences of real world scenes, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-1, (2) 1979.
10. R. Jain, W. N. Martin, and J. K. Aggarwal, Segmentation through the detection of changes due to motion, Computer Graphics and Image Processing 11, 1979.
11. R. Jain, Extraction of motion information from peripheral processes, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-3, (5) 1981.
12. S. N. Jayaramamurthy and R. Jain, Segmentation of Textured Scenes Using Motion Information, IEEE Computer Society 9th Workshop on Applied Imagery Pattern Recognition, September 1980.
13. S. N. Jayaramamurthy and R. Jain, Segmentation of Textured Dynamic Scenes, Proceedings, Pattern Recognition and Image Processing, 1981.
14. J. D. Limb and J. A. Murphy, Estimating the velocity of moving images in television signals, Computer Graphics and Image Processing 4, 1975.
15. W. N. Martin and J. K. Aggarwal, Computer analysis of dynamic scenes containing curvilinear figures, Pattern Recognition 11, 1979.
16. H.-H. Nagel, Formation of an object concept by analysis of systematic time variations in the optically perceptible environment, Computer Graphics and Image Processing 7, 1978.
17. J. O'Rourke, Motion Detection Using Hough Techniques, Proceedings, Pattern Recognition and Image Processing, 1981.
18. T. Pavlidis, Structural Pattern Recognition, Springer-Verlag, New York, 1980.
19. J. W. Roach and J. K. Aggarwal, Computer tracking of objects moving in space, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-1, (2) 1979.
20. J. W. Roach and J. K. Aggarwal, Determining the movement of objects from a sequence of images, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-2, (6) 1980.
21. S. Tsuji, M. Osada, and M. Yachida, Tracking and segmentation of moving objects in dynamic line images, IEEE Trans. Pattern Anal. Mach. Intell. PAMI-2, (6) 1980.
22. S. Ullman, The interpretation of structure from motion, MIT Press, Cambridge, Massachusetts, 1979.
23. S. Yalamanchili, W. N. Martin, and J. K. Aggarwal, Differencing Operations for the Segmentation of Moving Objects in Dynamic Scenes, Proceedings, 5th International Joint Conference on Pattern Recognition, 1980.