Experiments on picture representation using regular decomposition


Computer Graphics and Image Processing, 68-105 (1976)

ALLEN KLINGER AND CHARLES R. DYER

Computer Science Department, School of Engineering and Applied Science, University of California, Los Angeles, California 90024

Communicated by A. Rosenfeld

Received December 23, 1974

The problem of building a computer-searchable data representation for a complex image, and the effect of representation on algorithms for scene segmentation into regions, is considered. A regular decomposition of picture area into successively smaller quadrants is defined, which involves logarithmic search. This hierarchical search and the resulting picture representation are shown to enable rapid access of image data without regard to position, efficient storage, and approximate structural descriptions of constituent patterns. Examples involving solid geometrical objects and alphabetic characters are given.

1. INTRODUCTION

Processing pictures by computer involves two central problems, segmentation and recognition. The segmentation task involves identifying subsets or extracting objects of interest from a scanned (digitized) picture. Implementation of this phase (preprocessing) means defining a process or set of processes to reduce the amount of picture data, since digitized pictures contain far too many points for a meaningful description of a scene [8]. This report focuses on the development of a preprocessing technique which is general enough to cover a large class of problems, and yet easy to implement in terms of computational complexity, storage requirements, and extensibility. To reduce the difficulty of subsequent recognition procedures, information must be discarded which is irrelevant to the current goal of classification. This data reduction can be achieved in two ways:

1. by restricting the problem domain to a limited number of possible inputs, and
2. through data reduction heuristics designed to eliminate particular kinds of irrelevant information.

Pattern recognition problems are goal-directed problem-solving activities which share with other areas of complex computer decision-making (artificial intelligence) the property of data "over-richness" and the need for a useful data representation.
Copyright © 1976 by Academic Press, Inc. All rights of reproduction in any form reserved.

In the pictorial domain, all decision activities involve search. One of us [1, 38] proposed to facilitate two-dimensional search by a regular decomposition


procedure to delete large noninformative areas of a picture or other array. In this paper we show how the resulting condensed picture can be:

1. evaluated by a top-down approach using a new algorithm which retains more marginal boundary data,
2. processed from a resulting data structure into recognition-oriented units such as isolated objects or properties (symmetry, orientation).

Harmon [36] has shown that area-partitioned information is sufficient for people in the recognition of human faces. The human visual system has many low-level operations which occur for all contexts [9, 28]. These "front-end" operations reduce the combinatorics, and hence complexity, of human scene analysis by eliminating all but a relatively few areas in a search for objects in a scene. A subsidiary purpose of this paper is to show that the decomposition scheme of geographic partitioning yields similar front-end context-independent qualitative information regarding picture characteristics such as pattern orientations, sizes, and shapes.

Separation of objects from their background is basic in pattern recognition and scene analysis [8, 14, 29]. A large class of algorithms employs edge detection and line fitting procedures for direct object identification [30-32]. Others have used region-growing methods and subsequent identification of separate objects [19, 33-35]. However, such algorithms seem to suffer in two prominent ways. First, they involve a complex and exhaustive point-by-point search of the entire picture domain to identify objects in a scene. Second, these preprocessing techniques involve "bottom-up" segmentation: the use of raster point values to build up a global picture description. Local noise and other aberrations greatly influence the control and efficiency of such algorithms [19].

Regular decomposition has been employed by researchers in computer graphics, scene analysis, architectural design [6], and pattern recognition. Warnock's [3-5] hidden surface elimination algorithm subdivides successively finer picture squares while searching for areas which are simple enough to display graphically.
SRI's mobile automaton [7] utilizes a "grid model" which similarly subdivides the automaton's visual environment to an arbitrary degree of precision for determining the feasibility of a proposed journey path.

To motivate the decomposition scheme we will discuss processing a digitized aerial photograph. Large areas of the original scene are rolling green hills with little information content. Most information is in a portion, such as a town, which can be viewed as a complex subproblem consisting of subsets which may be objects such as buildings, swimming pools, and parking lots. After these are assimilated the processing can move to other areas of the picture looking for subordinate structures such as an intercity road. This way of processing the photograph spends little or no time looking at simple areas. Complex areas define the contents of the photograph and are analyzable as subproblems, each requiring a solution. These subproblems are reduced into further subproblems until they are either solved or a time limit is reached.

The body of this presentation is devoted to the description of the improved regular-decomposition image-processing algorithm. It is used to obtain informative (complex) portions of a picture in several examples, for which hierarchical picture representations were obtained. This data structure and its potential for efficient handling of scene processing functions are discussed. Specific labels are defined to aid in traversing the data structure (essentially a tree of quadrants of the original picture, each possibly subdivided into four successors). Label values indicate the extent of informative parts of the picture within a given quadrant. This information is used to declare some subquadrants of adjacent quadrants to be neighbor quadrants (possibly containing extensions of the object in the quadrant, which are to be kept at finer picture decompositions, i.e., deeper in the tree). Heuristics for searching neighbor quadrants for added linkage information are described, as are methods for merging quadrants into regions to approximate structural scene relationships (e.g., orientation). Finally, computational examples are discussed and measurements of storage, computation time, and object detection and separation ability are given.

2. REGULAR DECOMPOSITION

The concept of the decomposition algorithm is:

1. Represent a digitized picture as spatial subsets of different size marked either "informative for scene description" or "noninformative."
2. Discard picture elements (pixels) that belong to "noninformative" subsets.

Note that this contrasts with procedures which test whether picture elements are part of an object. Initially, the entire digitized picture (a two-dimensional array of gray level or light intensity values) is a quadrant. Three possibilities exist when the algorithm looks at a quadrant:

1. Nothing informative is contained there. (This will be the case for large homogeneous areas. Such areas may be eliminated from the data structure without loss of picture information.)
2. A large amount of information is found in the quadrant. (Many lines, vertices, and regions with diverse textures are found. Picture elements in the quadrant should be saved in the data structure of the reduced picture.)
3. An intermediate amount of information is present (not enough to make a definite decision).

If the algorithm fails to make a decision about a picture quadrant, it is subdivided and then each of the four subquadrants is processed by the same procedure. The subdivision process is applied recursively until either no failures occur or else the quadrant size becomes equal to the smallest resolvable point of the picture (one pixel). The process of regular decomposition is thus a logarithmic search for picture areas where "informative" data are present. The algorithm builds a tree by hierarchically examining a picture's contents. Each area is assigned an importance based on how informative it is judged to be. Measures of information may be coarse (total intensity or total local pseudogradient) or may involve complex calculations or previously stored templates. We will discuss functions


Fig. 2.1. A diagonally oriented object.

for discriminating picture information after an example which illustrates the regular decomposition process. Figure 2.2 shows the regular decomposition tree which results from a picture containing a diagonally oriented object located along the smallest squares in Fig. 2.1. Dashed lines in Fig. 2.2 represent regions whose pixels are eliminated, and these are shown as nonsubdivided areas in Fig. 2.1. The decimal labels of nodes in Fig. 2.2 correspond to picture areas (squares of different sizes in Fig. 2.1) in a way that is discussed in detail in Section 3. The choice of an appropriate discrimination function is clearly one critical issue in this decomposition process. If the function is relatively simple, then few subquadrants may be processed and only a small amount of information discarded. If the function is complex, then more quadrants may be processed and computation per quadrant may be high. However, there cannot be a general method for defining area importance. Informative values must depend on context and also on the required picture description [8]: importance must be defined in both syntactic and semantic terms. Two simple methods have been tested so far which utilize only syntactic information.

1. Thresholding the picture itself. Intensity of grey level is the coarse picture importance parameter. (This is the natural approach in several cases: automatic character recognition (black characters, white page); chromosome analysis (stained chromosomes darker than their background); reconnaissance photographs (clouds whiter than terrain).)

Fig. 2.2. Regular decomposition tree for Fig. 2.1.

Fig. 3.1. A 4 × 4 matrix A.

2. Edge and curve detection. The number of edges is the picture importance parameter. (For pictures not composed of line drawings, well-known methods, including calculating and thresholding the local pseudogradient [8], can be used to isolate edges.)

A quadrant being subdivided has previously been labeled "intermediate amount of information." Such a quadrant is treated as a new unknown picture which is to be partitioned. The importance of all four subquadrants is computed relative to their parent quadrant. With intensity as the coarse picture parameter, we define the relative importance d(x, y) of a subquadrant x within a quadrant y as

d(x, y) = intensity of subquadrant x / intensity of quadrant y.

Thus, d(x, y) represents the proportion of the information of quadrant y found in subquadrant x. (A measure of importance relative to the surrounding area is convenient since it is independent of the depth in the tree.) The discrimination function we will use will be two thresholds of the relative importance of quadrant x in picture y. In terms of the threshold values w1 and w2, a quadrant with:

w2 ≤ d(x, y) ≤ 1.0 will be termed informative;
w1 ≤ d(x, y) < w2 will be termed not sure;
0.0 ≤ d(x, y) < w1 will be termed noninformative.
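This two-threshold discrimination function is simple enough to state directly in code. The sketch below is our illustration, not the authors' program; the function name mu and the string labels are assumptions:

```python
def mu(d, w1, w2):
    """Classify the relative importance d(x, y) with thresholds w1 < w2."""
    if d >= w2:
        return "informative"
    if d >= w1:
        return "not sure"
    return "noninformative"
```

With the values w1 = 0.10, w2 = 0.25 used in several of the experiments of Section 6, a subquadrant holding 30% of its parent's intensity is kept, one holding 15% is subdivided further, and one holding 5% is discarded.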
3. DATA STRUCTURES

3.1. Trees and Digitized Pictures

Digitized pictures are commonly represented as two-dimensional arrays, where each element of the array contains some information about the corresponding area of the image space being viewed. It is well known that such an array can be thought of as a special case of a tree structure [37]. For example, Figs. 3.1 and 3.2 give two representations of a 4 × 4 matrix, the latter explicitly displaying the relationships of elements in the same row in a tree. However, this tree does not display all the structure of the matrix (column relationships are omitted), and a similar tree based on column relations omits row information.

Fig. 3.2. Tree of matrix A decomposed by rows.

Fig. 3.3. Tree partition of matrix A by areas.

Since picture elements are usually obtained from horizontal raster scan lines, row-oriented data structures and algorithms utilizing line-by-line "slices" are common in picture processing [13, 19, 40]. However, general properties of picture classes are unlikely to be represented in linear (row) form [17], since key geometrical, topological, structural, and metric constraints are usually involved. Because a data structure should reflect prior knowledge about properties contained within the data base, representations which facilitate computer search for picture properties in areas should be preferable. A tree structure which directly reflects areas (rather than rows) for the above 4 × 4 matrix is shown in Figs. 3.3 and 3.4. This structure can be conveniently represented for computation by the Dewey decimal (library classification) notation:

1.1    A upper left corner
1.1.1  A(1, 1)
1.1.2  A(1, 2)
1.1.3  A(2, 1)
1.1.4  A(2, 2)
1.2    upper right corner
...
1.4.4  A(4, 4)

An image can be represented by a tree containing only nodes where a subquadrant has been found to be important or nonterminal (not nonimportant). Hierarchic levels within the tree contain information regarding the structure of

Fig. 3.4. One level of regular decomposition.


patterns in the image (e.g., see Figs. 2.1 and 2.2, where the "upper-left-corner diagonal pattern" can be recognized when node P.A is reached and the presence of only the successors P.A.A and P.A.D noted). Component parts of patterns may be identical (e.g., letters "P" and "B"), so that partial tree similarities may be a useful recognition aid. This leads to a decision to build algorithms which traverse a reduced picture's tree by preorder [37]. Visiting the nodes of the tree in this order permits all successors of a node to be examined before it is examined. Since successors represent smaller picture zones, the tree and a preorder traversal algorithm break the overall scene analysis task into the solution of a series of subproblems.
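The traversal described above, in which all successors of a node are examined before the node itself, can be sketched as follows (a minimal illustration; the QNode class and its field names are our assumptions, not the authors' data structure):

```python
from dataclasses import dataclass, field

@dataclass
class QNode:
    """One quadrant of a reduced picture's tree, with up to four successors."""
    label: str
    children: list = field(default_factory=list)  # ordered A, B, C, D

def traverse(node, visit):
    # Examine all successors of a node before the node itself, so each
    # smaller picture zone is handled as a subproblem of its parent.
    for child in node.children:
        traverse(child, visit)
    visit(node)
```

For the tree of Fig. 2.2, this visits P.A.A and P.A.D before P.A, and P.A and P.D before the root P.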

3.2. Trees and Regular Decomposition

The data structure of Fig. 3.3 is a regular decomposition of the picture area by successive partitioning into quadrants. The result is a tree where each father node has at most four successor nodes. These are ordered arbitrarily by: upper-left, upper-right, lower-left, and lower-right. Following the notation of [1, 38], these subquadrant areas will be called A, B, C, and D, respectively, and the alphabetic label attached via the Dewey system notation to locate the successors of a node. This is illustrated in Fig. 3.4 for a single level of decomposition and in Fig. 3.5 for six levels, and formalized by the following definitions.

DEFINITION 1. A Q-tree is a finite set of nodes which is either empty or consists of a quadrant and at most four disjoint Q-trees. This recursive definition of Q-tree is analogous to Knuth's definition of binary tree [37], except here each node has exactly four subtrees.

Fig. 3.5. Complete regular decomposition tree to level 6: leaf nodes are pixels if P is digitized to a 64 × 64 array. Number of nodes at level i = 4^i; maximum tree depth for a 64 × 64 array = 6; total number of nodes in the tree = Σ_{i=0}^{6} 4^i = 5461 nodes.
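The totals quoted in Fig. 3.5 can be verified by direct summation; a one-line check:

```python
# Sum of 4^i over levels 0..6 of a complete regular-decomposition tree:
assert sum(4 ** i for i in range(7)) == 5461
# Level 6 alone holds 4^6 = 4096 leaf nodes, i.e., the 64 x 64 pixels:
assert 4 ** 6 == 64 * 64
```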


TABLE 1
Picture Quadrant Contents Locating Procedure

Procedure CONVERT(p, m, n). [The listing is illegible in this scan; the procedure computes the absolute coordinates of a quadrant in the picture from its Dewey decimal label.]
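A sketch of the coordinate computation that Table 1's CONVERT procedure performs (the Python function below is our reconstruction, not the original listing; its name, the 1-indexed inclusive ranges, and the quadrant-letter conventions are assumptions):

```python
def locate(label, m, n):
    """Bounding box (x_min, x_max, y_min, y_max), 1-indexed and inclusive,
    of the quadrant named by a Dewey label such as "P.A.D" in an m x n picture.
    x runs down the rows, y across the columns; A, B, C, D are the upper-left,
    upper-right, lower-left, and lower-right subquadrants."""
    x_min, x_max, y_min, y_max = 1, m, 1, n
    for letter in label.split(".")[1:]:          # skip the root label P
        x_lo = (x_min + x_max) // 2              # floor of the midpoint
        x_hi = -(-(x_min + x_max) // 2)          # ceiling of the midpoint
        y_lo = (y_min + y_max) // 2
        y_hi = -(-(y_min + y_max) // 2)
        x_min, x_max = (x_min, x_lo) if letter in ("A", "B") else (x_hi, x_max)
        y_min, y_max = (y_min, y_lo) if letter in ("A", "C") else (y_hi, y_max)
    return x_min, x_max, y_min, y_max
```

For a 64 × 64 picture, locate("P.A", 64, 64) gives the upper-left 32 × 32 quadrant, and each further letter halves the ranges again.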

DEFINITION 2. A quadrant is the root of a Q-tree. Quadrants are nodes which can be labeled by a set of properties C.

DEFINITION 3. A picture is a quadrant labeled P.

DEFINITION 4. A leaf quadrant is a quadrant whose successor Q-trees are all empty.

DEFINITION 5. A reduced picture is the set of all leaf quadrants in a Q-tree whose root is labeled P.

DEFINITION 6. A regular decomposition of a picture P is a Q-tree with P at the root and the reduced picture of P at the leaves.

Informally, a quadrant is an image area, rectangular or square, characterized by relative position, size, and intensity. The absolute location of a quadrant in the picture is obtainable from its Dewey decimal label [39, 43]: see the procedure in Table 1; however, passing down properties C as the tree is established is easier and quicker for computing quadrant position:

C = ⟨Intensity, x_min, x_max, y_min, y_max⟩

The properties in C can be recomputed for the four successor subquadrants by Rule 1, which divides each side of the quadrant in half and assigns new vertices to the subquadrants using the functions floor (⌊ ⌋) and ceiling (⌈ ⌉) to obtain integer values of boundary line positions and hence disjoint subquadrants (see Fig. 3.6).

Fig. 3.6. Boundaries of the four subquadrants of quadrant a defined by Rule 1.

RULE 1. Let a be the label of a quadrant whose properties are

C_a = ⟨I_a, x_a min, x_a max, y_a min, y_a max⟩.

Then the root labels and properties of its four subquadrants are:

1. a.A: C_a.A = ⟨I_a.A, x_a min, ⌊(x_a min + x_a max)/2⌋, y_a min, ⌊(y_a min + y_a max)/2⌋⟩;

2. a.B: C_a.B = ⟨I_a.B, x_a min, ⌊(x_a min + x_a max)/2⌋, ⌈(y_a min + y_a max)/2⌉, y_a max⟩;

3. a.C: C_a.C = ⟨I_a.C, ⌈(x_a min + x_a max)/2⌉, x_a max, y_a min, ⌊(y_a min + y_a max)/2⌋⟩;

4. a.D: C_a.D = ⟨I_a.D, ⌈(x_a min + x_a max)/2⌉, x_a max, ⌈(y_a min + y_a max)/2⌉, y_a max⟩,

where in each case the intensity is the sum I = Σ I(i, j) taken over the picture elements (i, j) lying within that subquadrant's boundaries.


The program which performs regular decomposition implements the definitions: a picture P to be processed begins as an m × n digitized image; Rule 1 is applied to it recursively and the discrimination function of Section 2 is used to decide which, if any, of the four successor Q-trees should be included. If μ(d(x, y)) = noninformative, then y's successor Q-tree with root labeled x will be empty. If μ(d(x, y)) = informative, then y's successor Q-tree with root labeled x consists of the single leaf quadrant x. If μ(d(x, y)) = not sure, then y's successor Q-tree with root labeled x is a Q-tree consisting of at least one node; here, Rule 1 will be applied to quadrant x as it was to its parent quadrant y. The final result is a tree structure with P as the root node, "not sure" quadrants making up the nonterminal nodes and "informative" quadrants comprising the leaf nodes. Picture elements of P within the boundaries of the leaf nodes are retained. These constitute the reduced picture.
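A minimal sketch of this recursion (our illustration in Python, not the authors' PL/C program; the default thresholds echo values used in Section 6, and all names are assumptions):

```python
def decompose(image, w1=0.10, w2=0.25):
    """Regular decomposition of a square gray-level image into the leaf
    ("informative") quadrants described in Section 2.  Returns a list of
    (x_min, x_max, y_min, y_max) tuples, 0-indexed and inclusive."""
    def intensity(x0, x1, y0, y1):
        return sum(image[i][j] for i in range(x0, x1 + 1)
                               for j in range(y0, y1 + 1))

    leaves = []

    def process(x0, x1, y0, y1, parent_intensity):
        d = intensity(x0, x1, y0, y1) / parent_intensity if parent_intensity else 0.0
        if d < w1:                                   # noninformative: discard
            return
        if d >= w2 or (x0 == x1 and y0 == y1):       # informative, or a single pixel
            leaves.append((x0, x1, y0, y1))
            return
        subdivide(x0, x1, y0, y1)                    # "not sure": recurse

    def subdivide(x0, x1, y0, y1):
        q = intensity(x0, x1, y0, y1)
        xm, ym = (x0 + x1) // 2, (y0 + y1) // 2      # midpoint split as in Rule 1
        for cx0, cx1 in ((x0, xm), (xm + 1, x1)):
            for cy0, cy1 in ((y0, ym), (ym + 1, y1)):
                if cx0 <= cx1 and cy0 <= cy1:        # skip degenerate halves
                    process(cx0, cx1, cy0, cy1, q)

    n = len(image)
    subdivide(0, n - 1, 0, n - 1)                    # the root is always subdivided
    return leaves
```

On a 4 × 4 image whose only bright pixels fill the upper-left 2 × 2 corner, the whole object is returned as a single leaf quadrant and the three empty quadrants are discarded without further subdivision.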

3.3. Unconditional Partitioning to a Tree Level

Digitized pictures containing on the order of 10^6 picture elements (1024 × 1024 array), which cannot be stored in fast memory, can be processed by unconditional decomposition followed by sequential processing of all the subpictures so obtained. The preceding techniques, including the discrimination function, can then be applied to each subpicture. For example, let P be a 1000 × 1000 array, where each picture element is stored in two bytes, so that to store P, 2000K bytes of storage are needed. For 200K of fast memory, an unconditional partition of P to level 5 (P.A.A.A.A.A through P.D.D.D.D.D) yields 1024 subpictures which could each be regularly decomposed. A global picture description could then be built up using the subpictures' trees. Preprocessing to obtain the subpictures via the unconditional partitioning is necessary to obtain area coherence for subsequent processing. Although a conventional raster scan data base could be more easily partitioned into "line subpictures" to fit into fast memory, picture information is not linear. Quadrant subpictures represent compact areas of picture elements. There, points generally have all their eight-neighbor points as well. (For example, in a picture containing m × n points, only 2(m + n) - 4 of these do not have all eight-neighbor points present, while no points in a single raster scan line have all eight.)
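The storage arithmetic in this example can be checked directly (sizes taken from the text):

```python
# Sizes from the Section 3.3 example: a 1000 x 1000 picture, two bytes per element.
m = n = 1000
total_k_bytes = m * n * 2 // 1000            # 2000K bytes to hold all of P
subpictures = 4 ** 5                         # unconditional partition to level 5
k_bytes_each = total_k_bytes / subpictures   # under 2K each: fits 200K fast memory easily
border_points = 2 * (m + n) - 4              # points lacking a full eight-neighborhood
print(total_k_bytes, subpictures, border_points)  # prints: 2000 1024 3996
```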

3.4. Advantages of Regular Decomposition

The advantages of regular decomposition in image processing are:

1. Via unconditional partitioning, pictures which are physically too big to store in fast memory at one time can be processed as a sequence of subpictures extracted during preprocessing.
2. Regular decomposition enables addressing for rapid access to any geographical part of the image.
3. Regular decomposition retains explicitly in the data structure a hierarchical description of picture patterns, elements, and their relationships. Hence, this scheme may also be used in conjunction with syntactic pattern recognition algorithms [22-25].


4. Representations permit recursive analysis of subpictures.
5. The decomposition algorithm contains major routines (traversal, tree creation) which are independent of image class. Small changes can adapt regular decomposition to widely different types of pictures.
6. The resultant tree data structure distinguishes object from nonobject (or background) and enables processing to locate separate objects.

4. NEIGHBOR QUADRANTS

The motivation for introducing neighbor quadrants is due to two observations regarding the regular decomposition scheme. First, decomposition into quadrants is arbitrary. Picture areas and the dividing lines imposed may combine to slice single objects into fragments which could be separately "noninformative." This was observed in tests by Omolayole [41], where absence of some edges found in the original image led to incorrect linkage information about objects. Second, to detect and extract objects from background (distinguish relevant data), the discrimination function uses prespecified parameters. For some values a reduced picture with incomplete information may result (i.e., objects present in the original image are unrecognizable or completely absent in the reduced picture). Search of neighbor areas should deal effectively with both factors and yield improved reduced pictures. The regular decomposition algorithm has been defined as a decision process involving quadrant a and its four subquadrants (a.A, a.B, a.C, and a.D). A tree node a remains "active" if at least one successor subquadrant is added (found "not noninformative"), and this is evaluated by two-level processing. The algorithm resembles a front-end "field of vision" which can be bolstered by neighbor quadrants via a three-level algorithm. Table 2 shows quadrant configurations (i.e., the informative or "not sure" subquadrants) and the corresponding neighbor subquadrants which were used in experiments discussed in Section 6.
Other sets of templates could be used to detect specific shapes. The following example motivates our selection of these neighbor subquadrants. Refer to quadrant configuration 5 in Table 2 and let a be the enclosing quadrant. Then

d(a.A, a) ≥ w2,  d(a.B, a) < w1,  d(a.C, a) < w1,  d(a.D, a) ≥ w2,

indicating a general diagonal orientation of the underlying object(s). This suggests that the neighbor subquadrants a.B.C and a.C.B are likely areas to search for additional picture information, since these fill in the arbitrary partitioning of a into only two subquadrants a.A and a.D. This is shown in Table 2 by crosshatching, where the level-three border-softening quadrants may become "active" if any or all of them pass a revised evaluation test. The neighbor quadrants added to the tree must have a dummy father node inserted (their true ancestor node must be "noninformative," hence not contained in the tree).
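The revised evaluation test applied to such neighbor subquadrants is described later in this section: the neighbor's intensity is rated against the average intensity of the informative quadrants that suggested it, with both thresholds lowered by ε. A minimal sketch under those assumptions (the function and parameter names are ours):

```python
def neighbor_class(neighbor_intensity, suggesting_intensities, w1, w2, eps):
    """Rate a neighbor subquadrant (e.g., a.B.C) against the average intensity
    of the informative quadrants that suggested it (here a.A and a.D)."""
    reference = sum(suggesting_intensities) / len(suggesting_intensities)
    d = neighbor_intensity / reference if reference else 0.0
    if d >= w2 - eps:                 # thresholds lowered by eps
        return "informative"
    if d >= w1 - eps:
        return "not sure"
    return "noninformative"
```

A neighbor passing either of the first two tests would be added to the tree under a dummy father node, as described above.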


TABLE 2
A Set of Neighbor Subquadrant Templates Suggested by the Coarse Structural Configuration of Quadrants. [Each row pairs a quadrant configuration with its neighbor subquadrants; the pictorial templates are not reproducible in this scan.]

The revised function for determining neighbor subquadrant importance is defined relative to both a.A and a.D in this example (configuration 5, Table 2). Since successor nodes contain information regarding orientation within a quadrant,


the discrimination function μ for neighbor quadrants is the average importance value of "informative" and "not sure" quadrants. This yields the relative informative value of the area and enables rating neighbor quadrants by cross-quadrant information. A more refined technique would be to evaluate only those local pixels which are on certain borders of quadrants, as indicated by the specific quadrant configuration. However, the increased computation time required to improve the decision process did not seem warranted, and consequently this global (averaging) method was used. For example, with intensity the coarse picture parameter, we found

d(a.B.C, {a.A, a.D}) = intensity of neighbor subquadrant a.B.C / ((intensity of a.A + intensity of a.D)/2).

Because subquadrants contain fragmentary information which involves linkage of quadrants, the thresholds w1 and w2 in the discrimination function μ should be lowered:

μ(d(x, {y1, ..., yn})) = informative,      w2 - ε ≤ d(x, {y1, ..., yn}) ≤ 1.0;
                         not sure,         w1 - ε ≤ d(x, {y1, ..., yn}) < w2 - ε;
                         noninformative,   0.0 ≤ d(x, {y1, ..., yn}) < w1 - ε,

where ε is the desired percentage decrease in the threshold constants w1 and w2. After each application of Rule 1, the appropriate template in Table 2 is found by the program and the neighbor quadrants inspected. If d(x, {y1, ..., yn}) ≥ w1 - ε, then that subquadrant is added to the tree in the normal manner as either a "not sure" or "informative" quadrant with a "dummy" father node. Subsequently, this neighborhood quadrant will be treated no differently from a quadrant obtained directly by Rule 1 during further decomposition. Different values of the threshold parameters w1, w2, and ε were tested for several image classes and the experimental results are discussed in Section 6.

5. REGION APPROXIMATION USING QUADRANT CONNECTIVITY

Once the picture tree structure has been built, further processing can obtain a description of objects or connected areas in the scene in terms of a specific subtree. A single object in the image is now covered by one or more adjacent leaf quadrants. Thus, the problem of recognizing objects reduces initially to the problem of deciding whether two leaf quadrants belong or do not belong to the same object. Further processing is necessary when a single leaf quadrant covers several objects. To obtain leaf quadrants which could be from the same object it is necessary to "prune" the tree. The reduced tree must retain only subtrees whose nodes correspond to spatially connected quadrants. This is done by first sorting to obtain geographically separate groups of leaf quadrants; this follows directly from the actual configuration of subtrees in the data structure, as can be seen from the example of two objects and their resulting tree structure (three distinct subtrees) of Fig. 5.1. After the sort groups the leaf quadrants, further processing can be done within a group, or of two related (e.g., by symmetry) groups.


Fig. 5.1. Tree structure of two regions: three subtrees which can be merged.

Implementation of the pruning procedure to group leaf quadrants which are geographically connected was done by quadrant rather than by using the specific tree configurations. Quadrants are connected into groups using the eight-neighbor connectedness criterion discussed by Rosenfeld [10, 11, 26]. The following definitions provide a formal description of the quadrant accumulation method actually implemented in the program.

DEFINITION 7. Given a point P0(i, j), the eight-point neighborhood of P0 is {(i-1, j-1), (i-1, j), (i-1, j+1), (i, j-1), (i, j+1), (i+1, j-1), (i+1, j), (i+1, j+1)}. In Fig. 5.2, P1, P2, ..., P8 represent these points.

DEFINITION 8. Two quadrants Q1 and Q2 are connected if there exist any two points P1 ∈ Q1 and P2 ∈ Q2 such that P1 is contained in the eight-point neighborhood of P2.

DEFINITION 9. Let X = {Q1, Q2, ..., Qn} be a set of extracted picture quadrants. Then a region of quadrants (or region) R is a subset of X such that R = {Qi1, Qi2, ..., Qim}, where {i1, i2, ..., im} ⊆ {1, 2, ..., n} and (∀j: 0 < j ≤ m)((∃k: 0 < k ≤ m)(ij ≠ ik and Qij connected to Qik)) ∧ (¬∃P ∈ {X - R})(P connected to Qij).

Quadrants are connected into regions by checking the border points around each quadrant. Using the vertex information contained in C for each quadrant (x_min, x_max, y_min, y_max), straightforward application of Definitions 8 and 9 to the set of extracted picture quadrants yields connected picture regions from leaf quadrants. Assume that regular decomposition resulted in a tree structure containing quadrant leaves Q1, Q2, ..., Qm, where m ≥ 1. Then after merging connected quadrants there are regions R1, R2, ..., Rn, where m ≥ n ≥ 1, and these regions define the gross topological picture patterns [27].

Fig. 5.2. Eight neighbors of P0.
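For axis-aligned quadrants, Definition 8 reduces to a bounding-box test: two quadrants are eight-connected exactly when their boxes overlap after each side is allowed a one-point gap. A minimal grouping sketch under that reading (function names are ours, not the program's):

```python
def connected(q1, q2):
    """Definition 8 for quadrants q = (x_min, x_max, y_min, y_max):
    some point of q1 lies in the eight-point neighborhood of some point of q2,
    i.e., the boxes are within distance 1 on both axes."""
    return (q1[0] <= q2[1] + 1 and q2[0] <= q1[1] + 1 and
            q1[2] <= q2[3] + 1 and q2[2] <= q1[3] + 1)

def regions(quads):
    """Group leaf quadrants into maximal connected regions (Definition 9)."""
    remaining = list(quads)
    groups = []
    while remaining:
        group = [remaining.pop()]
        grew = True
        while grew:                      # absorb anything touching the group
            grew = False
            for q in remaining[:]:
                if any(connected(q, g) for g in group):
                    remaining.remove(q)
                    group.append(q)
                    grew = True
        groups.append(group)
    return groups
```

This pairwise scan costs O(k^2) comparisons for k leaf quadrants; k is small in the experiments reported here, so nothing faster (e.g., union-find) is needed.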

6. EXAMPLES OF APPLICATIONS

A number of pictures were "hand-digitized" in 64 × 64 and 32 × 32 arrays of grey levels, ranging from 0 to 9, to illustrate the features of regular decomposition. Programming was done in PL/C at the UCLA Campus Computing Network and UCLA Health Sciences Computing Facility; the program listing is in [44]. This section reports the results of computational experiments designed to determine the success and efficiency of the ideas presented here over different images and parameter values. The reduced pictures discussed here are shown in the Appendix and were produced by a line printer using overprinting. The examples presented here are block scenes and alphabetic characters. Intensity was the coarse picture importance parameter used [38]. Five parameters, w1, w2, ε, with/without neighborhood quadrants, and the starting level in the tree for applying the discrimination function, were varied in the experiments. The program efficiency is represented by the tree size and density (number of nodes present divided by number of nodes in a complete tree of the same depth), data reduction (percentage of picture elements deleted), and information-saved

Image       Fig.  Neighbor  w1    w2    e     Start  Max    Max no.   No. of  No. of  % pic.  % pic.   % obj.  Pic.
                  quads                       level  depth  of nodes  nodes   leaves  area    intens.  area    compact.
                                                            possible                  kept    kept     lost    ratio

letters     8b    yes       .10   .25   .05   1      4      341       19      12      30      100      0       .20
letters     8c    no        .10   .25   .05   1      2      21        7       4       25      82       18      .18
letters     8d    yes       .10   .25   .05   2      4      341       35      20      27      100      0       .16
letters     8e    yes       .10   .50   .05   1      5      1365      87      48      14      100      0       .03
letters     8f    yes       .20   .40   .10   1      5      1365      21      12      28      96       4       .18
blocks      4b    yes       .10   .25   .05   1      5      1365      21      12      38      100      0       .28
blocks      4c    no        .10   .25   .05   1      3      85        15      8       36      94       10      .28
blocks      4d    yes       .10   .25   .05   2      5      1365      40      21      22      100      0       .13
blocks      4e    yes       .10   .50   .05   1      7      21845     51      28      32      100      0       .22
blocks      4f    yes       .20   .40   .10   1      6      5461      25      14      33      100      0       .23
polyhedron  1b    yes       .10   .25   .05   1      4      341       35      22      53      99       0       .15
polyhedron  1c    no        .10   .25   .05   1      3      85        23      14      50      93       4       .15
lines       11b   yes       .10   .25   .05   1      2      21        17      12      75      93       7       .34
lines       11c   no        .10   .25   .05   1      2      21        17      12      75      93       7       --
lines       11d   --        .10   --    .05   2      4      341       141     --      75      --       --      --


TABLE 3a

                       Letters        Blocks
Corresponding figure   8d     8e      4d     4e
w2                     0.25   0.50    0.25   0.50
Maximum tree depth     4      5       5      7
% increase                25.0           40.0

(percentage of object elements retained) statistics. The experimental results and parameter values are given for all examples tested in Table 3, and the following discussion refers to spot entries in that table. An explanation of Table 3 follows, and some sections of the table are included in the text. Columns 1 and 2 in Table 3 label the specific pictorial example shown in the Appendix; there the input and extracted pictures are shown. Columns 3-7 give the values of the five program parameters. Columns 8-11 give information about the picture structure resulting from regular decomposition: the maximum tree depth of the structure (col. 8); the maximum number of nodes which could be in a complete Q-tree of that depth, computed as 4^0 + 4^1 + ... + 4^n, where n = maximum tree depth (col. 9); the actual number of nodes in the picture structure (col. 10); and the number of those that make up the extracted picture (col. 11). All the blank parts of the extracted pictures are points which have been eliminated from the picture structure by decomposition. Column 12 gives the percentage of picture area saved and column 13 tells what percentage of the original image intensity has been kept. These two statistics summarize the data reduction capabilities of the algorithm. Column 14 describes approximation error computed as the percentage of object points lost in the reduced picture (i.e., the percentage of picture points part of some object which have been eliminated from the data structure). ["Object-information-lost" approximation errors and "noninformative-pixels-saved" approximation errors are combined in column 15 (see below).] To include both types of approximation errors we have defined a "picture compaction ratio." This ratio is obtained by adding the number of picture points

TABLE 3b

                                         Letters      Blocks       Polyhedron
Corresponding figure                     8b    8c     4b    4c     1b    1c
Neighbor quads                           yes   no     yes   no     yes   no
% object area lost                       0.0   18.0   0.0   10.0   0.0   4.0
No. of nodes                             19    7      21    15     35    23
% picture area deleted (storage saved)   70.0  75.0   62.0  64.0   47.0  50.0


TABLE 3c

                       Letters            Blocks
Corresponding figure   8b    8f    8e     4b    4f    4e
w2 - w1                0.15  0.20  0.40   0.15  0.20  0.40
Max. tree depth        4     5     5      5     6     7
No. of nodes           19    21    87     21    25    51

which should have been saved (but were not) since they are part of some object, plus the number of points which are really noninformative but have been kept in the extracted picture, divided by the total number of points in the original image. The experimental results are listed in column 15. A perfect extracted picture would have no informative points lost and no noninformative points saved, yielding a 0.0 compaction ratio. The worst structure would contain only the noninformative points and none of the informative ones, yielding a ratio of 1.0.

The discussion of the figures themselves follows. The ordering of figures in each appended example is: the input digitized image, a series of extracted pictures which use various values of the five parameters, the geographically separate regions, and finally the leaf quadrants that make up each region, for one extracted picture.

Figure 1a shows a 64 X 64 image of a polyhedron. Figures 1b and 1c show the extracted picture quadrants after decomposition with and without neighbor quadrants, respectively. Table 6.1 shows the considerable reduction in the size of the tree and the marked increase in the number of informative edge points of the polyhedron that were lost as a result of not checking neighbor quadrants. Figure 2 shows the quadrants and their properties, C, that make up the single region in the extracted picture of Fig. 1b. The trace of decomposition given in Fig. 3 illustrates the recursive algorithm and the addition of neighbor quadrants into the tree building process.

Figure 4a shows solid objects of the type commonly used in scene analysis experiments [31]. Figures 4b-4f illustrate the variety of extracted pictures we can obtain by varying the parameters. Figures 4b and 4c used identical parameter values except that 4b inspected neighbor quadrants while 4c did not. The utility of this heuristic is clear, as 6% more of the picture intensity and 10% more of

TABLE 3d

                  Letters      Letters      Blocks       Blocks
                  (Fig. 8b)    (Fig. 8d)    (Fig. 4b)    (Fig. 4d)
Start level       1            2            1            2
Max. tree depth   4            4            5            5
No. of nodes      19           35           21           40


TABLE 6.1
Neighbor Quadrant Utility for Polyhedron

                                        No. of nodes   % picture        % object
                                        in tree        intensity kept   area lost
Without neighbor quadrants (Fig. 1c)    23             93               4
With neighbor quadrants (Fig. 1b)       35             99               0

the object area were saved by checking neighbors. Figures 4b and 4d used identical parameter values except that 4d started the discrimination function at level 2 of the tree instead of level 1. Starting at a lower tree depth forces the program to look at finer areas; consequently, a more exact description is obtained, at the cost of checking and keeping a considerably larger tree. Table 6.2 supports this conclusion. Figure 4e has an enlarged "not sure" zone and consequently the search is forced much deeper into the tree in order to find "informative" areas (max depth = 7 instead of only 5 for Fig. 4b). Figure 4f illustrates the increase of the e parameter from the other figures and the slight effect it also has on increasing the maximum search depth in the tree (max depth = 6). Figures 5-7 show the quadrants and their properties, C, that define the three regions in the picture.

Figure 8a shows the digitized picture of the alphabetic letters B and O. Figures 8b and 8c demonstrate graphically the need for a neighbor quadrant search. Figure 8b is the extracted picture of 8a with neighbor quadrants, and the letters B and O have been saved with no information lost. Figure 8c is without neighbor quadrants, and 18% of the intensity associated with the two letters has been deleted, resulting in what now appears as the letters E and C. Table 6.3 further compares experimental statistics associated with the neighbor quadrants parameter. Figure 8d starts the discrimination function at level 2 and hence produces a much "fuller" tree (35 nodes to 19 for Fig. 8b) while not gaining much in terms of the picture compaction ratio (0.16 to 0.20 for Fig. 8b). Being more selective in deciding what is "informative" by increasing w2 to 0.50, as we have done in Fig. 8e, produces a remarkably good picture compaction ratio of 0.03 while not discarding any of the informative picture elements, but quadrupling the tree size (87 nodes to 19 for Fig. 8b).
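The parameter effects discussed above all flow through a three-way decision made at each quadrant. The exact discrimination function is defined earlier in the paper; the sketch below is only a hedged reconstruction of its qualitative behavior, where `m` (a quadrant's normalized intensity measure) and the branch labels are our assumptions rather than the authors' code.

```python
def classify(m, w1, w2):
    """Three-way discrimination on a quadrant's normalized intensity m.

    Hypothetical reconstruction: below w1 the quadrant is judged
    noninformative and deleted; above w2 it is kept as a leaf; the
    "not sure" zone between w1 and w2 forces subdivision into four
    subquadrants.  Widening w2 - w1 therefore drives the search deeper
    into the tree, as Tables 3a and 3c report.
    """
    if m < w1:
        return "delete"
    if m > w2:
        return "leaf"
    return "subdivide"
```

With w1 = 0.10 and w2 = 0.25, a quadrant at m = 0.30 is kept as a leaf; raising w2 to 0.50 (as in Figs. 4e and 8e) pushes the same quadrant into the "not sure" zone and forces another level of partitioning.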
Figure 8f shows the effect of making it relatively easy to discard information by letting w1 = 0.20, while making the e parameter a relatively "loose" 0.10 to pick up neighbor quadrants. The results are almost

TABLE 6.2
Effect of Unconditional Partitioning to a Level for Blocks Image

Starting level   No. of nodes   % picture    % object    Picture
                                area kept    area lost   compaction ratio
1 (Fig. 4b)      21             38           0           0.28
2 (Fig. 4d)      40             22           0           0.13


TABLE 6.3
Neighbor Quadrant Utility for Letters Image

                                        Maximum      No. of   % picture    % object
                                        tree depth   nodes    area kept    area lost
Without neighbor quadrants (Fig. 8c)    2            7        25           18
With neighbor quadrants (Fig. 8b)       4            19       30           0

identical to those for Fig. 8b in structure (21 nodes, 12 leaves in Fig. 8f and 19 nodes, 12 leaves in Fig. 8b) and in success (0.18 picture compaction ratio in Fig. 8f and 0.20 in Fig. 8b). Table 6.4 compares the sensitivity of the tree size with storage reduction and object retention for varying threshold values.

7. DISCUSSION OF RESULTS

The feasibility and efficiency of building picture tree structures using regular decomposition by quadrants has been shown. The running time of the algorithm is determined by the complexity of the scene rather than by the picture size. Consequently, regular decomposition is most practical for applications involving sparse picture information. This includes such large domains as character recognition, chromosome analysis, time-series analysis, and line drawings. Because the algorithm is quite general, it can be used to obtain a convenient data structure for a wide range of different pattern recognition and scene analysis tasks and picture types. Numerical parameters and a discrimination function must be specified, and these can be tuned to the class of pictures under consideration.

The initial motivation for adding neighbor quadrant search was to "soften" the regular decomposition boundary lines. The added search generally improves algorithm performance (though at a considerable cost in increased computing time). For example, in the letters image of Table 6.4, varying parameters had little or no effect on object retention: the use of neighbor quadrant search balances out significant degradation from wide variation of threshold values. Thus, neighbor search could enable use of a single set of parameter values to preprocess a data set.

TABLE 6.4
Sensitivity of Threshold Values and Their Effects on Storage Reduction and Object Retention for Letters

Threshold values      Maximum      No. of   % storage   % object    Picture      Corresponding
w1     w2     e       tree depth   nodes    reduction   area lost   compact.     figure
                                                                    ratio
0.10   0.25   0.05    4            19       70          0           0.20         8b
0.10   0.50   0.05    5            87       86          0           0.03         8e
0.20   0.40   0.10    5            21       72          4           0.18         8f
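The tree-size bound and the compaction ratio used throughout these tables follow directly from the definitions in the text; a minimal sketch (the function names are ours):

```python
def max_nodes(depth):
    """Number of nodes in a complete Q-tree of the given maximum depth:
    4**0 + 4**1 + ... + 4**depth (col. 9 of Table 3)."""
    return sum(4 ** i for i in range(depth + 1))

def compaction_ratio(object_points_lost, noninformative_points_kept, total_points):
    """Picture compaction ratio: informative points lost plus noninformative
    points kept, divided by the total number of points in the original image.
    0.0 is a perfect extraction; 1.0 is the worst possible structure."""
    return (object_points_lost + noninformative_points_kept) / total_points
```

For example, max_nodes(4) reproduces the 341-node bound for the depth-4 letters trees, and max_nodes(7) the 21845-node bound for the depth-7 blocks tree of Fig. 4e.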


The experimental results reveal the following parameter sensitivities (these are supported by condensed tables derived from Table 3).

1. If w2 is increased from 0.25 to 0.50, then the maximum tree depth searched is increased substantially: 25%-40%. (Intuitively, this is as expected, since increasing w2 means more selectivity in choosing "informative" quadrants.) (See Table 3a.)

2. Including neighbor quadrants in the extracted picture structure decreased the percentage of object area lost (the effect was a range of from 4% to 18%); increased the tree size (range, 40%-170%); and increased overall storage requirements only slightly (from 0% to 5%). (See Table 3b.)

3. Increasing w2 - w1 from 0.15 to 0.40 resulted in an increase in the maximum depth of the tree of from 25% to 40% and an increase in the number of nodes in the picture structure of from 100% to 300%. (This is intuitively appealing, since we are widening the "not sure" zone in the discrimination function, forcing more quadrants to be subdivided.) (See Table 3c.)

4. Unconditional partitioning to one lower tree level increases the tree size 80%-100%. (See Table 3d.)

8. CONCLUSIONS AND FUTURE WORK

The program discussed in this paper used a top-down recursive partitioning of picture area into successively finer quadrants to obtain tree structures for several types of images. The trees contain information on such key global properties as symmetry, shape, and orientation of constituent objects or patterns; experiments were done to merge areas, using the tree to locate the objects. Quantitative results were obtained from alphabetic letters, blocks in a scene, and the simple structures of a polyhedron. We showed that segmentation errors caused by regular decomposition can be overcome in all cases by a relatively simple algorithm which uses the same type of concept. Neighbor quadrants were defined and, at the cost of increased processing time, the percentage of picture intensity kept after decomposition improved under the neighbor algorithm. While much needs to be done to develop a working system for image processing from these concepts, the work presented here makes it likely that such a development should take place. Several refinements to be presented elsewhere should continue the process of building a theoretical basis for a practical image processing system. Future work includes examination of tree structures to determine whether translation and rotation of an object can be overcome by processing the data obtained by regular decomposition.

APPENDIX: SAMPLE PROGRAM RUNS

ACKNOWLEDGMENTS

This research was sponsored by the Air Force Office of Scientific Research, Air Force Systems Command, USAF, under Grant No. AFOSR-72-2384. The United States Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation hereon. The authors express their appreciation for this support.


[Line-printer picture not reproduced.]
FIG. 1a. Digitized picture of a polyhedron.


[Line-printer picture not reproduced.]
FIG. 1b. Retained polyhedron: neighbor algorithm. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 99%; percentage of picture area remaining = 53%.

[Line-printer picture not reproduced.]
FIG. 1c. Retained polyhedron picture. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 1, neighbors = no. Percentage of picture intensity remaining after preprocessing = 93%; percentage of picture area remaining = 50%.

[Line-printer listing of the region's quadrants and their property vectors C (intensity, X min, X max, Y min, Y max) not reproduced.]
FIG. 2. Region description after regular decomposition of Fig. 1b. (Note: blank areas have been eliminated from the picture structure.) Percentage of picture intensity remaining after preprocessing = 99%; percentage of picture area remaining = 53%.

[Line-printer trace of quadrant decisions (CANDIDATE, DELETED, LEAF, NEIGHBORS) not reproduced.]
FIG. 3. Trace of regular decomposition for Fig. 1b.

[Line-printer picture not reproduced.]
FIG. 4a. Digitized picture of a blocks world scene.

!)4

I~LIN(,I,,h. AN1) I)YI~II,

FIG. 4b. Retained blocks scene: neighbor algorithm. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 38%.
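The retained-block pictures of Figs. 4b–4f come from a thresholded regular decomposition controlled by the caption parameters. The sketch below is one assumed reading of those parameters, not the paper's exact tests: a quadrant whose share of its parent's intensity falls below w1 - e is discarded, one above w2 + e is retained whole, and anything in between is subdivided and re-tested; `level` sets how many subdivisions happen unconditionally before testing begins. The function and variable names are ours.

```python
import numpy as np

def decompose(img, w1=0.10, w2=0.25, eps=0.05, level=1):
    """Thresholded regular decomposition into quadrants (a sketch).

    Assumed parameter semantics (see lead-in): discard a quadrant whose
    share of its parent's intensity is below w1 - eps, retain it whole
    above w2 + eps, subdivide otherwise.  `img` must be square with a
    power-of-two side.  Returns a boolean mask of the retained area.
    """
    keep = np.zeros(img.shape, dtype=bool)

    def visit(r, c, size, depth):
        block = img[r:r + size, c:c + size]
        total = block.sum()
        if size == 1 or total == 0:
            # Base case: keep individual nonzero cells.
            keep[r:r + size, c:c + size] = block > 0
            return
        half = size // 2
        for qr, qc in ((r, c), (r, c + half), (r + half, c), (r + half, c + half)):
            share = img[qr:qr + half, qc:qc + half].sum() / total
            if depth < level:                 # starting level: subdivide unconditionally
                visit(qr, qc, half, depth + 1)
            elif share < w1 - eps:            # negligible share: discard quadrant
                continue
            elif share > w2 + eps:            # dominant share: retain quadrant whole
                keep[qr:qr + half, qc:qc + half] = True
            else:                             # ambiguous: subdivide and re-test
                visit(qr, qc, half, depth + 1)

    visit(0, 0, img.shape[0], 0)
    return keep
```

Under this reading, raising `level` (Fig. 4d) forces finer initial blocks, and raising the thresholds (Figs. 4e, 4f) discards more marginal quadrants, which matches the smaller area-remaining percentages quoted in those captions.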

FIG. 4c. Retained blocks scene: original algorithm. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 1, neighbors = no. Percentage of picture intensity remaining after preprocessing = 94%; percentage of picture area remaining = 36%.


FIG. 4d. Retained blocks scene: different starting level. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 2, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 22%.

FIG. 4e. Retained blocks scene: threshold 2 raised. Parameters: w1 = 0.10, w2 = 0.50, e = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 32%.


FIG. 4f. Retained blocks scene: both thresholds raised. Parameters: w1 = 0.20, w2 = 0.40, e = 0.10, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 33%.
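Each caption above reports the same two summary statistics: the fraction of total picture intensity retained and the fraction of picture area retained. Given an intensity array and a boolean retention mask (our names, not the paper's), both follow directly:

```python
import numpy as np

def retention_stats(img, keep):
    """Summary statistics quoted in the figure captions: fraction of
    total intensity retained and fraction of picture area retained."""
    intensity_remaining = img[keep].sum() / img.sum()
    area_remaining = keep.mean()
    return intensity_remaining, area_remaining
```

For example, a mask that keeps one cell holding 4 of 5 total intensity units in a 2x2 picture yields 80% intensity remaining and 25% area remaining.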

FIGS. 5, 6, 7. Regions found in Fig. 4b.

FIG. 5. Region 1 description after regular decomposition of Fig. 4b.
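Figures 5–7 show the retained blocks of Fig. 4b separated into individual regions. A simplified pixel-level analogue of that grouping step is connected-component labeling of the retention mask; the paper groups retained quadrants, but the idea is the same. Names below are ours.

```python
from collections import deque

import numpy as np

def regions(keep):
    """Group retained cells of a boolean mask into 4-connected regions.

    Returns an integer label array (0 = not retained) and the number
    of regions found, via breadth-first flood fill.
    """
    labels = np.zeros(keep.shape, dtype=int)
    next_label = 0
    for r, c in zip(*np.nonzero(keep)):
        if labels[r, c]:
            continue                      # already assigned to a region
        next_label += 1
        labels[r, c] = next_label
        q = deque([(r, c)])
        while q:
            y, x = q.popleft()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < keep.shape[0] and 0 <= nx < keep.shape[1]
                        and keep[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    q.append((ny, nx))
    return labels, next_label
```

Applied to the mask of Fig. 4b, each nonzero label would correspond to one of the region pictures in Figs. 5–7.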


FIG. 6. Region 2 description after regular decomposition of Fig. 4b.

FIG. 7. Region 3 description after regular decomposition of Fig. 4b.

FIG. 8a. Digitized picture of alphabetic characters B and O.

FIG. 8b. Retained alphabetic characters: neighbor algorithm. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 30%.

FIG. 8c. Retained alphabetic characters: original algorithm. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 1, neighbors = no. Percentage of picture intensity remaining after preprocessing = 82%; percentage of picture area remaining = 25%.

FIG. 8d. Retained alphabetic characters: different starting level. Parameters: w1 = 0.10, w2 = 0.25, e = 0.05, level = 2, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 27%.

FIG. 8e. Retained alphabetic characters: threshold 2 raised. Parameters: w1 = 0.10, w2 = 0.50, e = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 14%.

FIG. 8f. Retained alphabetic characters: both thresholds raised. Parameters: w1 = 0.20, w2 = 0.40, e = 0.10, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 96%; percentage of picture area remaining = 28%.


REFERENCES

1. A. Klinger, Data structures and pattern recognition, in Proceedings of the First International Joint Conference on Pattern Recognition, Washington, D.C., 73CH0 821-9C, IEEE, New York, 1973.
2. A. Klinger, Pattern recognition programs with level adaptation, in Proceedings of the 1973 IEEE Conference on Decision and Control, San Diego, 73CH0 806-0SMC, IEEE, New York, December 1973.
3. J. E. Warnock, A hidden surface algorithm for computer generated halftone pictures, Computer Science Department, University of Utah, TR 4-15, June 1969.
4. I. E. Sutherland, R. F. Sproull, and R. A. Schumacker, A characterization of ten hidden-surface algorithms, ACM Computing Surveys 6, No. 1, March 1974.
5. W. M. Newman and R. F. Sproull, Principles of Interactive Computer Graphics, McGraw-Hill, New York, 1973.
6. C. M. Eastman, Representations for space planning, Comm. ACM 13, No. 4, April 1970.
7. C. A. Rosen and N. J. Nilsson, Application of intelligent automata to reconnaissance, SRI Project 5953, December 1967.
8. A. Rosenfeld, Picture Processing by Computer, Academic Press, New York, 1969.
9. A. Rosenfeld, Non-purposive perception in computer vision, Computer Science Center, University of Maryland, TR-219, 1973.
10. A. Rosenfeld, Adjacency in digital pictures, Computer Science Center, University of Maryland, TR-203, October 1972.
11. A. Rosenfeld, Figure extraction, in Automatic Interpretation and Classification of Images (A. Grasselli, Ed.), Academic Press, New York, 1969.
12. K. S. Fu and B. K. Bhargava, Tree systems for syntactic pattern recognition, IEEE Trans. Computers C-22, No. 12, December 1973.
13. R. D. Merrill, Representation of contours and regions for efficient computer search, Comm. ACM 15, No. 2, February 1973.
14. A. Rosenfeld, Progress in picture processing: 1969-1971, ACM Computing Surveys 5, No. 2, June 1973.
15. C. M. Eastman and C. I. Yessios, An efficient algorithm for finding the union, intersection and differences of spatial domains, Department of Computer Science, Carnegie-Mellon University, September 1972.
16. C. M. Eastman, Heuristic algorithms for automated space planning, in Proceedings of the Second Joint International Conference on Artificial Intelligence, Imperial College, London, 1971.
17. U. Montanari, Networks of constraints: Fundamental properties and applications to picture processing, Department of Computer Science, Carnegie-Mellon University, January 1971.
18. H. Y. F. Feng and T. Pavlidis, Analysis of complex shapes in terms of simpler ones: Feature generation for syntactic pattern recognition, Department of Electrical Engineering, Princeton University, TR-149, April 1974.
19. H. Y. F. Feng and T. Pavlidis, The generation of polygonal outlines of objects from gray level pictures, Department of Electrical Engineering, Princeton University, TR-150, April 1974.
20. R. A. Kirsch, Resynthesis of biological images from tree-structured decomposition data, in Graphic Languages (F. Nake and A. Rosenfeld, Eds.), North-Holland, Amsterdam, 1972.
21. J. Freeman, The modelling of spatial relations, Computer Science Center, University of Maryland, TR-281, December 1973.
22. O. Firschein and M. A. Fischler, Describing and abstracting pictorial structures, Pattern Recognition 3, No. 4, November 1971.
23. O. Firschein and M. A. Fischler, A study in descriptive representation of pictorial data, Pattern Recognition 4, No. 4, December 1972.
24. T. Pavlidis, Analysis of set patterns, Pattern Recognition 1, November 1968.
25. T. Pavlidis, Structural pattern recognition: Primitives and juxtaposition relations, in Frontiers of Pattern Recognition (S. Watanabe, Ed.), Academic Press, New York, 1972.


26. A. Rosenfeld, Connectivity in digital pictures, J. Assoc. Comput. Mach. 17, No. 1, January 1970.
27. J. P. Mylopoulos and T. Pavlidis, On the topological properties of quantized spaces. I. The notion of dimension. II. Connectivity and order of connectivity, J. Assoc. Comput. Mach. 18, No. 2, April 1971.
28. R. L. Gregory, Eye and Brain, McGraw-Hill, New York, 1973.
29. R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, Wiley, New York, 1973.
30. L. G. Roberts, Machine perception of three dimensional solids, in Optical and Electro-Optical Information Processing (Tippett, Ed.), MIT Press, Cambridge, Mass., 1965.
31. A. Guzman, Computer recognition of three dimensional objects in a visual scene, Department of Electrical Engineering, Massachusetts Institute of Technology, MAC-TR-59, 1968.
32. G. Falk, Computer interpretation of imperfect line data as in a three dimensional scene, Department of Computer Science, Stanford University, AIM 139, 1970.
33. M. Minsky and S. Papert, Project MAC Progress Report IV, MIT Press, Cambridge, Mass., 1967.
34. C. R. Brice and C. L. Fennema, Scene analysis using regions, Artificial Intelligence J. 1, No. 3, 1970.
35. Y. Yakimovsky, Scene analysis using a semantic base for region growing, Department of Computer Science, Stanford University, AIM 209, 1973.
36. L. D. Harmon, The recognition of faces, Scientific American 229, No. 5, November 1973.
37. D. E. Knuth, The Art of Computer Programming: Fundamental Algorithms, Vol. 1, Addison-Wesley, Menlo Park, Calif., 1973.
38. A. Klinger, Patterns and search statistics, in Optimizing Methods in Statistics (J. S. Rustagi, Ed.), Academic Press, New York, 1971.
39. M. Rhodes, private communication, 1974.
40. A. Klinger, A. Kochman, and N. Alexandridis, Computer analysis of chromosome patterns: Feature encoding for flexible decision making, IEEE Trans. Computers C-20, No. 9, September 1971.
41. J. Omolayole, private communication, 1974.
42. C. Dyer, private communication, 1974.
43. A. Klinger, Regular decomposition and picture structure, in Proceedings of the 1974 International Conference on Systems, Man, and Cybernetics, Dallas, Texas, 74CH0 908-4SMC, IEEE, New York, 1974.
44. A. Klinger and C. R. Dyer, Experiments on picture representation using regular decomposition, Computer Science Department, University of California, Los Angeles, UCLA-ENG-7494, December 1974.