Pattern Recognition
Pergamon Press 1973. Vol. 5, pp. 303-321. Printed in Great Britain
Machine Recognition of Printed Chinese Characters Via Transformation Algorithms

PAUL P. WANG and ROBERT C. SHIAU

Department of Electrical Engineering, School of Engineering, Duke University, Durham, N. Carolina 27706, U.S.A.

Abstract--This paper presents some novel results concerning the recognition of single-font printed Chinese characters via the Fourier, Hadamard and Rapid transformation algorithms. The new design philosophy of a three-stage structure is believed to offer at least a suboptimal search strategy for recognizing printed Chinese characters drawn from a dictionary of 7000-8000 characters. The transformation algorithms discussed in this paper are used in the last two stages. Extensive experiments and simulations concerning feature extraction and noisy or abnormal pattern recognition have been carried out (the simulations have been restricted to a 63-character subset called "radicals"). All three transforms are compared according to their ability to recognize characters.

Chinese ideographs    Pattern recognition    Fourier transform    Hadamard transform
Optical character reader    Classification    Rapid transform    Topological property
I. INTRODUCTION

The urgent demand for machine translation of Chinese is easily recognized and has been well documented.(1) The Indo-Chinese group of nations, with a population of about 850 million, currently publishes about 3 billion words a year; considerably less than 1 per cent of this vast output is being translated and republished in English, French or German.(2) Automatic translation is necessary because human translators cannot handle the volume and because it is extremely expensive to train highly skilled translators. It is impossible within the scope of this paper to discuss the difficulties involved in building a successful automatic translation machine. However, the value of utilizing a digital computer to handle Chinese characters is clear even for applications with only a limited objective. Previous attempts have demonstrated that some systems already developed can be used to perform various tasks.(3-11) For instance, the accumulated research efforts in Japan concerning similar problems have been neatly summarized in a special issue of the Journal of FUJITSU(8) which dealt with the matter of KANJI information processing. But it is accurate to say that progress in KANJI research does not necessarily carry over to the problem of Chinese ideographs.

The utilization of digital computers in processing Chinese characters is not merely a problem of hardware system design. There are some very fundamental questions which must be answered before an optimal and successful system can be designed. For example, there is the problem of the analysis of the characters themselves in terms of structure, grammar(12-17) or topological properties.(9) It is rather surprising that even today there exists no good method of forming a lexicographical ordering of Chinese ideographs for a complete dictionary. This problem is closely related to the problem of

* Supported by Duke University Research Council Regular Grant 453-3209-6032-22302.
encoding and decoding Chinese characters, and some efforts are already being undertaken in this direction.(9,18-22) So far, there are two quite distinct directions of research concerning the recognition of Chinese characters by digital computer. STALLINGS has used a computer to analyze and describe Chinese characters;(21,22) the result of his research was a code for each character. CASEY and NAGY(24) take a somewhat different approach: they propose a two-stage masking process in which a "group mask" is used in the first stage and an "individual mask" in the second stage to recognize single-font printed ideographs. This paper proposes a novel three-stage recognition process in which the first stage is similar to, but entirely independent of, the approach of STALLINGS, and in which the second and third stages solve the same problem posed by CASEY and NAGY,(24) but by a distinctly different approach. The details of WANG's three-stage recognition scheme, which is believed to offer at least suboptimality in terms of the necessary hardware implementation, search strategy and other considerations, can be found in Reference (9). This problem can and should be viewed as a multi-category application in which optimality of the search strategy becomes of paramount importance. BLEDSOE raises in his review(25) an extremely relevant question concerning the design philosophy of a system of this nature: "... Is the selection of two stages in recognition the best choice? Why not three or fifty? ..."

The 63 subpatterns of Chinese characters chosen as our subject of investigation represent 63 standard radicals located on the left side of the character. If one adds the principal part of the ideograph on the right, the whole character is complete.
The subset comprising the "left-side radicals" can be identified if the character successfully passes the "completely vertical separability test" in the first stage of recognition. Hence, we are selecting a subset of the second stage for our experiment. We believe that a subset of 63 radicals (the average number of members in a second-stage subset) is sufficient to bring out the essential nature of this multiple-category problem.

II. RECOGNITION SYSTEM MODEL

A brief description of a general pattern-recognition system is shown in Fig. 1. It contains the following sub-systems:
FIG. 1. Basic diagram of a recognition machine.
(A) Receptor. The receptor accepts the physical character and transduces it into a measurable matrix. For example, the IBM 1287 or 1288(26) divides a visual pattern into small elements and produces an M x N matrix over the binary field; an element becomes 1 or 0 depending upon whether it is black or white. Physically, the machine can be programmed to scan at 0.125-in. intervals over a rectangular area of 4.25 in. x 6 in.; hence a matrix of maximum dimension 34 x 48 can be achieved. Since this machine was not available for our research project, all 63 radicals, taken from a standard Chinese dictionary,(27) have been carefully transduced by hand; some sample radicals (27 of the 63) are presented in Fig. 2. Each radical, contained within a 32 x 16 matrix, represents a noiseless pattern.

(B) Preprocessor. The preprocessor is also called a "feature extractor." Conventional transformation techniques, such as the two-dimensional Fourier transform, the Hadamard transform and the Rapid transform, have been used with success in various circumstances in recognizing handwritten English letters or Arabic numerals; we shall use these techniques in recognizing Chinese characters. In a sense, the preprocessor extracts a secondary pattern, or feature vector, from the measurement vector; we hope to obtain such a feature vector by using one of these transformations. The real question is how to extract a subset of "good features" from the larger set of 512 elements (32 x 16). We propose that the criterion of "goodness" be judged by a component's amplitude distribution and its frequency of appearance. As it turns out, this criterion of selection works well for the recognition of printed Chinese characters. It is generally true that increasing the number of features will improve the rate of recognition; however, it is also true that the complexity and size of the classifier will then increase.
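As an illustrative sketch of the preprocessor idea (this is not the authors' FORTRAN-IV program; it assumes NumPy and a made-up toy pattern), the two-dimensional Fourier magnitude spectrum of a binary character matrix can serve as a feature vector that is insensitive to position shifts, since the phase is discarded:

```python
import numpy as np

def fourier_magnitude_features(pattern):
    """Return the 2-D Fourier magnitude spectrum of a binary pattern.

    Keeping the magnitude and discarding the phase makes the feature
    vector insensitive to (circular) position shifts of the character.
    """
    spectrum = np.fft.fft2(pattern)   # 2-D DFT of the 32 x 16 array
    return np.abs(spectrum)           # magnitude only; phase dropped

# A toy 32 x 16 "radical": a hollow rectangle of black (1) cells.
pattern = np.zeros((32, 16), dtype=int)
pattern[4:28, 4:12] = 1
pattern[6:26, 6:10] = 0

mag = fourier_magnitude_features(pattern)
shifted = np.roll(pattern, shift=2, axis=1)   # same radical, shifted right
mag_shifted = fourier_magnitude_features(shifted)

# The magnitude spectra agree despite the shift.
print(np.allclose(mag, mag_shifted))          # True
```

Note that `np.roll` performs a circular shift, for which the magnitude spectrum is exactly invariant; for a shift that moves ink off the edge of the matrix the invariance is only approximate.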
Optimality, of course, is normally achieved by taking these trade-offs into consideration. As a result of our findings, the best scheme is to choose features according to the ordering of their "standard deviations." The standard deviation reflects, to a large degree, the width and uniformity of the frequency distribution of a potential feature candidate. We shall describe all of this in detail in a later section, including a comparison of the three transformations.

(C) Classifier. For a general pattern-recognition system, the classifier discriminates each pattern and assigns a category to it by some decision rule. The "minimum-distance-to-mean" norm is a well-known and popular classifier(28) because it minimizes a conditional average loss for probabilistic patterns. The distance function, or norm, is

d(X, M) = ||X - M||^2 = (X - M)^t (X - M) = Σ_{j=1}^{N} (x_j - m_j)^2    (1)
where X, N x 1, is the feature vector and M, also N x 1, is the mean feature vector (or prototype feature vector; the transformed standard mask in our case) for the same category. However, the above norm is not entirely satisfactory. SEBESTYEN(30) modifies it by multiplying each feature component by a weighting coefficient, thereby improving the clustering:

d(X, M_i) = Σ_{j=1}^{N} [w_j (x_j - m_ij)]^2,    i = 1, 2, ..., M.    (2)
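A minimal sketch of the weighted minimum-distance-to-mean rule of equations (1) and (2) follows (the prototype vectors and sample here are hypothetical, not taken from the paper's data):

```python
def classify(x, masks, weights=None):
    """Weighted minimum-distance-to-mean rule of equations (1)-(2).

    x       : feature vector [x_1, ..., x_N]
    masks   : list of mean feature vectors M_i (transformed standard masks)
    weights : optional [w_1, ..., w_N], e.g. w_j = 1/sigma_j as in the text;
              if omitted, the rule reduces to the plain norm of equation (1)
    """
    if weights is None:
        weights = [1.0] * len(x)

    def distance(mask):
        # d(X, M_i) = sum_j [w_j (x_j - m_ij)]^2
        return sum((w * (xj - mj)) ** 2
                   for w, xj, mj in zip(weights, x, mask))

    return min(range(len(masks)), key=lambda i: distance(masks[i]))

# Two hypothetical category prototypes in a 4-dimensional feature space.
masks = [[0.0, 0.0, 1.0, 1.0],
         [5.0, 5.0, 0.0, 0.0]]
x = [4.5, 5.2, 0.1, 0.3]        # a noisy sample near the second prototype
print(classify(x, masks))       # 1
```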
FIG. 2. The subset of 63 radicals used in the computer simulation experiments.
For example, we may choose w_j to be inversely proportional to the standard deviation of the ensemble along the jth feature coordinate, i.e. w_j = 1/σ_j.

(D) Memory. The memory unit learns the a priori knowledge of each category before the machine can be used to make any decision. Asymptotically, the sample mean turns out to be the "best" statistic if the number of samples is large enough. For our experiments, simulated on an IBM 360/75, the sample mean of each pattern is assumed to be the same as the radicals duplicated from the Chinese dictionary.

III. TRANSFORMATION ALGORITHMS

We shall discuss in this section only those characteristics of the algorithms which are relevant to our recognition system. Well-known facts about these transforms will not be duplicated here.

(A) Two-dimensional Fourier transform. The Fourier transform has a long history of application in pattern-recognition and image-processing problems. The general procedure is to examine the magnitude spectrum of the measurement vector and then choose as features, in an n-dimensional Euclidean space, the frequency components of strong amplitude. One of the most attractive properties of the transform is its ability to recognize position-shifted patterns, since it observes the magnitude spectrum and ignores the phase. It is well recognized that the precision of center-location is a problem for the scanner, and it is anticipated that it will also be a problem in identifying printed Chinese characters.

(B) Hadamard transform. This technique should be more suitable for high-speed processing, since the arithmetic involves only addition and subtraction. The major drawback of this technique in pattern recognition is that its performance depends too heavily upon the position of the pattern.

(C) Rapid transform.
The operations of the Rapid transform are the same as those of the Hadamard transform except for the absolute-value operation, which may be credited with eliminating the position-shifting problem mentioned above.(29) The R-transform possesses the advantages of both the F-transform and the H-transform, but it also has disadvantages when applied to the problem of recognizing printed Chinese characters. Assuming that the pattern is represented as a one-dimensional N x 1 vector, the R-transform requires log2 N layers, as many as the F-transform (Fig. 3). However, the processing of each layer requires only N x 4 additions, as opposed to the F-transform's N x 4 multiplications (equivalent to 400 N additions). Hence the ratio of computer time is approximately 100/log2 N. For 2^1 to 2^10 components, it is therefore possible to save a factor of 100 down to 10 in computation cost by employing the R-transform. The R-transform also possesses the very attractive property of being invariant with respect to slight rotations of the pattern.

IV. COMPUTER SIMULATIONS AND PERFORMANCE EVALUATIONS

The subset of 63 printed Chinese radicals (Fig. 2) is quantized by punching 0s and 1s on data cards. The programs, written in FORTRAN-IV, encode each radical in a 32 x 16 binary array and are then executed on the IBM 360/75 computer. Assuming that all the
FIG. 3. Tree-graph of Rapid transform for 4 input variables.
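A pure-Python sketch of this butterfly follows; the Reitboeck-Brody formulation is assumed here, and the exact pairing in the authors' Fig. 3 may differ in detail:

```python
def rapid_transform(x):
    """Rapid transform of a length-2^n sequence (Reitboeck-Brody form).

    Each of the log2(N) layers applies the Hadamard sum/difference
    butterfly and then takes absolute values; the absolute value is
    what makes the result invariant to circular shifts of the input.
    """
    x = list(x)
    n = len(x)
    assert n > 0 and n & (n - 1) == 0, "length must be a power of two"
    for _ in range(n.bit_length() - 1):        # log2(N) layers
        half = n // 2
        y = [0] * n
        for i in range(half):
            y[2 * i] = abs(x[i] + x[i + half])      # sum branch
            y[2 * i + 1] = abs(x[i] - x[i + half])  # difference branch
        x = y
    return x

print(rapid_transform([1, 2, 3, 4]))   # [10, 2, 4, 0]
# Shift invariance: a circular shift leaves the transform unchanged.
print(rapid_transform([4, 1, 2, 3]))   # [10, 2, 4, 0]
```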
possible decision errors are weighted equally, the overall performance of the recognition system is appraised by the percentage of correct recognitions. The procedure for the major computer-simulation experiments can be described as follows:
Step I. The average characteristics of each radical are obtained through a learning process.
Step II. The simulated realistic pattern is generated by adding noise (produced by a two-dimensional random-noise generator) to the "average radical" obtained from Step I.
Step III. Each scanned radical is transformed into the frequency domain (a feature space with the same dimension as the original pattern space) through the F-, H- and R-transformations.
Step IV. The "good" features are selected from among the much larger set of potential candidates in the feature space.
Step V. The system classifies each pattern, using the minimum-distance-to-mean criterion.
Step VI. The percentage of correct recognitions is evaluated.

Following is a presentation of some specific details of the experiment itself. In Step II we generated realistic radical patterns such as the one shown in Fig. 4. Assuming that Mi is the learned average character from Step I (the prototype pattern of the ith radical), and that N is the additive noise (a function of the two-dimensional coordinates, obtained through a random-number generator), then the simulated realistic character is
FIG. 4. Generating a realistic radical pattern (panels: Prototype Pattern, Gaussian Noise, Realistic Pattern).
Mi ⊕ N (the binary operator ⊕ will be defined below). Physically, the noise represents an improper printing process or a situation in which machine noise in the recognition system distorts the pattern. We further assume that the noise for each cell is a Gaussian-distributed continuous gray level, independent of the two-dimensional coordinates. The Gaussian-distributed noise N(0, σ²) is quantized by a "well" function, as shown in Fig. 5. If the noise for a specific cell (i, j) has a gray level greater than "1," or "black," it is quantized into "1." In other words, every cell is eventually classified as either "1" or "0." We can conclude, therefore, that the binary operation ⊕ mentioned above implies: 0 ⊕ 0 = 0, 0 ⊕ 1 = 1, 1 ⊕ 0 = 1, and 1 ⊕ 1 = 1.

We chose three transformation algorithms for Step III and were curious as to what each algorithm could do for this problem. The technique of the F-transform is well known and has been widely applied; the FFT subroutine devised by Cooley and Tukey was used in the simulation. The computer programs were written* for the IBM 360/75, based upon the existing algorithms of the Hadamard transform(31) and the Rapid transform.(29) We show in Fig. 6 a sample of only the R-transform printout, as the other two transforms assume a similar format.

In Step IV, the task was to reduce the dimensionality of the feature space. The criterion of feature selection has been described briefly in Section II. The histogram of the first 9 features obtained via the Rapid transform, with a sample size of 63, is presented in Fig. 7. Because of the symmetry of the data set in the frequency domain, we have narrowed the choice of features to 128 (16 x 8). The standard deviations for the F-, H- and R-transforms are presented in Tables 1-3 respectively. Eight "good features,"

* As this presentation is rather abbreviated, the interested reader may write to Dr. Wang at Duke University for further information.
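The noisy-pattern generation of Step II can be sketched as follows. This is an illustrative reconstruction, not the authors' program: the black threshold of 1.0 is an assumption, the "well" function is reduced to a single one-sided threshold as the text describes it (Fig. 5 draws a symmetric well), and the 2 x 4 prototype is a toy example.

```python
import random

def noisy_pattern(prototype, sigma, threshold=1.0, seed=0):
    """Simulate a realistic radical: prototype (+) quantized Gaussian noise.

    Each cell receives independent Gaussian noise N(0, sigma^2); a gray
    level above the black threshold is quantized to 1.  The combining
    operator (+) is a logical OR (0+0=0, 0+1=1, 1+0=1, 1+1=1), so noise
    can only add black cells, never erase them, matching the definition
    in the text.
    """
    rng = random.Random(seed)
    return [[1 if cell == 1 or rng.gauss(0.0, sigma) > threshold else 0
             for cell in row]
            for row in prototype]

prototype = [[0, 1, 1, 0],
             [0, 1, 1, 0]]
noisy = noisy_pattern(prototype, sigma=0.36)
# Every black cell of the prototype survives in the noisy pattern.
assert all(n >= p for prow, nrow in zip(prototype, noisy)
           for p, n in zip(prow, nrow))
```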
FIG. 5. Quantizing the continuous Gaussian gray level into two levels.
selected according to the ordering of the numerical values and the uniformity of the distribution in the histograms, are circled in Tables 1-3. This criterion is certainly not the only one that we have explored. The performance of several others did not meet our expectations, but we do want to mention specifically two methods which are believed to be at least principally sound in comparison with the "standard deviation scheme."
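The standard-deviation ranking can be sketched as below; the three-component ensemble is hypothetical toy data, not the paper's 63 x 128 measurements:

```python
from statistics import pstdev

def select_features(transformed_samples, k=8):
    """Rank transform components by their standard deviation over the ensemble.

    transformed_samples : list of transformed patterns (one list per radical,
                          e.g. 63 rows of 128 retained components each).
    Returns the indices of the k components with the largest standard
    deviation -- the "good features" circled in Tables 1-3.
    """
    n_components = len(transformed_samples[0])
    sigma = [pstdev(sample[j] for sample in transformed_samples)
             for j in range(n_components)]
    ranked = sorted(range(n_components), key=lambda j: sigma[j], reverse=True)
    return ranked[:k]

# Toy ensemble of 3 components: component 2 spreads widely, component 0 barely.
samples = [[5.0,  0.0,  10.0],
           [5.1,  1.0, -10.0],
           [4.9, -1.0,   0.0]]
print(select_features(samples, k=2))   # [2, 1]
```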
FIG. 6. An example of the computer print-out for the Rapid transformation.
FIG. 7. Histogram of the first 9 features, R(0,0) through R(2,2), obtained via the Rapid transform (frequency vs. the numerical values of the features).
The "standard deviation scheme" may be classified as one member of the family of techniques called "principal component analysis," because the decision is based on fewer but more relevant features. A quite distinct method, designated the "least correlation scheme," was suggested to the authors by Professor Woodbury. Referring to Table 3, one observes that the R(0, 0) component yields the maximum σ; hence R(0, 0) is chosen as the first feature for the classifier. The second feature is then selected according to the least coefficient of correlation between R(0, 0) and the other 127 candidates; the computed results of Table 4 indicate that R(10, 6) is the component least correlated with R(0, 0). The third feature is chosen according to the following criterion:

Third Feature = min_i { |r_1i| + |r_2i| - |r_1i r_2i| }    (3)

where the data for r_1i and r_2i are available from Tables 4 and 5. For this problem, the answer is R(1, 5). This method was believed to be the best among the known techniques; surprisingly, however, the performance of the recognition system fell considerably (see Fig. 8). As a result, another heuristic scheme, believed to have an expected performance somewhere
TABLE 1. STANDARD DEVIATIONS OF 128 POTENTIAL FEATURES VIA F-TRANSFORMATION

F      0      1      2      3      4      5      6      7
0-1    (garbled in the scan: 6.?  ?.?3  .24  .43  3.80  2.30)
2      6.53   5.40   5.85   5.67   5.74   3.92   2.55   2.00
3      4.84   4.95   5.15   5.00   3.98   2.96   2.37   1.67
:      :      :      :      :      :      :      :      :
15
TABLE 2. STANDARD DEVIATIONS OF 128 POTENTIAL FEATURES VIA H-TRANSFORMATION

H      0      1      2      3      4      5      6      7
0      360    :      :      :      :      :      :      :
1      ?      3.76   3.03   3.14   3.43   4.26   4.87   3.83
2      ?      2.80   4.61   5.33   4.4    4.63   6.25   3.75
3      13.43  2.60   5.76   4.53   4.91   4.90   7.14   3.66
:      :      :      :      :      :      :      :      :
15     11.45
TABLE 3. STANDARD DEVIATIONS OF 128 POTENTIAL FEATURES VIA R-TRANSFORMATION

R      0      1      2      3      4      5      6      7
0      ?      3.19   ?      6.08   ?      3.72   ?      5.64
1      ?      1.44   2.09   1.55   3.11   1.28   1.83   1.17
2      ?      1.91   3.82   2.15   3.05   1.63   2.32   1.35
3      ?      1.16   1.22   2.10   1.00   1.41   0.90   ?
4      ?      2.19   3.40   1.97   4.46   1.70   2.46   1.45
5      5.67   1.08   1.65   1.03   1.80   1.16   1.23   0.82
6      5.54   1.55   1.76   1.12   2.18   1.16   1.51   1.06
7      4.86   0.86   1.17   0.69   1.40   0.83   0.88   0.58
8-9    (scrambled in the scan: 2.37  4.54  2.16  6.09  2.22  3.71  1.70  1.56  1.14  2.21  1.05  1.53  0.86  8.52  1.11  5.46  1.44)
:      :      :      :      :      :      :      :      :
15
TABLE 4. COEFFICIENTS OF CORRELATION BETWEEN R(0, 0) AND THE OTHER HARMONICS

R      0        1        2        3        4        5        6        7
0      1.0000  -0.3975   0.3297  -0.0882   0.2910  -0.1517   0.1313   0.0325
1      0.6065  -0.1339   0.1417   :
2      0.4819   0.0762   0.0068  -0.1617   :
3      0.4171   0.1134  -0.1020  -0.0301   :
4      0.4117  -0.3617   0.0171   0.3374   :
5      0.3095  -0.0634   0.2700  -0.1677   0.1822  -0.0158   :
6      0.2024  -0.0353   0.1087  -0.1413  -0.0219  -0.2024  -0.1674  -0.1204
7      0.1901  -0.0467   0.2640   :
8      0.4003  -0.0610   0.0848  -0.0817  -0.0087  -0.2524   0.1563  -0.0317
9      0.4579   0.2871   0.0964   0.0348   0.1488   0.0188   0.0821  -0.1501
10     0.2180  -0.0535   0.0801  -0.0866  -0.1366  -0.2303   :
11     0.2413  -0.0724   0.0985  -0.0945   0.0481  -0.1400   :
:      (entries displaced in the scan: 0.0137  0.0305  0.4402  -0.0778  0.0478  -0.0265  0.1034  -0.1456  0.0110  -0.0265  0.1119  -0.0564  0.1975  0.1076  0.1500  -0.1022  0.1718  0.0608  0.0510  -0.2762  0.1393  -0.0746)
15
TABLE 5. COEFFICIENTS OF CORRELATION BETWEEN R(10, 6) AND THE OTHER HARMONICS

R      0        1        2        3        4        5        6        7
0     -0.0049   0.0123   0.1218   0.0948  -0.2539   0.?048  -0.1562   0.0147
1     -0.0754   0.1075   0.2311   0.0818  -0.1458   0.0071  -0.1161   0.0388
2     -0.2353   0.1536  -0.0655  -0.0718  -0.0797   0.1102  -0.1119  -0.1312
3     -0.2584   0.2787   0.0189   0.0072  -0.1387   0.1817  -0.0035  -0.0921
4      0.1071  -0.0350   0.0483   0.1207  -0.0222   0.0470  -0.0900   0.2842
5     -0.0815  -0.3156   0.1881   0.1723   0.0073   0.0268   0.2038   0.1412
6     -0.0917   0.0541   0.0216  -0.21?9   0.1558  -0.0167   0.2519   0.1986
7     -0.1093  -0.1281  -0.0207  -0.1063   0.1683   0.0681  -0.1408  -0.0754
8      0.1534   0.1093   0.2005   0.2398   0.2032  -0.0342   0.4092   0.1295
9      0.0640   0.2180   0.0844  -0.0158  -0.0673   0.2551   0.0868   0.0193
10    -0.0071   0.2030   0.2169   0.0241  -0.0033   0.1138   1.0000   0.1728
11     0.1264  -0.038?  -0.3174  -0.1264   0.1406   0.1217   0.3680   0.1526
12    -0.0293   0.0315   0.0095   0.1564   0.1342  -0.0008   ?        ?
13    -0.0704   0.0180   0.0307  -0.0802   0.2171   0.1062  -0.0921  -0.0045
14     0.1518   0.0511   0.2888   0.0995   0.1253  -0.0535   0.2542   0.0097
15     0.0544  -0.0080   0.0919   0.0278  -0.2248   0.0566   0.2379   0.0301
FIG. 8. Comparison of the recognition rate with the number of features (curves: Standard Deviation Method, Innovation Method and Correlation Method; recognition rate vs. 1-9 features).
between the two methods mentioned above, was tried out by the authors. In short, we defined the "innovation of information" contained in the ith feature as

e_i = σ_i (1 - |r_1i|)    (4)

where r_1i is the coefficient of correlation between R(0, 0) and the ith harmonic. Hence, the second feature is decided by Max_i {e_i}, and the third feature is determined by

Max_i { e_i = σ_i (1 - |r_1i| - |r_2i| + |r_1i r_2i|) }.
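The innovation scheme can be sketched as below; the four-component standard deviations and correlation matrix are hypothetical illustration data, not values from Tables 3-5:

```python
def pick_by_innovation(sigma, corr):
    """Select the 2nd and 3rd features by the "innovation of information".

    sigma : list of standard deviations sigma_i of the candidate components
    corr  : correlation matrix, corr[a][b] = r between components a and b
    Feature 1 is the max-sigma component; feature 2 maximizes
    e_i = sigma_i (1 - |r_1i|) of equation (4); feature 3 maximizes
    e_i = sigma_i (1 - |r_1i| - |r_2i| + |r_1i r_2i|).
    """
    n = len(sigma)
    f1 = max(range(n), key=lambda i: sigma[i])
    r1 = [abs(corr[f1][i]) for i in range(n)]
    e2 = [sigma[i] * (1 - r1[i]) if i != f1 else float("-inf")
          for i in range(n)]
    f2 = max(range(n), key=lambda i: e2[i])
    r2 = [abs(corr[f2][i]) for i in range(n)]
    e3 = [sigma[i] * (1 - r1[i] - r2[i] + r1[i] * r2[i])
          if i not in (f1, f2) else float("-inf") for i in range(n)]
    f3 = max(range(n), key=lambda i: e3[i])
    return f1, f2, f3

# Hypothetical data: 4 components with made-up spreads and correlations.
sigma = [4.0, 1.0, 2.0, 3.0]
corr = [[1.0, 0.9, 0.1, 0.5],
        [0.9, 1.0, 0.2, 0.4],
        [0.1, 0.2, 1.0, 0.3],
        [0.5, 0.4, 0.3, 1.0]]
print(pick_by_innovation(sigma, corr))   # (0, 2, 3)
```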
Our findings indicate that the performance of the "innovation scheme" falls between the other two curves shown in Fig. 8. We return now to the major simulation program, Step V. The distance function between the testing feature vector X = [x_1, x_2, ..., x_8] and the ith standard mask M_i = [m_i1, m_i2, ..., m_i8] is programmed by normalizing the jth feature coordinate with σ_j. A portion (21 radicals) of the distances computed for the 63-character subset is presented in Table 6. As expected, this tabulation is constructed with standard masks against the noisy patterns.
TABLE 6. DISTANCE MATRIX BETWEEN THE NOISY AND THE NOISELESS RADICALS
The important simulation results are shown in Figs. 9-14; they are all plotted as the percentage of correct recognitions vs. the number of features. The standard deviation σ has been used as a running parameter throughout these figures. It is important to note here that the noise level σ does not necessarily stand for the net noise added to the deterministic character pattern. Assuming that the pattern X is a sample of a multivariate, normally distributed density with mean vector M and covariance matrix σ_p·I, the "noise level σ" is defined as σ = (σ_p⁻¹ + σ_n⁻¹)⁻¹, where the additive noise N(0, σ_n·I) is also
Machine recognition of printed Chinese characters via transformation algorithms
317
TABLE 7. COMPARISON OF THREE KINDS OF TRANSFORMS

Properties                               F.F.T.        H.T.        R.T.
normal pattern recognition               second        the best    third
shifted pattern                          independent   dependent   independent
rotated pattern                          dependent     dependent   independent
ability to combat noise                  second        the best    third
learning time (sec.) per character       15            0.8         0.08
recognition time (sec.) per character    1.5           1.0         0.1
Gaussian distributed. The performance of the recognition machine via the F-, H- and R-transforms is shown in Figs. 9, 10 and 11, respectively. We may conclude that the most influential factor is the noise level σ. With a very low noise level (σ = 0.26), three features are more than enough to do a nearly perfect job for all transforms, while the F-transform requires only two. Under normal operation, the performance of the H-transform is the best (Fig. 12), and it has the best ability to combat noise (as indicated in Figs. 9-11 when σ is very large). However, this is not necessarily true under quasi-abnormal or unusual situations such as shifted patterns or shifted and rotated patterns. In the case of shifted patterns, the H-transform is nearly useless, whereas the performance of the other two transforms remains essentially intact, as indicated in Fig. 13. If one uses the testing
[Figure omitted: percentage of correct recognition vs. number of features, with noise level σ as running parameter.]
FIG. 9. Performance of the classifier via F-transform.
318
PAUL P. WANG and ROBERT C. SHIAU
[Figure omitted: percentage of correct recognition vs. number of features, with noise level σ as running parameter.]
FIG. 10. Performance of the classifier via H-transform.
patterns which are both shifted and rotated through a small angle, only the R-transform survives. Hence, one may conclude from the curves shown in Fig. 14 that the R-transform is the best algorithm to use in an uncertain situation. The above discussions are summarized in Table 7 for the purpose of comparison.
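The shift invariance that lets the R-transform survive these tests can be demonstrated directly. Below is a minimal one-dimensional sketch of the rapid transform of Reitboeck and Brody: the fast Walsh-Hadamard butterfly with an absolute value taken after every stage, which makes the output invariant under any cyclic shift of the input. The function name and test vector are illustrative, not taken from the paper.

```python
def rapid_transform(x):
    """R-transform of a sequence whose length is a power of two."""
    x = [float(v) for v in x]
    n = len(x)
    seg = n
    while seg > 1:
        half = seg // 2
        for start in range(0, n, seg):
            for i in range(start, start + half):
                a, b = x[i], x[i + half]
                # Walsh-Hadamard butterfly, rectified by the absolute value
                x[i], x[i + half] = abs(a + b), abs(a - b)
        seg = half
    return x

pattern = [3, 1, 4, 1, 5, 9, 2, 6]
reference = rapid_transform(pattern)
# every cyclic shift of the input yields the identical transform
shifted_ok = all(
    rapid_transform(pattern[k:] + pattern[:k]) == reference
    for k in range(len(pattern))
)
```

The invariance covers cyclic shifts only; for two-dimensional character patterns the transform would be applied along rows and columns, and rotation tolerance (as in Fig. 14) is an empirical, approximate property rather than an exact one.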
[Figure omitted: percentage of correct recognition vs. number of features, with noise level σ as running parameter.]
FIG. 11. Performance of the classifier via R-transform.
[Figure omitted: percentage of correct recognition vs. number of features.]
FIG. 12. Comparison of three methods with noisy character patterns.
[Figure omitted: percentage of correct recognition vs. number of features; curves labelled RT, FFT and HT.]
FIG. 13. Comparison of three methods with shifted patterns.

[Figure omitted: percentage of correct recognition vs. number of features.]
FIG. 14. Comparison of three methods with shifted and rotated patterns.

V. CONCLUDING REMARKS

This paper intends to answer some of the questions concerning the design of a recognition machine capable of handling a dictionary of 7000-8000 different printed Chinese ideographs. These questions, however, will not be completely settled until the system becomes a commercial reality. The task is by no means an easy one. To ease the already mounting doubts about a commercially profitable system, we call to the attention of interested workers in the field the importance of having some kind of standardized font or, perhaps, two or three standard fonts at most; these must be universally adopted if Chinese characters are to be read by the digital computer, or if the even more ambitious goal of an entirely automatic recognition and machine-translation system is to be realized. The selection of the small subset of 63 radical characters used in this experiment is primarily based on the proposed three-stage hierarchy structure, with the understanding that the second and third stages can be implemented if the experiments in this paper turn out to be successful. The key to achieving recognition is the transformation algorithms of Fourier, Hadamard, and Rapid. The feature extractor compresses the useful information into a compact set, and we have discovered that when the size of the feature set is small enough, the system should present no difficulty in hardware implementation. We also disclose that the R-transform is perhaps the most attractive technique among the three, not only because of its ability to handle shifted and rotated patterns, but also because it requires minimal time to process the algorithm. Under normal conditions, however, the R-transform is the least efficient of the three. There are still many questions in this general area of research which call for answers, some of which are extremely fundamental in nature. For example, finding the optimal scheme for dividing a complete Chinese dictionary into distinct subgroups of characters which share some common characteristics is also an optimal multiple-category searching problem.
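The subgroup-division problem mentioned above is what the proposed three-stage structure is meant to exploit. The fragment below is a purely illustrative sketch (all names and feature vectors are invented, not the paper's dictionary): a coarse search over radical prototypes narrows the candidate set before a fine search within the chosen subgroup, so the full 7000-8000 character dictionary is never scanned linearly.

```python
def nearest(feature, prototypes):
    """Return the key whose prototype vector is closest in squared Euclidean distance."""
    def dist(p):
        return sum((a - b) ** 2 for a, b in zip(feature, p))
    return min(prototypes, key=lambda k: dist(prototypes[k]))

# coarse stage: radical prototypes (feature vectors from some transform)
radicals = {"R1": [1.0, 0.0], "R2": [0.0, 1.0]}
# fine stage: per-radical character subgroups
subgroups = {
    "R1": {"char_a": [1.0, 0.1], "char_b": [0.9, -0.2]},
    "R2": {"char_c": [0.1, 1.0], "char_d": [-0.1, 0.8]},
}

def recognize(feature):
    radical = nearest(feature, radicals)          # search over ~63 radicals
    return nearest(feature, subgroups[radical])   # search within one subgroup

print(recognize([0.95, 0.05]))  # prints 'char_a'
```

The design choice is the usual hierarchical-search trade-off: a wrong decision at the coarse stage cannot be recovered at the fine stage, which is why the radical classifier must be the most reliable component.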
Acknowledgements--The authors wish to thank Professor MAX WOODBURY for suggesting the "least correlated features extraction scheme," and Mrs. JULIA WANG for her many hours of effort in preparing the samples of printed Chinese characters.