Journal of Manufacturing Systems Volume 12/No. 4
An Evaluation of ART1 Neural Models for GT Part Family and Machine Cell Forming
T. Warren Liao and LianJiang Chen, Louisiana State University, Baton Rouge, LA
Abstract
This paper describes ART1 neural models for GT part family and machine cell forming. An ART1 neural model was first implemented in C and tested with examples taken from the literature. The ART1 model was then integrated with a feature-based design system for automatic GT coding and part family forming. It was finally incorporated into a three-stage procedure for designing cellular manufacturing systems. Our evaluation concludes that ART1, when compared with nonlearning algorithms, is well suited for GT applications because of its fast processing speed, fault tolerance, learning ability, and ease of classifying new parts.
Keywords: Neural Network, ART1 Neural Model, GT Part Family Forming, GT Machine Cell Forming, Cellular Manufacturing System Design
Introduction
In a manufacturing environment where a wide variety of medium-volume products are manufactured, cellular manufacturing based on the group technology (GT) concept has been recognized to be the best approach.1 To implement cellular manufacturing, parts and machines must first be classified into groups. Three general methods for grouping parts and machines into families are visual inspection, classification and coding, and production flow analysis.1 The visual inspection method is less expensive; however, it is also less accurate. Several computer-aided classification and coding systems such as DCLASS have been developed. These systems usually have two major drawbacks--inflexible coding schemes and inconsistent results. Efforts have been made to develop automatic classification and coding systems for forming part families to remedy these problems.2-5 Numerous algorithms have been proposed for the production flow analysis method.6-17 One common drawback of these algorithms, however, is that they do not have learning capabilities. This limitation requires rerunning the algorithm with the whole set of part flow data (old and new) to classify any new part into part families. This study looks into the pros and cons of using an unsupervised, self-learning neural system, particularly the ART1 neural model, for GT part family and machine cell forming. ART1 stands for the binary adaptive resonance theory, which was first introduced by Carpenter and Grossberg in 1986.18-23 Unlike the nonlearning algorithms,6-17 the ART1 neural models are able to learn and store learned patterns according to their learning disciplines. To classify a new part into part families, only the input pattern for the new part needs to be processed. Depending upon whether the new input pattern is close to or far from the previously stored patterns, the ART1 system will match it to one of the previously stored patterns or store it as a distinct new pattern.
Neural Networks
Neural networks, also called artificial neural systems, parallel distributed processing systems, etc., were developed to model how the human brain processes information. A general and rigorous definition of neural networks has been proposed in Reference 24. In general terms, a neural network is a parallel, distributed information processing structure consisting of a large number of simple linear or nonlinear neurons (which can possess a local memory and carry out localized information processing operations) as the basic processing elements. They are
interconnected with unidirectional signal channels (called connections) into multilevel networks. Each neuron has a single output, which branches into as many collateral connections as desired; each branch carries the same signal, the neuron output signal. This signal can be of any mathematical type desired. The processing that takes place within each neuron must be completely local: it must depend only upon the current values of the input signals arriving at the neuron through impinging connections and upon the values stored in the neuron's local memory. Generally, a neural network has an input layer to receive data from the outside world and an output layer to send information to users or external devices. Layers that lie between the input and output layers are called hidden layers and have no direct contact with the environment. Neural networks may or may not have hidden layers. To date, many kinds of neural network architectures have been developed, including the ART models,18,23 Hopfield models,25-27 Back-Propagation models,28 and Kohonen's models.29
The topology of a generic neuron is shown in Figure 1. The input signals come from the environment, outputs from lower layers, or feedback signals from higher layers. These input signals form an input vector X = (x_0, ..., x_i, ..., x_{N-1}), where x_i is the activity level of the ith input. Associated with each connected pair of neurons is an adjustable value called a connection strength or weight w_i, representing the connection strength from neuron x_i to neuron y. All w_i form a weight vector W = (w_0, ..., w_i, ..., w_{N-1}). If an internal threshold value must be exceeded for activation to occur, an additional parameter Θ must be subtracted from the net input to the neuron. Mathematically, the output value y can be calculated as

y = f( Σ_{i=0}^{N-1} x_i w_i - Θ )    (1)
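As a minimal sketch, Eq. (1) with a hard-limiter threshold function can be computed as follows; the function and variable names are ours, not the paper's.

```c
/* Eq. (1): y = f( sum_{i=0}^{N-1} x_i * w_i - theta ).
   Here f is taken to be the hard-limiter threshold function of Figure 1;
   neuron_output and its signature are illustrative only. */
double neuron_output(const double *x, const double *w, int n, double theta)
{
    double net = 0.0;
    for (int i = 0; i < n; i++)
        net += x[i] * w[i];                  /* weighted sum of the inputs */
    return (net - theta > 0.0) ? 1.0 : 0.0;  /* hard limiter */
}
```

For example, with x = (1, 0, 1), w = (0.5, 0.2, 0.4), and Θ = 0.6, the net input is 0.9 and the neuron fires (y = 1).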
Figure 1: One Neuron and Three Representative Threshold Functions (hard limiter, threshold logic, and sigmoid)
The threshold function f() can take on several forms such as the ramp, step, or sigmoid function, as shown in Figure 1. There are some obvious advantages of the neural network architecture over conventional algorithms. First, its parallel processing structure greatly increases the speed of computation and solves many problems that traditional methods cannot. Secondly, it has strong fault tolerance and associative ability: a certain amount of disturbance, distortion, or damage to its input data will not affect convergence to the stable output, although several more steps may be necessary. Furthermore, it is able to adjust its internal connections automatically according to the input data and therefore possesses a strong learning ability. When used as a classifier, it places no specific limitations on the input data; it is thus suitable for classifying input data that are nonlinearly, non-Gaussian distributed.
This article evaluates the performance of ART1 neural models for GT part family and machine cell forming and compares them with other conventional methods. The ART1 model was chosen for this application because it is self-organizing and does not require external supervision. The simplicity with which ART1 models new patterns was also a factor--all the knowledge learned is kept in the bottom-up and top-down connection weights. Therefore, the network can directly learn new patterns, modifying and keeping the weights automatically without access to previous training patterns.

ART1 Models and Implemented Algorithm
Two key properties of any neural network used as a learning system are plasticity and stability. Plasticity is the ability of the network to learn something new from external data (input patterns) and adjust its internal connections. The ability of the network to retain its memory of previously stored patterns is called stability. These
two properties are built nicely into ART models: the ART1 model can learn new facts gradually without quickly discarding previously learned knowledge. The main idea of the ART1 model22-23 is as follows.
Step 1: Initialize the network.
Step 2: Input a new pattern.
Step 3: If it is the first pattern recognized, store it as the first group in the network and go to Step 2. Else, go to Step 4.
Step 4: Choose the active group stored in the network that is most similar to the input pattern. If all stored groups have been made inactive, store the input pattern as a new group and go to Step 2. Else, go to Step 5.
Step 5: Test the input pattern to see whether it matches the selected group. If it matches, the input pattern is categorized into the selected group; correspondingly, the selected group is modified by incorporating the input pattern. Then go to Step 2. Else, make the selected group inactive and go to Step 4.
The detailed procedure of the implemented algorithm, given in the following steps, is a modification of Lippmann's interpretation of ART1 models.21 The corresponding topology of the ART1 model is shown in Figure 2, and a step-by-step computer implementation of the algorithm is given in Reference 30.
Step 1: Initialize the neural system by setting:
(a) the input pattern index, k = 1;
(b) the vigilance threshold ρ0, 0 ≤ ρ0 ≤ 1;
(c) the top-down connection weights, t_ij(k) = 1;
(d) the bottom-up connection weights, b_ij(k) = 1/(1+n), where 1 ≤ i ≤ n, 1 ≤ j ≤ m, n is the total number of features of each input pattern, and m is the maximum number of groups that can be formed. The vigilance threshold ρ0 determines how close an input pattern must be to a stored group in order to match it.
Step 2: Apply a new input a^k = {a_1^k, ..., a_i^k, ..., a_n^k}, the vector of features for pattern k, where a_i^k ∈ {0,1}, 1 ≤ i ≤ n: if input pattern k possesses feature i, then a_i^k = 1; otherwise, a_i^k = 0.
Step 3: Compute the matching scores:
u_j = Σ_{i=1}^{n} b_ij(k) a_i^k,   1 ≤ j ≤ m    (2)
where u_j is the output of the jth node (group) of the output layer. Set Flag[j] = 0 (1 ≤ j ≤ m), marking all stored groups as active.
Step 4: Select the best matching group, j*, as the active group (Flag[j*] = 0) with the largest matching score u_j.
Step 5: Perform the vigilance test. Compute

||a^k|| = Σ_{i=1}^{n} a_i^k    (3)

||T_j* · a^k|| = Σ_{i=1}^{n} t_ij*(k) a_i^k    (4)

If ||T_j* · a^k|| / ||a^k|| > ρ0, go to Step 7; if ||T_j* · a^k|| / ||a^k|| ≤ ρ0, go to Step 6.
This step tests the input pattern to see whether it is close enough to the best matching group: if it is, go to Step 7; otherwise, go to Step 6.
Step 6: Disable the best matching group, j*: set Flag[j*] = 1 temporarily. This prohibits group j* from further participating in the matching process. Go to Step 3.
Step 7: Update the connection weights for the best matching group, j*. The input pattern is incorporated into the best matching group j* by adapting the connection weights using the following equations.
Figure 2: Topology of the ART1 Model
t_ij*(k+1) = t_ij*(k) a_i^k    (5)

b_ij*(k+1) = t_ij*(k) a_i^k / ( 0.5 + Σ_{i=1}^{n} t_ij*(k) a_i^k )    (6)

Step 8: If there are more input patterns to group, enable all the inactive groups by restoring Flag[j] = 0 (1 ≤ j ≤ m), increase the input pattern index to k = k + 1, and go to Step 2; otherwise, stop.
The number of machine cells or part families to be formed is determined by the value of the vigilance parameter: the larger the vigilance parameter, the greater the number of machine cells or part families. By varying the vigilance parameter, the ART1 model provides all the possible numbers of machine cells or part families.
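The steps above can be sketched compactly in C. This is our own reconstruction, not the authors' listed implementation (Reference 30 gives a step-by-step version); the structure name, N, M, and all identifiers are illustrative.

```c
#include <string.h>

#define N 8   /* n: features per input pattern (parts per machine row in Example 1) */
#define M 10  /* m: maximum number of groups */

typedef struct {
    double b[M][N]; /* bottom-up weights b_ij */
    int    t[M][N]; /* top-down weights t_ij (binary) */
    int    groups;  /* groups stored so far */
    double rho;     /* vigilance threshold rho_0 */
} Art1;

/* Step 1: initialize weights and vigilance. */
void art1_init(Art1 *net, double rho)
{
    net->groups = 0;
    net->rho = rho;
    for (int j = 0; j < M; j++)
        for (int i = 0; i < N; i++) {
            net->b[j][i] = 1.0 / (1.0 + N); /* b_ij = 1/(1+n) */
            net->t[j][i] = 1;               /* t_ij = 1 */
        }
}

/* Steps 2-7 for one binary input pattern a; returns the group assigned. */
int art1_learn(Art1 *net, const int a[N])
{
    int flag[M];                /* Flag[j] = 1 disables group j (Step 6) */
    memset(flag, 0, sizeof flag);

    double norm_a = 0.0;        /* ||a^k||, Eq. (3) */
    for (int i = 0; i < N; i++) norm_a += a[i];

    for (;;) {
        /* Steps 3-4: matching scores u_j, Eq. (2); pick the best active group. */
        int jstar = -1;
        double ubest = -1.0;
        for (int j = 0; j < net->groups; j++) {
            if (flag[j]) continue;
            double u = 0.0;
            for (int i = 0; i < N; i++) u += net->b[j][i] * a[i];
            if (u > ubest) { ubest = u; jstar = j; }
        }
        if (jstar < 0) {
            jstar = net->groups++;  /* all groups inactive: store a new group */
        } else {
            /* Step 5: vigilance test, ||T_j* . a^k|| / ||a^k|| vs. rho_0, Eq. (4) */
            double match = 0.0;
            for (int i = 0; i < N; i++) match += net->t[jstar][i] * a[i];
            if (norm_a > 0.0 && match / norm_a <= net->rho) {
                flag[jstar] = 1;    /* Step 6: disable j* and retry */
                continue;
            }
        }
        /* Step 7: adapt the winner's weights, Eqs. (5) and (6). */
        double s = 0.0;
        for (int i = 0; i < N; i++) s += net->t[jstar][i] * a[i];
        for (int i = 0; i < N; i++) {
            net->t[jstar][i] *= a[i];                        /* Eq. (5) */
            net->b[jstar][i] = net->t[jstar][i] / (0.5 + s); /* Eq. (6) */
        }
        return jstar;  /* Step 8 (re-enabling all groups) happens per call */
    }
}
```

Feeding the first three machine rows of Table 1 with ρ0 = 0.5 puts machines 1 and 2 into one group and machine 3 into a new one, consistent with solution 2 of Table 2.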
Table 1: Input Machine-Part Matrix for Example 1

             part:  1  2  3  4  5  6  7  8
machine  1:         1  0  1  0  0  1  0  1
machine  2:         1  0  0  0  0  1  0  0
machine  3:         0  1  0  0  1  0  0  1
machine  4:         1  0  1  0  0  1  0  0
machine  5:         0  0  0  1  0  0  1  0
machine  6:         0  1  0  0  1  0  0  1
machine  7:         0  0  0  0  1  0  0  1
machine  8:         1  0  1  0  0  1  0  0
machine  9:         0  0  0  1  0  0  1  0
machine 10:         0  1  0  0  0  0  1  0
Test Results and Discussions
The ART1 part family and machine cell forming model was implemented in C and tested with a number of examples taken from the literature. The results of two examples are presented and discussed in the following.

Example 1
The first example was taken from Reference 31. The machine-part matrix is shown in Table 1. First, the data in the machine-part matrix were entered into the ART1 system row by row to find the machine groupings. Each row is a binary vector (called a machine pattern) indicating whether each part is processed by that machine. By varying the value of vigilance, the numbers of machine groups that could possibly be formed were obtained; the results are given in Table 2. Secondly, the input data were entered column by column to find the part families that could be grouped; the results are provided in Table 3.

Table 2: Results of Machine Grouping for Example 1
Solution  Threshold Vigilance  Number of Groups  Grouped Machines
1         0.99 to 0.67         5                 (1,2) (3,6,7) (4,8) (5,9) (10)
2         0.66 to 0.50         4                 (1,2,4,8) (3,6,7) (5,9) (10)
3         ≤ 0.49               3                 (1,2,4,8) (3,6,7) (5,9,10)

Table 3: Results of Part Grouping for Example 1
Solution  Threshold Vigilance  Number of Groups  Grouped Parts
1         > 0.66               6                 --
2         0.65 to 0.40         4                 (1,3,6) (2,5) (4,7) (8)
3         ≤ 0.39               3                 (1,3,6) (2,5,8) (4,7)

By rearranging the results obtained from the previous steps, the ART1 system generates a final output matrix, as shown in
Table 4. Compared with the results reported in Reference 31, the machines classified into three and four groups by both methods are identical. It was also noted that ART1 could not classify the machines into one, two, or more than five groups. This is not an indication that the ART1 approach is inferior to the single linkage clustering method; rather, the ART1 model is better
because it is more capable of faithfully revealing the nature of the machine patterns. The fact that ART1 could not classify the machines into one, two, or more than five groups indicates that the input data should not be grouped into one, two, or more than five cells. This argument can be verified by visually inspecting the rearranged output matrix shown in Table 4. The ART1 system thus generates much more realistic results and is more reliable than the single linkage clustering method. Both methods require an evaluation module to determine the optimal number of groups. With fewer groups formed, the ART1 model also makes the evaluation task for determining the optimal number of part families or machine cells easier.

Table 4: Output Machine-Part Matrix for Example 1

             part: (1  3  6) (8) (2  5) (4  7)
machine  1:         1  1  1   1   0  0   0  0
machine  2:         1  0  1   0   0  0   0  0
machine  4:         1  1  1   0   0  0   0  0
machine  8:         1  1  1   0   0  0   0  0
machine  3:         0  0  0   1   1  1   0  0
machine  6:         0  0  0   1   1  1   0  0
machine  7:         0  0  0   1   0  1   0  0
machine 10:         0  0  0   0   1  0   0  1
machine  5:         0  0  0   0   0  0   1  1
machine  9:         0  0  0   0   0  0   1  1

Example 2
The second example, the operation-component matrix shown in Table 5, was taken from Reference 32. Entering the input data row by row, the ART1 system formed the possible operation groups shown in Table 6. When operations 1, 9, 10, and 11 were removed from the results (as done in Reference 32), the number of operation groups that could be formed is 4, and the operations classified into each individual group are exactly the same as reported in Reference 32. In classifying components for this example, operations 1, 9, and 11 were excluded from the input data before grouping to improve the results (the comparison is omitted to save space) and make the grouping task easier. They could be removed because operations 1 and 11 are required for every component, and operation 9 is required only to machine component 2. The component groups formed by the ART1 system are provided in Table 7. By rearranging the results obtained in Tables 6 and 7, the resultant output matrix (Table 8) was obtained.

Table 5: Input Operation-Component Matrix for Example 2 (operations 1-11 vs. components 1-12)

Table 6: Results of Operation Grouping for Example 2
Solution  Threshold Vigilance  Number of Groups  Grouped Operations
1         0.57 to 0.38         5                 (1,2,7,9) (3,4) (5,8) (6) (10,11)
2         ≤ 0.37               4                 (1,2,7,9) (3,4,10) (5,8) (6,11)
Table 7: Results of Component Grouping for Example 2
Solution  Threshold Vigilance  Number of Groups  Grouped Components
1         0.49 to 0.34         5                 (1,2,7,12) (3,8) (4,5) (6,10) (9,11)
2         0.33 to 0.25         4                 (1,2,7,12) (3,6,8) (4,5,10) (9,11)
3         0.24 to 0.17         3                 (1,2,7,12) (3,6,8,9,11) (4,5,10)
4         ≤ 0.16               2                 (1,2,4,5,7,10,12) (3,6,8,9,11)
Table 8: Output Operation-Component Matrix for Example 2 (operation groups vs. component groups, rearranged from Tables 6 and 7)

The output matrix generated by the ART1 system is a helpful visual aid for comprehending the number of groups that could be formed. This is made possible by the capability of the ART1 system to self-organize operations and components based on their similarities and internal relationships.

Integration of Feature-Based CAD with the ART1 Model
The ART1 model has been integrated with a feature-based CAD module for automatic GT coding and part family forming33 (Figure 3). After the user completes a part design using the design procedure and the feature library, the features to be removed for machining the part are automatically generated by the feature-based CAD module, and a binary vector is generated for the part. Mathematically, the binary code for a part is represented by an n-component vector (f_1, f_2, ..., f_n), where n is the total number of features in the pre-established machining feature library. The value of f_i is 1 if the part has feature i, and 0 if not. The ART1 model takes the binary vectors as inputs and forms part families according to the similarities of the machining features to be removed. All parts in a GT family are assigned a GT code according to a customized GT coding scheme developed from the feature library residing in the feature-based CAD module.
There are two stages involved in using the ART1 system for automatic GT coding and part family forming--development and operating. The development stage deals with the formation of initial GT part families using a collection of existing parts. The operating stage, on the other hand, classifies new parts into an existing or a new GT part family. An example given in Reference 33 demonstrates how the system can be used to design parts, form parts into families, and assign GT codes to parts. Using the feature-based CAD module (with 26 pre-established features) to design 17 parts, a set of seventeen 1 x 26 binary vectors was obtained. Table 9 shows the part-feature matrix for this example. These binary vectors were entered into the ART1 model to form initial GT part families. The part families that could possibly be formed by the ART1 model are given in Table 10. The optimal number of part families can be determined based on some company-dependent criteria; based on that result, the vigilance parameter for the ART1 model can then be found. Once the initial part families are formed, the system can easily be used to classify new parts into part families.

Figure 3: Automatic GT Coding and Part Family Forming System (the feature-based CAD system, with its procedural design module and feature library, passes a binary feature vector to the ART1-based GT coding and part family forming module, which contains the coding scheme customization and classification modules and outputs GT codes and part families)
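The binary coding step described above can be sketched as follows; the function name and feature ids are illustrative, not the paper's actual 26-feature library.

```c
#define NFEATURES 26  /* size of the pre-established machining feature library */

/* Build the n-component binary code (f_1, ..., f_n) for a part:
   f_i = 1 if the part requires machining feature i, 0 otherwise.
   Feature ids are 1-based, as in the part-feature matrix of Table 9. */
void part_to_vector(const int *feature_ids, int count, int vec[NFEATURES])
{
    for (int i = 0; i < NFEATURES; i++)
        vec[i] = 0;                    /* start with no features present */
    for (int k = 0; k < count; k++)
        vec[feature_ids[k] - 1] = 1;   /* mark each required feature */
}
```

A part requiring, say, features {1, 5, 6, 8, 11} maps to a 26-component vector with exactly those five positions set; such vectors are the inputs the ART1 model groups into part families.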
Table 9: Input Part-Feature Matrix for Part Family Forming (parts 1-17 vs. features 1-26)

Table 10: Possible Part Grouping for the Part Family Forming Problem
Solution  Threshold Vigilance  Number of Groups  Grouped Parts
1         0.79 to 0.75         7                 (1,3,13) (2,4,8) (5,7,16) (6,9,11,12) (10) (14,15) (17)
2         0.74 to 0.72         6                 (1,3,13) (2,4,8) (5,7,16) (6,9,11,12,14,15) (10) (17)
3         0.71 to 0.63         5                 (1,3,5,10,13) (2,4,8) (6,9,11,12,14,15) (7,16) (17)
4         0.62 to 0.38         3                 (1,3,5,7,10,13,16) (2,4,8) (6,9,11,12,14,15,17)
5         ≤ 0.37               2                 (1,3,5,7,10,13,16) (2,4,6,8,9,11,12,14,15,17)

Three-Stage Procedure for CMS Design
A three-stage procedure, as shown in Figure 4, has been developed for designing cellular manufacturing systems with the objectives of minimizing operating and material handling costs.34 The first stage formulates an integer programming model for determining, from all the alternatives, the part routings that minimize the operating cost. The results from the first stage form a 0/1 part-machine matrix. This binary matrix is then used by the ART1 model for forming machine cells at the second stage. Considering the operation sequence, the STORM plant layout model (which implements a modified steepest descent pairwise interchange method) finally determines the layout that minimizes the total material handling cost. An example involving nine parts and six machines was given in Reference 34 to test the procedure; it is partially shown here to illustrate how the ART1 model was incorporated into the procedure. The resultant part-machine binary matrix after executing the first stage of the procedure is given in Table 11. The machine cells that could possibly be formed by the ART1 system, along with the bottleneck parts, are provided in Table 12. The bottleneck parts are the parts that must be machined across cells. Refer to Reference 34 for the details of the procedure.

Figure 4: Three-Stage Procedure for Designing Cellular Manufacturing Systems (stage 1: an integer programming model selects part routings that minimize operating cost; stage 2: the ART1 model forms machine cells; stage 3: the STORM plant layout model arranges cells and machines so that the total material handling cost is minimized)
Table 11: Part-Machine Matrix Obtained from the First Stage of the Three-Stage Design Procedure (parts k = 1-9 vs. machines m = 1-6)

Table 12: Possible Machine Grouping for the CMS Design Problem
Vigilance  Number of Cells  Machines                  Parts                                      Bottleneck Parts
0.01       2                (1,3,4,5) (2,6)           (1,3,5,6,7,8,9) (1,2,3,4)                  1,3
0.02       3                (1,5) (2,6) (3,4)         (3,7,9) (1,2,3,4) (1,5,6,7,8)              1,3,7
0.33       4                (1) (2,6) (3,4) (5)       (9) (1,2,3,4) (1,5,6,7,8) (3,7)            1,3,7
0.49       5                (1) (2) (3,4) (5) (6)     (9) (2,4) (1,5,6,7,8) (3,7) (1,2,3)        1,3,7
Summary
In this article, we report our evaluations of ART1 neural models for GT part family and machine cell forming. We first implemented an ART1 neural model in C and tested it with examples taken from the literature. Encouraged by the initial results, we next integrated the ART1 model with a feature-based design system for automatic GT coding and part family forming. It was finally incorporated into a three-stage procedure for designing cellular manufacturing systems. Our investigations revealed that the ART1 model has several advantages over traditional GT algorithms, including computation speed, fault tolerance and associative capability, and learning ability. The selection of a proper vigilance threshold is, however, crucial: the higher the threshold, the more groups will be formed, and an optimal threshold must be found for a practical application. ART1 by itself does not indicate how many groups should properly be formed; an evaluation module is needed to find the optimal number of part families or machine cells based on some criteria. The three-stage procedure for designing cellular manufacturing systems demonstrates how the ART1 model works together with such an evaluation module.

References
1. M.P. Groover, Automation, Production Systems, and Computer-Integrated Manufacturing, Prentice-Hall, Inc., Englewood Cliffs, NJ, 1990.
2. M.R. Henderson and S. Musti, "Automated Group Technology Part Coding From a Three-Dimensional CAD Database," Journal of Engineering for Industry, Vol. 110, August 1988, pp. 278-287.
3. A.H. Bond and R. Jain, "The Formal Definition and Automatic Extraction of Group Technology Codes," Proceedings of the 1988 ASME International Computers in Engineering Conference, San Francisco, July 1988, pp. 537-542.
4. J.J. Shah and A. Bhatnagar, "Automatic Group Technology Classification From Form Feature Models," NAMRC XVI, 1988, pp. 365-370.
5. S. Kaparthi and N.C. Suresh, "A Neural Network System for Shape-Based Classification and Coding of Rotational Parts," International Journal of Production Research, Vol. 29, No. 9, pp. 1771-1784.
6. J. McAuley, "Machine Grouping for Efficient Production," The Production Engineer, February, pp. 53-57.
7. A.H. Chan and D.B. Milner, "Direct Clustering Algorithm for Group Formation in Cellular Manufacturing," Journal of Manufacturing Systems, Vol. 1, No. 1, pp. 65-74.
8. H. Seifoddini and P.M. Wolfe, "Application of the Similarity Coefficient Method in Group Technology," IIE Transactions, Vol. 18, No. 3, pp. 271-277.
9. H. Seifoddini, "Single Linkage Versus Average Linkage Clustering in Machine Cells Formation Applications," Computers and Industrial Engineering, Vol. 16, No. 3, pp. 419-426.
10. S.K. Khator and S.A. Irani, "Cell Formation in Group Technology: A New Approach," Computers and Industrial Engineering, Vol. 12, No. 2, 1987, pp. 131-142.
11. J.R. King, "Machine Component Grouping in Production Flow Analysis: An Approach Using a Rank Order Clustering Algorithm," International Journal of Production Research, Vol. 18, No. 2, pp. 1287-1304.
12. J.R. King and V. Nakornchai, "An Interactive Data-Clustering Algorithm," Flexible Manufacturing Systems: Methods and Studies, A. Kusiak, ed., Elsevier Science Publishers, North-Holland, 1986, pp. 285-290.
13. T. Vohra, D.S. Chen, J.C. Chang, and H.C. Chen, "A Network Approach to Cell Formation in Cellular Manufacturing," International Journal of Production Research, Vol. 28, No. 11, pp. 2075-2084.
14. Z. Faber and M.W. Carter, "A New Graph Theory Approach for Forming Machine Cells in Cellular Production Systems," Flexible Manufacturing Systems: Methods and Studies, A. Kusiak, ed., Elsevier Science Publishers, North-Holland, 1986, pp. 301-315.
15. W.J. Boe and C.H. Cheng, "A Close Neighbour Algorithm for Designing Cellular Manufacturing Systems," International Journal of Production Research, Vol. 29, No. 10, pp. 2097-2116.
16. R.G. Askin, S.H. Cresswell, J.B. Goldberg, and A.J. Vakharia, "A Hamiltonian Path Approach to Reordering the Part-Machine Matrix for Cellular Manufacturing," International Journal of Production Research.
17. F.F. Boctor, "A Linear Formulation of the Machine-Part Cell Formation Problem," International Journal of Production Research, Vol. 29, No. 2, pp. 343-356.
18. G. Carpenter and S. Grossberg, "Adaptive Resonance Theory: Stable Self-Organization of Neural Recognition Codes in Response to Arbitrary Lists of Input Patterns," Eighth Annual Conference of the Cognitive Science Society, Hillsdale, NJ, pp. 45-62.
19. G. Carpenter and S. Grossberg, "Absolutely Stable Learning of Recognition Codes by a Self-Organizing Neural Network," AIP Conference Proceedings 151: Neural Networks for Computing, J. Denker, ed., New York, pp. 77-85.
20. G. Carpenter and S. Grossberg, "Associative Learning, Adaptive Pattern Recognition, and Cooperative Decision Making by Neural Networks," Proceedings of the SPIE, pp. 218-247.
21. R.P. Lippmann, "An Introduction to Computing With Neural Nets," IEEE ASSP Magazine, April 1987, pp. 11-13.
22. G. Carpenter and S. Grossberg, "The ART of Adaptive Pattern Recognition by a Self-Organizing Neural Network," IEEE Computer, March 1988.
23. G. Carpenter and S. Grossberg, "A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine," Computer Vision, Graphics, and Image Processing, Vol. 37, 1987, pp. 54-115.
24. R. Hecht-Nielsen, "Applications of Counterpropagation Networks," Neural Networks, Vol. 1, pp. 131-140.
25. J.J. Hopfield, "Neural Networks and Physical Systems with Emergent Collective Computational Abilities," Proceedings of the National Academy of Sciences, April 1982, pp. 2554-2558.
26. J.J. Hopfield, "Neurons with Graded Response Have Collective Computational Properties like Those of Two-State Neurons," Proceedings of the National Academy of Sciences, May 1984, pp. 3088-3092.
27. J.J. Hopfield and D.W. Tank, "Computing with Neural Circuits: A Model," Science, August 1986, pp. 625-633.
28. D.E. Rumelhart, G.E. Hinton, and R.J. Williams, "Learning Internal Representations by Error Propagation," Parallel Distributed Processing: Explorations in the Microstructure of Cognition, D.E. Rumelhart and J.L. McClelland, eds., MIT Press, 1986.
29. T. Kohonen, Self-Organization and Associative Memory, Springer-Verlag, Berlin, 1984.
30. A. Kusiak and Y. Chung, "GT/ART: Using Neural Networks to Form Machine Cells," Manufacturing Review, Vol. 4, No. 4, December 1991, pp. 293-301.
31. I. Ham, K. Hitomi, and T. Yoshida, Group Technology, Kluwer Nijhoff Publishing, Boston, MA, 1985.
32. C.C. Gallagher and W.A. Knight, Group Technology: Production Methods in Manufacturing, John Wiley & Sons, New York, 1986.
33. T.W. Liao and K.S. Lee, "Integration of a Feature-Based CAD System and an ART1 Neural Model for GT Coding and Part Family Forming," Computers & Industrial Engineering (in press).
34. T.W. Liao, "Design of Cellular Manufacturing Systems for Minimum Operating and Total Material Handling Costs," International Journal of Production Research (in press).
Author Biographies
T. Warren Liao is an Assistant Professor of Industrial & Manufacturing Systems Engineering at Louisiana State University. His current research interests are in the areas of metal cutting (particularly grinding), CAD/CAM integration, design of cellular manufacturing systems, intelligent manufacturing, and mechatronics. Liao received his MS and PhD in industrial engineering from Lehigh University. He is a member of SME, IIE, ASME, and IEEE.
LianJiang Chen is a graduate student in the Industrial & Manufacturing Systems Engineering Department at Louisiana State University. He holds a BS in electrical engineering from Fudan University, China. His current research interests are neural networks, AI, and their applications.