Minimizing mean flowtime in a two-machine flowshop with sequence-independent setup times


Computers & Operations Research 27 (2000) 111–127

Ali Allahverdi
Department of Mechanical and Industrial Engineering, College of Engineering and Petroleum, Kuwait University, P.O. Box 5969, 13060 Safat, Kuwait

Received August 1997; received in revised form October 1998

Abstract

This paper addresses the two-machine flowshop problem to minimize mean flowtime where setup times are separated and sequence independent. Optimal solutions are obtained for two special cases. For the general case, two dominance relations are established and their effectiveness in a branch-and-bound algorithm is evaluated. It is shown that problems with up to 35 jobs can be solved optimally in a reasonable time. Moreover, for the general case, three heuristic algorithms are proposed to find an approximate solution for larger problems, and they are empirically evaluated to assess their effectiveness in finding the optimal solution. Computational results show that one of the heuristic algorithms has an overall average error of 0.7% from the optimal value and that the error is independent of the number of jobs.

Scope and purpose

A flowshop scheduling problem involves scheduling a number of jobs on a number of machines in order to optimize a given criterion. The majority of research assumes that setup times are negligible or can be combined with the processing times. However, the latter assumption is invalid in the case of sequence-dependent setup times and may lead to more idle time on some of the machines in the case of sequence-independent setup times. This illustrates the importance of considering setup times as separate from processing times. In the literature, the separate setup times problem has mainly been addressed with the criterion of makespan. Minimizing makespan is important in situations where a simultaneously received batch of jobs must be completed as soon as possible, for example, a multi-item order submitted by a single customer that needs to be delivered in the minimal possible time. However, there are other real-life situations in which each completed job is needed as soon as it is processed. In such situations, one is interested in minimizing the mean or sum of flow (completion) times of all jobs rather than minimizing makespan. This objective is particularly important in real-life situations in which reducing inventory or holding cost is of primary concern. This paper addresses a two-machine flowshop scheduling problem with separate setup times in order to minimize mean flowtime.

Keywords: Scheduling; Flowshop; Mean flowtime; Separate setup time; Heuristic

1. Introduction

Johnson [1] established an optimal solution for the classical two-machine flowshop problem when the objective is to minimize the completion time of the last job, i.e., the makespan criterion. Among the assumptions he made is that the setup time of a job is included in the processing time of the job on each machine. While this assumption may be justified for some scheduling problems, other situations call for explicit (separate) setup time consideration. In a paper bag factory, a setup time is needed for the machine to switch between types of paper bags, and the setup duration depends on the degree of similarity between consecutive batches (e.g., size and number of colors). Therefore, the setup time cannot be considered as part of the processing time, Pinedo [2]. Another application of separate setups can be found in the printing industry, where the machine cleaning (setup) time depends on the color of the order, Conway et al. [3]. Similar practical situations arise in the chemical, pharmaceutical, food processing, metal processing, and semiconductor industries [4-8].

For a separate setup time, two problem types exist. In the first, the setup time depends only on the job to be processed and is hence called sequence independent, whereas in the second, the setup time depends on both the job to be processed and the previous job and is hence called sequence dependent. If there exists some idle time on the second machine, which is usually the case, then the setup for a job on the second machine can be performed prior to the completion of that job on the first machine [9], see Fig. 1. This means that a regular performance measure may be improved by considering setup times as separate from processing times.

Fig. 1. Gantt chart for the example.

There has been extensive research on the variant of the problem Johnson introduced with the makespan objective function. Yoshida and Hitomi [10] extended Johnson's two-machine problem to the case where setup times are separated from processing times and sequence independent. Allahverdi [11] extended the work of Yoshida and Hitomi to stochastic environments. For some environments, removal times also need to be explicitly considered, where removal times include the times for activities such as dismantling the jigs, sharpening the tools, or cleaning the machine area. Sule and Huang [9] extended the problem of Yoshida and Hitomi [10] by considering setup and removal times as separate and sequence independent. The work of Sule and Huang was extended to stochastic environments by Allahverdi [12]. There are many other papers on multiple-machine flowshops with a makespan objective function and separate setup times, including Khurana and Bagga [13, 14], Rajendran and Ziegler [15], Szwarc [16], and Szwarc and Gupta [17]. A recent survey on scheduling problems involving setup times is given by Allahverdi et al. [18].

Dileepan and Sen [19] addressed the two-machine sequence-independent and separate setup time problem with the performance measure of maximum lateness. They presented dominance relations along with a lower bound to be used in a branch-and-bound algorithm. They also developed two heuristic algorithms. Allahverdi and Aldowaisan [20] extended Dileepan and Sen's problem to include removal times in addition to setup times. They established dominance relations and obtained optimal solutions for some special flowshops.

The two-machine flowshop problem was shown to be NP-hard [21] when the objective is to minimize total (mean) completion time instead of makespan, even for the case where setup times are neglected. This means that it is highly unlikely that a polynomial algorithm will be found to solve the problem. Therefore, researchers have concentrated on developing branch-and-bound or heuristic algorithms. Della Croce et al. [22] presented a recent branch-and-bound and a heuristic algorithm for the problem without setup times. Bagga and Khurana [23] addressed the two-machine sequence-independent and separate setup time problem with the mean completion time criterion. They developed one dominance relation and a lower bound for the problem. They observed a 10-50% reduction in the creation of nodes in the branch-and-bound algorithm while solving problems with the number of jobs ranging from 5 to 9. The flowtime and completion time criteria are equivalent when the jobs are ready at time zero. Srikar and Ghosh [4] modeled flowshops with sequence-dependent setup times and the mean flowtime performance criterion as a mixed integer linear program (MILP) and showed that problems with 6 jobs and machines can be solved optimally. Later, Stafford and Tseng [24] corrected the model of Srikar and Ghosh [4].

The present paper addresses the same problem as Bagga and Khurana [23]. Two more dominance relations are established for the problem. These relations, along with Bagga and Khurana's [23] relation, are used in a branch-and-bound algorithm. The optimal solutions for two special flowshops are obtained and three heuristic algorithms are presented. The efficiencies of the branch-and-bound and heuristic algorithms are empirically evaluated. The problem is formulated in Section 2, and two dominance relations are provided in Section 3. In Section 4, two optimum sequences are established for special flowshops, while three heuristic algorithms are presented in Section 5. The effectiveness of the dominance relations and heuristic algorithms is tested in Section 6, and possible extensions are discussed in Section 7.

2. Formulation

Let $t_{j,m}$ be the processing time of job $j$ ($j = 1, 2, \ldots, n$) on machine $m$ ($m = 1, 2$), $t_{[j,m]}$ the processing time of the job in position $j$ on machine $m$, $s_{j,m}$ the setup time of job $j$ on machine $m$, $s_{[j,m]}$ the setup time of the job in position $j$ on machine $m$, TFT the total flowtime, and MFT the mean flowtime.

Also, let $ST_{j,k}$ denote the sum of the setup and processing times of the jobs in positions $1, 2, \ldots, j$ on machine $k$, i.e.,

$$ST_{j,k} = \sum_{r=1}^{j} (s_{[r,k]} + t_{[r,k]}), \qquad j = 1, 2, \ldots, n \ \text{and} \ k = 1, 2.$$

By Eq. (2) of Yoshida and Hitomi [10], the completion time of the job in position $j$ ($C_j$) is given by (in our notation)

$$C_j = \max_{0 \le u \le j} \left[ \sum_{i=1}^{u} (s_{[i,1]} - s_{[i,2]} + t_{[i,1]}) - \sum_{i=1}^{u-1} t_{[i,2]} \right] + \sum_{i=1}^{j} (s_{[i,2]} + t_{[i,2]}).$$

This equation can be written as

$$C_j = \max_{0 \le u \le j} \left[ \sum_{i=1}^{u} (s_{[i,1]} + t_{[i,1]}) - \sum_{i=1}^{u-1} (s_{[i,2]} + t_{[i,2]}) - s_{[u,2]} \right] + \sum_{i=1}^{j} (s_{[i,2]} + t_{[i,2]}) = \max_{0 \le u \le j} \left[ ST_{u,1} - (ST_{u-1,2} + s_{[u,2]}) \right] + ST_{j,2}. \qquad (1)$$

Let

$$TI_j = \max\{0, d_1, d_2, \ldots, d_j\}, \qquad (2)$$

where

$$d_j = ST_{j,1} - (ST_{j-1,2} + s_{[j,2]}), \qquad j = 1, 2, \ldots, n, \qquad (3)$$

with $ST_{0,2} = 0$. Then, Eq. (1) becomes

$$C_j = ST_{j,2} + TI_j. \qquad (4)$$

Once the completion times of the jobs are known, then

$$TFT = \sum_{j=1}^{n} C_j \qquad (5)$$

and

$$MFT = (1/n) \sum_{j=1}^{n} C_j. \qquad (6)$$

Observe that minimizing TFT is equivalent to minimizing MFT. To clarify some of the concepts covered in the problem formulation, an example of three jobs is given below. Table 1 shows the setup and processing times of the three jobs on both machines.


Table 1
Job processing and setup times for the example

            Machine 1              Machine 2
Job     $s_{i,1}$   $t_{i,1}$   $s_{i,2}$   $t_{i,2}$
1          2           4           3           3
2          1           5           4           2
3          3           4           2           8

If the jobs are sequenced as 1, 2, 3, then $ST_{1,1} = 2 + 4 = 6$, $ST_{2,1} = 2 + 4 + 1 + 5 = 12$, $ST_{3,1} = 2 + 4 + 1 + 5 + 3 + 4 = 19$, $ST_{1,2} = 3 + 3 = 6$, $ST_{2,2} = 3 + 3 + 4 + 2 = 12$, and $ST_{3,2} = 3 + 3 + 4 + 2 + 2 + 8 = 22$. Also, $d_1 = ST_{1,1} - 3 = 3$, $d_2 = ST_{2,1} - (ST_{1,2} + 4) = 2$, and $d_3 = ST_{3,1} - (ST_{2,2} + 2) = 5$. Therefore, $TI_1 = \max\{0, 3\} = 3$, $TI_2 = \max\{0, 3, 2\} = 3$, and $TI_3 = \max\{0, 3, 2, 5\} = 5$. Hence, the completion times are $C_1 = 6 + 3 = 9$, $C_2 = 12 + 3 = 15$, and $C_3 = 22 + 5 = 27$, which yields a total flowtime of $TFT = C_1 + C_2 + C_3 = 51$.

3. Dominance relations and lower bounds

Dominance relations can be classified as either local or global. A local dominance relation states that job $i$ should precede job $k$ in an optimal schedule when the jobs are adjacent. For a global dominance relation, however, the jobs need not be adjacent. Therefore, global dominance relations are more general. Dominance relations (local and global) are common in the scheduling literature, e.g., see Chu [25], Daniels and Chambers [26], and Allahverdi [11].

Bagga and Khurana [23] established a global dominance relation for our problem. They showed that job $i$ should precede job $k$ in an optimal sequence if the jobs satisfy $\min\{s_{i,1} + t_{i,1} - s_{i,2},\ t_{k,2}\} \le \min\{s_{k,1} + t_{k,1} - s_{k,2},\ t_{i,2}\}$, $s_{i,1} + t_{i,1} \le s_{k,1} + t_{k,1}$, $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$, and $t_{k,2} \ge t_{i,2}$. In the following two theorems, we establish two more dominance relations.

Theorem 1. Suppose that jobs $i$ and $k$ satisfy $s_{i,1} + t_{i,1} + s_{k,2} \le s_{k,1} + t_{k,1} + s_{i,2}$, $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$, and $t_{k,2} \le t_{i,2}$. Then, in a sequence that minimizes the mean flowtime, job $i$ should precede job $k$.

Proof. See Appendix A. □

Theorem 2. In a sequence where jobs $i$ and $k$ are adjacent, in order to minimize mean flowtime job $i$ should precede job $k$ if $s_{i,1} + t_{i,1} + s_{k,2} \le s_{k,1} + t_{k,1} + s_{i,2}$, $s_{i,1} + t_{i,1} \le s_{i,2} + t_{i,2}$, and $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$.

Proof. See Appendix B. □
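For concreteness, the recursion of Eqs. (2)-(5) is straightforward to implement, and the same routine is the evaluation step that a branch-and-bound or heuristic procedure would call on candidate sequences. The following is a minimal sketch (hypothetical Python, not part of the original paper); on the data of Table 1 it reproduces the completion times 9, 15, and 27 and the total flowtime of 51 computed above.

```python
# Sketch of Eqs. (2)-(5): machine-2 completion times and total flowtime for a
# given sequence. s[m][j] and t[m][j] are the setup and processing times of
# job j on machine m (machines and jobs are 0-indexed).
def total_flowtime(sequence, s, t):
    ST1 = 0    # ST_{j,1}: cumulative setup + processing on machine 1
    ST2 = 0    # ST_{j,2}: cumulative setup + processing on machine 2
    TI = 0     # TI_j = max{0, d_1, ..., d_j}, Eq. (2)
    completion = []
    for j in sequence:
        d = (ST1 + s[0][j] + t[0][j]) - (ST2 + s[1][j])   # d_j, Eq. (3)
        TI = max(TI, d)
        ST1 += s[0][j] + t[0][j]
        ST2 += s[1][j] + t[1][j]
        completion.append(ST2 + TI)                       # C_j = ST_{j,2} + TI_j, Eq. (4)
    return completion, sum(completion)                    # TFT, Eq. (5)

# Data of Table 1: s[machine][job], t[machine][job]
s = [[2, 1, 3], [3, 4, 2]]
t = [[4, 5, 4], [3, 2, 8]]
print(total_flowtime([0, 1, 2], s, t))   # ([9, 15, 27], 51)
```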


The dominance relation given in Theorem 1 is global, while the one given in Theorem 2 is local. Since dominance relations (global or local), in general, do not yield optimal schedules, after the establishment of such relations researchers often use implicit enumeration techniques such as branch-and-bound to obtain an optimal schedule.

Bagga and Khurana [23] developed two lower bounds on TFT. The following is the same lower bound with a slightly different representation. Let $\alpha$ represent the set of jobs already assigned (assume $l$ jobs are assigned) and $\beta$ represent the set of jobs still to be assigned ($n - l$ jobs). Also let $D_{i,1} = s_{i,1} + t_{i,1}$ and $D_{[i,1]} = s_{[i,1]} + t_{[i,1]}$. For each $i \in \beta$, compute $D_{i,1}$, and let $D^{(r,1)}$ denote the $r$th smallest of these values. There is obviously no idle time on the first machine, and even if the jobs with the smallest setup and processing times are placed in the earlier positions (for positions after $l$), then for $j > l$,

$$C_j \ge \sum_{i=1}^{l} D_{[i,1]} + \sum_{r=1}^{j-l} D^{(r,1)} + t_{[j,2]}, \qquad \text{where } \sum_{r=1}^{0} (\cdot) = 0.$$

Then, a lower bound on TFT can be established as

$$TFT \ge \sum_{i=1}^{l} (n - i + 1) D_{[i,1]} + \sum_{r=1}^{n-l} (n - l - r + 1) D^{(r,1)} + \sum_{i=1}^{n} t_{i,2}.$$

Let $D_{i,2} = s_{i,2} + t_{i,2}$. Compute $D_{i,2}$ for each $i \in \beta$, and let $D^{(r,2)}$ denote the $r$th smallest of these values. Now, even if no idle time exists on the second machine after the job in position $l$ is completed and the jobs with the smallest setup and processing times are placed in the earlier positions (for positions after $l$), then for $j > l$,

$$C_j \ge C_l + \sum_{i=1}^{j-l} D^{(i,2)}, \qquad \text{where } \sum_{i=1}^{0} (\cdot) = 0.$$

Thus, another lower bound on TFT is

$$TFT \ge \sum_{i=1}^{l} (n - i + 1) [C_i - C_{i-1}] + \sum_{r=1}^{n-l} (n - l - r + 1) D^{(r,2)}, \qquad \text{where } C_0 = 0.$$

Hence,

$$TFT \ge \max\left\{ \sum_{i=1}^{l} (n - i + 1) D_{[i,1]} + \sum_{r=1}^{n-l} (n - l - r + 1) D^{(r,1)} + \sum_{i=1}^{n} t_{i,2},\ \ \sum_{i=1}^{l} (n - i + 1) [C_i - C_{i-1}] + \sum_{r=1}^{n-l} (n - l - r + 1) D^{(r,2)} \right\}.$$

Given this lower bound, the branch-and-bound algorithm can be applied in the usual way. The efficiency of the branch-and-bound algorithm using this lower bound, along with the two dominance relations developed in this paper (Theorems 1 and 2) and Bagga and Khurana's [23] relation, is evaluated in Section 6. For a given problem of $n$ jobs, in a branch-and-bound search, $(n-1)!$ branches will be eliminated if a single pair satisfies a local dominance relation, while $n!/2$ branches will be eliminated if a single pair satisfies a global dominance relation. More branches will be eliminated if more than one pair satisfies the relations. In general, dominance relations can significantly improve the efficiency of a branch-and-bound algorithm by constraining the search space, Della Croce et al. [22].
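As an illustration, the combined bound can be evaluated at a node of the search tree as in the sketch below (hypothetical Python, not from the paper). Here alpha is the ordered list of assigned jobs, C_alpha their completion times from Eq. (4), and beta the set of unassigned jobs; the jobs are assumed to be indexed 0 to n-1.

```python
# Sketch of the combined lower bound max{LB1, LB2} described above.
def lower_bound(alpha, C_alpha, beta, s, t):
    n = len(alpha) + len(beta)
    l = len(alpha)
    D1 = sorted(s[0][i] + t[0][i] for i in beta)   # rth smallest D_{i,1} over beta
    D2 = sorted(s[1][i] + t[1][i] for i in beta)   # rth smallest D_{i,2} over beta

    # Machine-1 based bound LB1 (enumerate is 0-indexed, so n - i plays the role of n - i + 1)
    lb1 = sum((n - i) * (s[0][j] + t[0][j]) for i, j in enumerate(alpha))
    lb1 += sum((n - l - r) * d for r, d in enumerate(D1))
    lb1 += sum(t[1][j] for j in range(n))          # sum of t_{i,2} over all n jobs

    # Machine-2 based bound LB2, with C_0 = 0
    lb2, prev = 0, 0
    for i, c in enumerate(C_alpha):
        lb2 += (n - i) * (c - prev)
        prev = c
    lb2 += sum((n - l - r) * d for r, d in enumerate(D2))

    return max(lb1, lb2)
```

In a branch-and-bound search, a node would be pruned whenever this value is not smaller than the best total flowtime found so far.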

4. Optimum sequences

For the generic problem, it is highly unlikely that a polynomial solution will be found, since the problem is known to be NP-hard even for the case where the setup times are ignored, Gonzalez and Sahni [21]. In what follows, it is shown that our problem is solvable for the two special cases specified in Theorems 3 and 4.

Theorem 3. If $s_{j,1} + t_{j,1} \le s_{j,2}$ for all $j$, then sequencing the jobs in nondecreasing order of $s_{j,2} + t_{j,2}$ minimizes the mean flowtime.

Proof. Since $s_{j,1} + t_{j,1} \le s_{j,2}$ for all $j$, it follows from Eqs. (2) and (3) that $TI_j = 0$. Therefore, from Eq. (4), $C_j = ST_{j,2}$. Thus,

$$TFT = \sum_{j=1}^{n} ST_{j,2} = \sum_{j=1}^{n} \sum_{i=1}^{j} (s_{[i,2]} + t_{[i,2]}) = \sum_{j=1}^{n} (n - j + 1)(s_{[j,2]} + t_{[j,2]}).$$

It is obvious from this equation that TFT is minimized if the jobs are sequenced in nondecreasing order of $s_{i,2} + t_{i,2}$. □

Observe that if the condition stated in the theorem is satisfied then there is no idle time on the second machine, and hence this machine is dominant. Therefore, it is expected that sequencing the jobs in nondecreasing order of setup plus processing times on the second machine would minimize the mean flowtime.

Smith et al. [27] defined a subcategory of the classical flowshop problem where the processing times of the jobs have the following two properties: (i) if one job has a smaller processing time than another job on any machine, then this relationship holds on all other machines; (ii) for $k = 1, 2, \ldots, m$, the machine with the $k$th smallest processing time for a particular job also has the $k$th smallest processing time for all other jobs. A flowshop with these two characteristics is called an ordered flowshop. Smith et al. discussed the practical basis for ordered flowshop problems and pointed out that in many industries the ordered property arises when the nature of the operations on each job is very similar but the jobs differ in the amount of production. Other cases are when one job requires simple operations on each unit while another job requires time-consuming operations, or when one machine performs complex operations on all jobs while another machine performs simple operations on all jobs. These characteristics were found to be applicable to the majority of surveyed plants by Panwalkar et al. [28]. A flowshop with only property (i) is called a semi-ordered flowshop, and such a flowshop is practical since the more restrictive ordered flowshop has been shown to be practical. A situation in which a semi-ordered flowshop arises is when the number of units in a job is larger than that of another job consisting of the same units. An ordered or semi-ordered flowshop where the largest processing time for every job occurs on the first machine is called type A. Similarly, if the largest processing time for every job occurs on the last machine, it is called type B. Panwalkar and Khan [29] showed that in an m-machine ordered flowshop, the shortest processing time order minimizes mean flowtime provided that setup times are included in the processing times.

This definition of ordered or semi-ordered flowshops applies when setup times are negligible or are included in the processing times. For the separate setup time case, we define ordered flowshops (similarly to the nonseparate case) as flowshops with the following properties: (i) if one job has a smaller setup plus processing time than another job on any machine, then this relationship holds on all other machines; (ii) for $k = 1, 2, \ldots, m$, the machine with the $k$th smallest setup plus processing time for a particular job also has the $k$th smallest setup plus processing time for all other jobs. We call an ordered or semi-ordered flowshop where the largest setup plus processing time for every job occurs on the first machine type A. If the largest setup plus processing time for every job occurs on the last machine, we call it type B. The following theorem shows, under certain conditions, how to obtain an optimum sequence for a two-machine semi-ordered flowshop of type B. Note that by property (i), for a two-machine semi-ordered flowshop, $s_{j,1} + t_{j,1} \le s_{h,1} + t_{h,1}$ implies $s_{j,2} + t_{j,2} \le s_{h,2} + t_{h,2}$ for all $j$ and $h$. Furthermore, when it is of type B, $s_{j,1} + t_{j,1} \le s_{j,2} + t_{j,2}$ for all $j$.

Theorem 4. Consider a flowshop such that $s_{j,1} + t_{j,1} \le s_{h,1} + t_{h,1}$ implies $s_{j,2} \ge s_{h,2}$ for all $j$ and $h$. Then, for such a two-machine semi-ordered flowshop of type B, sequencing the jobs in nondecreasing order of $s_{j,1} + t_{j,1}$ minimizes the mean flowtime.

Proof. See Appendix B. □

5. Heuristic algorithms

Three heuristic algorithms are given in this section. Heuristic HA1 assigns one job at a time, heuristic HA2 assigns two jobs at a time, and heuristic HA3 assigns three jobs at a time; short code sketches of HA1 and HA2 are given after their descriptions.

Heuristic HA1
Step 1: Let $\sigma_1 = \{$all jobs$\}$ and $\sigma_2 = \emptyset$.
Step 2: Choose the job $i \in \sigma_1$ with the minimum value of $\max\{s_{i,1} + t_{i,1},\ s_{i,2}\} + t_{i,2}$. Remove this job from $\sigma_1$ and place it in the first position of $\sigma_2$. Set $k = 2$.
Step 3: If $k = n$, go to Step 5; else choose the job $i \in \sigma_1$ with the minimum value of $\max\{ST_{k-1,1}(\sigma_2) + s_{i,1} + t_{i,1},\ C_{k-1}(\sigma_2) + s_{i,2}\} + t_{i,2}$, where $ST_{k-1,1}(\sigma_2)$ and $C_{k-1}(\sigma_2)$ are the machine-1 and machine-2 completion times of the partial sequence $\sigma_2$. Remove this job from $\sigma_1$ and place it in position $k$ of $\sigma_2$. Let $k = k + 1$.
Step 4: Go to Step 3.
Step 5: Assign the last remaining job in $\sigma_1$ to the last position of $\sigma_2$. The sequence $\sigma_2$ is the heuristic sequence.
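A minimal sketch of HA1 (hypothetical Python, not part of the original paper) is given below. The selection criterion in Steps 2 and 3 is the completion time the candidate job would have if appended next, so each iteration simply appends the unscheduled job that would finish earliest on machine 2.

```python
# Sketch of heuristic HA1: greedily append, at each position, the unscheduled
# job with the smallest machine-2 completion time.
def ha1(s, t):
    n = len(t[0])
    unscheduled = set(range(n))
    seq = []
    ST1 = 0   # machine-1 completion time of the partial sequence
    C = 0     # machine-2 completion time of the partial sequence
    while unscheduled:
        best = min(unscheduled,
                   key=lambda i: max(ST1 + s[0][i] + t[0][i], C + s[1][i]) + t[1][i])
        ST1 += s[0][best] + t[0][best]
        C = max(ST1, C + s[1][best]) + t[1][best]
        seq.append(best)
        unscheduled.remove(best)
    return seq
```

On the data of Table 1 this sketch returns the sequence 2, 1, 3 with a total flowtime of 50, whereas the sequence 1, 2, 3 used in the example gives 51.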


Heuristic HA2
Step 1: Let $\pi_1 = \{$all jobs$\}$ and $\pi_2 = \emptyset$, and set $k = 3$.
Step 2: Compare all possible ordered pairs of jobs from $\pi_1$ and choose for the first two positions of $\pi_2$ the pair for which the sum of the completion times of the two jobs is minimum. Remove the selected pair from $\pi_1$ and place it in the first two positions of $\pi_2$.
Step 3: If $k = n$, go to Step 5; if $k = n + 1$, go to Step 6; else, given that the first $k - 1$ jobs are assigned in sequence $\pi_2$, choose the next ordered pair from $\pi_1$ such that the sum of the completion times of the two new jobs in positions $k$ and $k + 1$ of sequence $\pi_2$ is minimum. Remove the selected pair from $\pi_1$ and place it in positions $k$ and $k + 1$ of $\pi_2$. Let $k = k + 2$.
Step 4: Go to Step 3.
Step 5: Assign the last remaining job in $\pi_1$ to the last position of $\pi_2$.
Step 6: The sequence $\pi_2$ is the heuristic sequence.

Heuristic HA3
This heuristic is similar to HA2 except that at Steps 2 and 3 three jobs are assigned at a time, as opposed to two jobs at a time.
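The pair-selection step of HA2 can be sketched in the same style (again hypothetical Python, not from the paper); HA3 would use ordered triples instead of pairs, with a final top-up of the last one or two jobs.

```python
# Sketch of heuristic HA2: fix jobs two at a time, choosing the ordered pair of
# unscheduled jobs that minimizes the sum of the two new completion times.
from itertools import permutations

def ha2(s, t):
    n = len(t[0])
    unscheduled = set(range(n))
    seq, ST1, C = [], 0, 0
    while len(unscheduled) >= 2:
        def pair_cost(pair):
            st1, c, total = ST1, C, 0
            for i in pair:
                st1 += s[0][i] + t[0][i]
                c = max(st1, c + s[1][i]) + t[1][i]   # completion time of the appended job
                total += c
            return total
        best = min(permutations(unscheduled, 2), key=pair_cost)
        for i in best:
            ST1 += s[0][i] + t[0][i]
            C = max(ST1, C + s[1][i]) + t[1][i]
            seq.append(i)
            unscheduled.remove(i)
    seq.extend(unscheduled)   # Step 5: the last remaining job, when n is odd
    return seq
```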

6. Computational results

Bagga and Khurana [23] evaluated only the effectiveness of the dominance relation they developed, for jobs ranging from 5 to 9. In this section, we evaluate the effectiveness of the branch-and-bound algorithm using their lower bound and dominance relation (GDR2) along with the two dominance relations developed in this paper, GDR1 (Theorem 1) and LDR (Theorem 2), for larger numbers of jobs. Using a C program on a Sun Sparc 10, the branch-and-bound algorithm is evaluated with respect to the average CPU time in seconds and the number of nodes searched. The three dominance relations are evaluated with respect to the average frequency, where the average frequency is determined by dividing the number of instances in which each relation is satisfied by the number of replicates. The selected measure of effectiveness for the heuristic algorithms is the average error from the optimum mean flowtime, where the error is defined as

Error = (Heuristic mean flowtime - Optimum mean flowtime) / Optimum mean flowtime.

This measure has been used to evaluate the effectiveness of heuristics in the literature, e.g., Gupta and Darrow [30, 31] and Gupta [32]. The average heuristic errors and the average CPU times for heuristics HA1, HA2, and HA3 are compared.

The experiment was conducted for 10, 15, 20, 25, 30, and 35 jobs. For 10, 15, and 20 jobs 50 replicates, for 25 jobs 30 replicates, for 30 jobs 20 replicates, and for 35 jobs 10 replicates were taken. The setup and processing times were randomly generated from a bivariate uniform distribution with $t_{i,j}$ from [1, 100] and $s_{i,j}$ from [0, 100k], where $k$ is the expected ratio of setup to processing time ($s_{i,j}/t_{i,j}$). The experiment was conducted for $k$ values of 0.25, 0.5, 0.75, and 1.0 (for 35 jobs, the experiment was conducted only for $k = 0.5$). The chosen uniform distribution with a large data spread is recommended by Hall and Posner [33] and has often been used by researchers, including Rajendran and Chaudhuri [34] and Szwarc and Gupta [17].
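A sketch of this instance generator is given below (hypothetical Python, not from the paper; the paper does not state whether the generated times are integer or continuous, so continuous values are assumed here).

```python
# Sketch of the experimental design: t uniform on [1, 100], s uniform on [0, 100k].
import random

def generate_instance(n, k, seed=None):
    rng = random.Random(seed)
    t = [[rng.uniform(1, 100) for _ in range(n)] for _ in range(2)]      # t[machine][job]
    s = [[rng.uniform(0, 100 * k) for _ in range(n)] for _ in range(2)]  # s[machine][job]
    return s, t
```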

Table 2
Performance evaluation of the heuristic algorithms, the branch-and-bound algorithm, and the dominance relations

No. of  No. of       k      Avg. error              Avg. CPU time (s)        Branch-and-bound                   Avg. frequency
jobs    replicates          HA1    HA2    HA3       HA1    HA2    HA3        Avg. CPU    Avg. no. of            GDR1    GDR2    LDR
                                                                             time (s)    nodes searched
10      50           0.25   0.051  0.023  0.005     0.005  0.006  0.003      0.06        201                    2.0     9.58    11.48
                     0.50   0.050  0.022  0.010     0.008  0.007  0.021      0.12        1,458                  3.16    10.32   13.56
                     0.75   0.045  0.019  0.010     0.008  0.009  0.009      0.12        1,134                  2.86    8.08    9.88
                     1.0    0.039  0.021  0.003     0.013  0.015  0.013      0.08        313                    2.82    5.08    7.6
15      50           0.25   0.061  0.019  0.009     0.009  0.021  0.050      39          756,147                3.68    23.48   27.02
                     0.50   0.046  0.014  0.010     0.008  0.010  0.052      5           98,818                 6.24    17.5    23.64
                     0.75   0.049  0.013  0.005     0.009  0.018  0.048      19          395,355                5.6     16.58   21.8
                     1.0    0.046  0.012  0.004     0.009  0.019  0.051      23          441,509                5.3     15.0    20.32
20      50           0.25   0.032  0.011  0.009     0.009  0.031  0.170      675         3,442,252              6.78    39.94   46.44
                     0.50   0.033  0.014  0.010     0.009  0.034  0.172      712         3,745,791              9.68    36.32   45.36
                     0.75   0.048  0.008  0.006     0.010  0.034  0.172      590         583,668                9.38    33.48   42.88
                     1.0    0.039  0.014  0.009     0.009  0.034  0.170      754         4,669,116              13.58   32.62   46.56
25      30           0.25   0.044  0.008  0.006     0.048  0.101  0.498      1112        38,212,242             11.14   59.94   69.38
                     0.50   0.042  0.009  0.005     0.043  0.098  0.502      1010        1,964,324              14.72   54.88   68.82
                     0.75   0.039  0.010  0.009     0.041  0.094  0.504      1140        27,112,678             16.06   50.94   65.54
                     1.0    0.047  0.008  0.006     0.048  0.101  0.501      1098        24,554,521             18.18   46.48   62.92
30      20           0.25   0.061  0.008  0.005     0.069  0.180  0.910      1309        354,049,040            14.42   95.18   108.4
                     0.50   0.052  0.011  0.010     0.065  0.186  0.912      1280        357,579,530            23.62   75.72   96.46
                     0.75   0.042  0.009  0.006     0.061  0.182  0.912      1260        156,369,036            27.38   78.56   104.0
                     1.0    0.044  0.010  0.009     0.069  0.180  0.912      1202        250,855,838            26.82   73.28   98.14
35      10           0.50   0.040  0.008  0.003     0.090  0.210  2.05       1890        894,845,092            30.1    110.3   138.7

Table 2 compares the performance of the three heuristics in terms of average error and CPU time. It is obvious that HA2 is much better than HA1, while HA3 is better than HA2, in terms of average error. The overall average errors are 0.0452, 0.0129, and 0.0070 for HA1, HA2, and HA3, respectively. It seems that the average errors of all three heuristic algorithms are independent of the choice of $k$. It also seems that the average errors of heuristics HA1 and HA3 are independent of the number of jobs, while the average error of heuristic HA2 decreases as the number of jobs increases. On the other hand, HA1 performs better than HA2, and HA2 better than HA3, where CPU time is concerned. As expected, the CPU time increases with the number of jobs for all three heuristics.

Table 2 also reports the performance of GDR1, GDR2, and LDR. The average frequency of GDR2 is roughly three times that of GDR1. Observe that it is the combination of GDR1 and GDR2 that matters, since if either GDR1 or GDR2 is satisfied for a pair of jobs, then the corresponding nodes are eliminated. It seems that the average frequency of LDR is larger than that of GDR2.

Table 2 also reports the average number of nodes searched and the average CPU time required for the branch-and-bound algorithm to find the optimal solution. In order to limit the time taken by a branch-and-bound algorithm, a threshold value is usually placed on the number of nodes generated, and the instances for which the generated nodes exceed this value are killed. For our branch-and-bound algorithm we did not set any threshold value, and hence the optimal solution was found for all the problems generated. Had a threshold value been used, the average CPU time and the average number of nodes searched would have been smaller.

7. Conclusions

In this paper the two-machine flowshop problem with sequence-independent setup times has been addressed with respect to the mean flowtime performance measure. Optimal solutions have been obtained for two special flowshops. Two new dominance relations have been established, and these relations, along with a previously established dominance relation, have been used in a branch-and-bound algorithm. Furthermore, three heuristic algorithms have been proposed and their efficiencies have been tested.

A possible extension to this problem is to consider flowshops with more than two machines and separated setup times. Another possible extension is to consider the two-machine flowshop problem with respect to the mean flowtime performance measure where setup and removal times are separated from processing times. Yet another possible extension is to extend the results of this paper to stochastic environments, i.e., environments in which setup times and/or processing times are modeled as random variables, or in which machines suffer random breakdowns. We conjecture that when machines are subject to random breakdowns, the results of the four theorems would also pertain to the stochastic case if both machines have identical counting processes with respect to the number of breakdowns and identical breakdown duration distributions. We also conjecture that the heuristic algorithms stated in this paper would perform quite well if these conditions on the counting processes and breakdown distributions hold.


Appendix A (Proof of Theorem 1)

Consider two job sequences $\pi_1$ and $\pi_2$: $\pi_1: \ldots, k, \ldots, i, \ldots$ and $\pi_2: \ldots, i, \ldots, k, \ldots$, where $\pi_1$ is a sequence in which job $k$ is in position $g$ and job $i$ is in position $h$, with $g < h$, and sequence $\pi_2$ is obtained from sequence $\pi_1$ by interchanging only the jobs in positions $g$ and $h$. Assume that $s_{i,1} + t_{i,1} + s_{k,2} \le s_{k,1} + t_{k,1} + s_{i,2}$, $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$, and $t_{k,2} \le t_{i,2}$. We will show that $TFT(\pi_2) \le TFT(\pi_1)$. That is, sequence $\pi_2$ is no worse than sequence $\pi_1$, and hence, in an optimal sequence, job $i$ should precede job $k$.

It should be clear that $ST_{g-1,1}(\pi_1) = ST_{g-1,1}(\pi_2)$ and $ST_{g-1,2}(\pi_1) = ST_{g-1,2}(\pi_2)$, since both sequences have the same jobs in positions $1, 2, \ldots, g-1$. For $j = g$,

$$d_g(\pi_1) = ST_{g-1,1}(\pi_1) + s_{k,1} + t_{k,1} - (ST_{g-1,2}(\pi_1) + s_{k,2}),$$
$$d_g(\pi_2) = ST_{g-1,1}(\pi_2) + s_{i,1} + t_{i,1} - (ST_{g-1,2}(\pi_2) + s_{i,2}).$$

From these two equations, $d_g(\pi_2) - d_g(\pi_1) = (s_{i,1} + t_{i,1} + s_{k,2}) - (s_{k,1} + t_{k,1} + s_{i,2})$, or

$$d_g(\pi_2) \le d_g(\pi_1) \qquad (A.1)$$

since $s_{i,1} + t_{i,1} + s_{k,2} \le s_{k,1} + t_{k,1} + s_{i,2}$.

For $j = g+1, g+2, \ldots, h-1$,

$$d_j(\pi_1) = ST_{g-1,1}(\pi_1) + s_{k,1} + t_{k,1} + \sum_{r=g+1}^{j} (s_{[r,1]} + t_{[r,1]}) - \Big[ ST_{g-1,2}(\pi_1) + s_{k,2} + t_{k,2} + \sum_{r=g+1}^{j-1} (s_{[r,2]} + t_{[r,2]}) + s_{[j,2]} \Big],$$

$$d_j(\pi_2) = ST_{g-1,1}(\pi_2) + s_{i,1} + t_{i,1} + \sum_{r=g+1}^{j} (s_{[r,1]} + t_{[r,1]}) - \Big[ ST_{g-1,2}(\pi_2) + s_{i,2} + t_{i,2} + \sum_{r=g+1}^{j-1} (s_{[r,2]} + t_{[r,2]}) + s_{[j,2]} \Big],$$

where $\sum_{r=g+1}^{g} (s_{[r,2]} + t_{[r,2]}) = 0$. Since both sequences have the same job in all positions except position $g$, it follows that

$$d_j(\pi_2) - d_j(\pi_1) = (s_{i,1} + t_{i,1} + s_{k,2} + t_{k,2}) - (s_{k,1} + t_{k,1} + s_{i,2} + t_{i,2}).$$

But $s_{i,1} + t_{i,1} + s_{k,2} \le s_{k,1} + t_{k,1} + s_{i,2}$ and $t_{k,2} \le t_{i,2}$, therefore

$$d_j(\pi_2) \le d_j(\pi_1). \qquad (A.2)$$

For $j = h$,

$$d_h(\pi_1) = ST_{g-1,1}(\pi_1) + s_{k,1} + t_{k,1} + \sum_{r=g+1}^{h-1} (s_{[r,1]} + t_{[r,1]}) + s_{i,1} + t_{i,1} - \Big[ ST_{g-1,2}(\pi_1) + s_{k,2} + t_{k,2} + \sum_{r=g+1}^{h-1} (s_{[r,2]} + t_{[r,2]}) + s_{i,2} \Big],$$

$$d_h(\pi_2) = ST_{g-1,1}(\pi_2) + s_{i,1} + t_{i,1} + \sum_{r=g+1}^{h-1} (s_{[r,1]} + t_{[r,1]}) + s_{k,1} + t_{k,1} - \Big[ ST_{g-1,2}(\pi_2) + s_{i,2} + t_{i,2} + \sum_{r=g+1}^{h-1} (s_{[r,2]} + t_{[r,2]}) + s_{k,2} \Big].$$

From these last two equations, $d_h(\pi_2) - d_h(\pi_1) = t_{k,2} - t_{i,2}$, or

$$d_h(\pi_2) \le d_h(\pi_1) \qquad (A.3)$$

since $t_{k,2} \le t_{i,2}$.

Since both sequences have the same jobs in positions $1, 2, \ldots, g-1$, $d_j(\pi_2) = d_j(\pi_1)$ for $j = 1, 2, \ldots, g-1$. It can be shown that $d_j(\pi_2) = d_j(\pi_1)$ for $j = h+1, h+2, \ldots, n$. From these facts and Eqs. (A.1)-(A.3), we have

$$d_j(\pi_2) \le d_j(\pi_1) \quad \text{for } j = 1, 2, \ldots, n. \qquad (A.4)$$

Observe that

$$C_j(\pi_2) = C_j(\pi_1) \quad \text{for } j = 1, 2, \ldots, g-1, \qquad (A.5)$$

since both sequences have the same jobs in these positions. Let $\lambda_1 = \max\{0, d_1, \ldots, d_{g-1}\}$. Notice that $\lambda_1(\pi_1) = \lambda_1(\pi_2)$. It can easily be shown that for $j = h, h+1, \ldots, n$,

$$C_j(\pi_2) - C_j(\pi_1) = \max\{\lambda_1(\pi_2), d_g(\pi_2), d_{g+1}(\pi_2), \ldots, d_j(\pi_2)\} - \max\{\lambda_1(\pi_1), d_g(\pi_1), d_{g+1}(\pi_1), \ldots, d_j(\pi_1)\}.$$

But it follows from Eq. (A.4) that

$$\max\{\lambda_1(\pi_2), d_g(\pi_2), d_{g+1}(\pi_2), \ldots, d_j(\pi_2)\} \le \max\{\lambda_1(\pi_1), d_g(\pi_1), d_{g+1}(\pi_1), \ldots, d_j(\pi_1)\},$$

therefore,

$$C_j(\pi_2) \le C_j(\pi_1) \quad \text{for } j = h, h+1, \ldots, n. \qquad (A.6)$$

Observe that for $j = g$,

$$C_g(\pi_1) = ST_{g-1,2}(\pi_1) + s_{k,2} + t_{k,2} + \max\{\lambda_1(\pi_1), d_g(\pi_1)\},$$
$$C_g(\pi_2) = ST_{g-1,2}(\pi_2) + s_{i,2} + t_{i,2} + \max\{\lambda_1(\pi_2), d_g(\pi_2)\},$$

and for $j = g+1, g+2, \ldots, h-1$,

$$C_j(\pi_1) = ST_{g-1,2}(\pi_1) + s_{k,2} + t_{k,2} + \sum_{r=g+1}^{j} (s_{[r,2]} + t_{[r,2]}) + \max\{\lambda_1(\pi_1), d_g(\pi_1), d_{g+1}(\pi_1), \ldots, d_j(\pi_1)\},$$

$$C_j(\pi_2) = ST_{g-1,2}(\pi_2) + s_{i,2} + t_{i,2} + \sum_{r=g+1}^{j} (s_{[r,2]} + t_{[r,2]}) + \max\{\lambda_1(\pi_2), d_g(\pi_2), d_{g+1}(\pi_2), \ldots, d_j(\pi_2)\}.$$

Hence, for $j = g, g+1, \ldots, h-1$,

$$C_j(\pi_2) - C_j(\pi_1) = (s_{i,2} + t_{i,2}) - (s_{k,2} + t_{k,2}) + \max\{\lambda_1(\pi_2), \ldots, d_j(\pi_2)\} - \max\{\lambda_1(\pi_1), \ldots, d_j(\pi_1)\} \le 0$$

since $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$ and, by Eq. (A.4), $\max\{\lambda_1(\pi_2), \ldots, d_j(\pi_2)\} \le \max\{\lambda_1(\pi_1), \ldots, d_j(\pi_1)\}$. Therefore,

$$C_j(\pi_2) \le C_j(\pi_1) \quad \text{for } j = g, g+1, \ldots, h-1. \qquad (A.7)$$

Now it follows from Eqs. (A.5)-(A.7) that $TFT(\pi_2) \le TFT(\pi_1)$. This completes the proof. □

Appendix B (Proofs of Theorems 2 and 4)

To provide a framework for developing the proofs of the two theorems, consider exchanging the positions of two adjacent jobs in a sequence. The sequence $\sigma_1$ has job $i$ in position $g$ and job $k$ in position $g+1$, and the sequence $\sigma_2$ is obtained from the sequence $\sigma_1$ by interchanging only the jobs in positions $g$ and $g+1$, i.e., $\sigma_1 = \ldots, i, k, \ldots$ and $\sigma_2 = \ldots, k, i, \ldots$.

It follows from Eq. (3) that for these two sequences,

$$d_g(\sigma_1) = ST_{g-1,1}(\sigma_1) + s_{i,1} + t_{i,1} - ST_{g-1,2}(\sigma_1) - s_{i,2},$$
$$d_g(\sigma_2) = ST_{g-1,1}(\sigma_2) + s_{k,1} + t_{k,1} - ST_{g-1,2}(\sigma_2) - s_{k,2},$$
$$d_{g+1}(\sigma_1) = ST_{g-1,1}(\sigma_1) + s_{i,1} + t_{i,1} + s_{k,1} + t_{k,1} - ST_{g-1,2}(\sigma_1) - s_{i,2} - t_{i,2} - s_{k,2}.$$

Since both sequences have the same jobs in positions $1, 2, \ldots, g-1$, $TI_{g-1}(\sigma_1) = TI_{g-1}(\sigma_2)$. Let $TI_{g-1} = TI_{g-1}(\sigma_1) = TI_{g-1}(\sigma_2)$. Then,

$$C_g(\sigma_1) = ST_{g-1,2}(\sigma_1) + s_{i,2} + t_{i,2} + \max\{TI_{g-1}, d_g(\sigma_1)\},$$
$$C_g(\sigma_2) = ST_{g-1,2}(\sigma_2) + s_{k,2} + t_{k,2} + \max\{TI_{g-1}, d_g(\sigma_2)\},$$
$$C_{g+1}(\sigma_1) = ST_{g-1,2}(\sigma_1) + s_{i,2} + t_{i,2} + s_{k,2} + t_{k,2} + \max\{TI_{g-1}, d_g(\sigma_1), d_{g+1}(\sigma_1)\},$$
$$C_{g+1}(\sigma_2) = ST_{g-1,2}(\sigma_2) + s_{k,2} + t_{k,2} + s_{i,2} + t_{i,2} + \max\{TI_{g-1}, d_g(\sigma_2), d_{g+1}(\sigma_2)\}.$$

Observe that $ST_{g-1,1}(\sigma_1) = ST_{g-1,1}(\sigma_2)$ and $ST_{g-1,2}(\sigma_1) = ST_{g-1,2}(\sigma_2)$, because again both sequences have the same jobs in positions $1, 2, \ldots, g-1$. Then, it follows from the above equations that

$$[C_g(\sigma_1) + C_{g+1}(\sigma_1)] - [C_g(\sigma_2) + C_{g+1}(\sigma_2)] = (s_{i,2} + t_{i,2}) - (s_{k,2} + t_{k,2}) + \max\{TI_{g-1}, d_g(\sigma_1)\} - \max\{TI_{g-1}, d_g(\sigma_2)\} + \max\{TI_{g-1}, d_g(\sigma_1), d_{g+1}(\sigma_1)\} - \max\{TI_{g-1}, d_g(\sigma_2), d_{g+1}(\sigma_2)\}. \qquad (B.1)$$

It can easily be shown that

$$d_j(\sigma_1) = d_j(\sigma_2) \quad \text{for } j = g+2, g+3, \ldots, n. \qquad (B.2)$$

Let $\Delta_j = \max\{d_{g+2}, d_{g+3}, \ldots, d_j\}$. Then, by Eq. (B.2),

$$\Delta_j(\sigma_1) = \Delta_j(\sigma_2) = \Delta_j \quad \text{for } j = g+2, g+3, \ldots, n.$$

It can also be shown that for $j = g+2, g+3, \ldots, n$,

$$C_j(\sigma_1) - C_j(\sigma_2) = \max\{TI_{g-1}, d_g(\sigma_1), d_{g+1}(\sigma_1), \Delta_j\} - \max\{TI_{g-1}, d_g(\sigma_2), d_{g+1}(\sigma_2), \Delta_j\}, \qquad (B.3)$$

and it is obvious that

$$C_j(\sigma_1) = C_j(\sigma_2) \quad \text{for } j = 1, 2, \ldots, g-1. \qquad (B.4)$$

It is clear that if $d_g(\sigma_1) \le d_g(\sigma_2)$ and $d_{g+1}(\sigma_1) \le d_g(\sigma_2)$, then

$$\max\{TI_{g-1}, d_g(\sigma_1)\} \le \max\{TI_{g-1}, d_g(\sigma_2)\},$$
$$\max\{TI_{g-1}, d_g(\sigma_1), d_{g+1}(\sigma_1)\} \le \max\{TI_{g-1}, d_g(\sigma_2), d_{g+1}(\sigma_2)\}, \ \text{and}$$
$$\max\{TI_{g-1}, d_g(\sigma_1), d_{g+1}(\sigma_1), \Delta_j\} \le \max\{TI_{g-1}, d_g(\sigma_2), d_{g+1}(\sigma_2), \Delta_j\}.$$

Therefore, it follows from Eqs. (B.1), (B.3) and (B.4) that, in order to show that $TFT(\sigma_1) \le TFT(\sigma_2)$, it is sufficient to show that (i) $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$, (ii) $d_g(\sigma_1) \le d_g(\sigma_2)$, and (iii) $d_{g+1}(\sigma_1) \le d_g(\sigma_2)$.

Proof of Theorem 2. By hypothesis, jobs $i$ and $k$ in the sequences $\sigma_1$ and $\sigma_2$ satisfy $s_{i,1} + t_{i,1} \le s_{i,2} + t_{i,2}$, $s_{i,1} + t_{i,1} - s_{i,2} \le s_{k,1} + t_{k,1} - s_{k,2}$, and $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$. Observe that $d_g(\sigma_1) \le d_g(\sigma_2)$ since $s_{i,1} + t_{i,1} - s_{i,2} \le s_{k,1} + t_{k,1} - s_{k,2}$, and $d_{g+1}(\sigma_1) \le d_g(\sigma_2)$ since $s_{i,1} + t_{i,1} \le s_{i,2} + t_{i,2}$. This completes the proof. □


Proof of Theorem 4. Assume the sequence $\sigma_2$ is an optimum sequence in which the jobs do not appear in nondecreasing order of $s_{j,1} + t_{j,1}$. Then jobs $k$ and $i$ can be found, with $s_{i,1} + t_{i,1} \le s_{k,1} + t_{k,1}$, such that $i$ is sequenced immediately after $k$. We show that the sequence $\sigma_1$ is no worse than the sequence $\sigma_2$.

By hypothesis $s_{i,2} \ge s_{k,2}$, and it follows that

$$s_{i,1} + t_{i,1} - s_{i,2} \le s_{k,1} + t_{k,1} - s_{k,2}.$$

Therefore, $d_g(\sigma_1) \le d_g(\sigma_2)$. Furthermore, for type B ordered flowshops $s_{i,1} + t_{i,1} \le s_{i,2} + t_{i,2}$, and hence $d_{g+1}(\sigma_1) \le d_g(\sigma_2)$. Finally, for semi-ordered flowshops, $s_{i,1} + t_{i,1} \le s_{k,1} + t_{k,1}$ implies $s_{i,2} + t_{i,2} \le s_{k,2} + t_{k,2}$.

The nondecreasing order of $s_{j,1} + t_{j,1}$ is obtained by repeating this procedure for each such pair of jobs, which completes the proof since the relation $s_{i,1} + t_{i,1} \le s_{k,1} + t_{k,1}$ is transitive. □

References

[1] Johnson SM. Optimal two and three-stage production schedules with setup times included. Naval Research Logistics Quarterly 1954;1:69-81.
[2] Pinedo M. Scheduling: theory, algorithms, and systems. Englewood Cliffs, NJ: Prentice-Hall, 1995.
[3] Conway RW, Maxwell WL, Miller LW. Theory of scheduling. Reading, MA: Addison-Wesley, 1967.
[4] Srikar BN, Ghosh S. A MILP model for the n-job, m-stage flowshop with sequence-dependent setup times. International Journal of Production Research 1986;24:1459-72.
[5] Bianco L, Ricciardelli S, Rinaldi G, Sassano A. Scheduling tasks with sequence-dependent processing times. Naval Research Logistics 1988;35:177-84.
[6] Bitran GR, Gilbert SM. Sequencing production on parallel machines with two magnitudes of sequence-dependent setup cost. Journal of Manufacturing and Operations Management 1990;3:24-52.
[7] Kim SC, Bobrowski PM. Impact of sequence-dependent setup times on job shop scheduling performance. International Journal of Production Research 1994;32:1503-20.
[8] Uzsoy R, Lee C, Martin-Vega A. Scheduling semiconductor test operations: minimizing maximum lateness and number of tardy jobs on a single machine. Naval Research Logistics 1992;39:369-88.
[9] Sule DR, Huang KY. Sequencing on two and three machines with setup, processing and removal times separated. International Journal of Production Research 1983;21:723-32.
[10] Yoshida T, Hitomi K. Optimal two-stage production scheduling with setup times separated. AIIE Transactions 1979;11:261-3.
[11] Allahverdi A. Two-stage production scheduling with separated setup times and stochastic breakdowns. Journal of the Operational Research Society 1995;46:896-904.
[12] Allahverdi A. Scheduling in stochastic flowshops with independent setup, processing, and removal times. Computers & Operations Research 1997;24:955-60.
[13] Khurana K, Bagga PC. Minimizing the makespan in a two-machine flowshop with time lags and setup conditions. Zeitschrift Operations Research 1984;28:163-74.
[14] Khurana K, Bagga PC. Scheduling of job-block with deadline in n x 2 flowshop problem with separated setup times. Indian Journal of Pure and Applied Mathematics 1985;16:213-24.
[15] Rajendran C, Ziegler H. Heuristics for scheduling in a flowshop with setup, processing, and removal times separated. Production Planning and Control 1997;8:568-76.
[16] Szwarc W. The flowshop problem with time lags and separated setup times. Zeitschrift Operations Research 1986;30:B15-22.
[17] Szwarc W, Gupta JND. A flow-shop problem with sequence-dependent additive setup times. Naval Research Logistics 1987;34:619-27.


[18] Allahverdi A, Gupta JND, Aldowaisan TA. A review of scheduling research involving setup considerations. OMEGA, The International Journal of Management Science 1999;27:219-39.
[19] Dileepan P, Sen T. Job lateness in a two-machine flowshop with setup times separated. Computers & Operations Research 1991;18:549-56.
[20] Allahverdi A, Aldowaisan TA. Job lateness in flowshops with setup and removal times separated. Journal of the Operational Research Society 1998;49:1001-6.
[21] Gonzalez T, Sahni S. Flow shop and job shop schedules. Operations Research 1978;26:36-52.
[22] Della Croce F, Narayan V, Tadei R. The two-machine total completion time flow shop problem. European Journal of Operational Research 1996;90:227-37.
[23] Bagga PC, Khurana K. Two-machine flowshop with separated sequence-independent setup times: mean completion time criterion. Indian Journal of Management and Systems 1986;2:47-57.
[24] Stafford EF, Tseng FT. On the Srikar-Ghosh MILP model for the N x M SDST flowshop problem. International Journal of Production Research 1990;28:723-32.
[25] Chu C. A branch-and-bound algorithm to minimize total tardiness with different release dates. Naval Research Logistics 1992;39:265-83.
[26] Daniels RL, Chambers RJ. Multiobjective flowshop scheduling. Naval Research Logistics 1990;37:981-95.
[27] Smith ML, Panwalkar SS, Dudek RA. Flowshop sequencing problem with ordered processing time matrices. Management Science 1975;21:544-9.
[28] Panwalkar SS, Dudek RA, Smith ML. Sequencing research and the industrial scheduling problem. In: Elmaghraby SE, editor. Symposium on the Theory of Scheduling and its Applications. Berlin: Springer, 1973:29-37.
[29] Panwalkar SS, Khan AW. An ordered flow-shop sequencing problem with mean completion time criterion. International Journal of Production Research 1976;30:631-5.
[30] Gupta JND, Darrow WP. Approximate schedules for the two-machine flowshop with sequence dependent setup times. Indian Journal of Management and Systems 1985;1:6-11.
[31] Gupta JND, Darrow WP. The two-machine sequence dependent flowshop scheduling problem. European Journal of Operational Research 1986;24:439-46.
[32] Gupta JND. Flowshop schedules with sequence dependent setup times. Journal of the Operations Research Society of Japan 1986;29:206-19.
[33] Hall NG, Posner ME. Generating experimental data for scheduling problems, revised for Operations Research.
[34] Rajendran C, Chaudhuri D. An efficient heuristic approach to the scheduling of jobs in a flowshop. European Journal of Operational Research 1991;61:318-25.

Ali Allahverdi is an Associate Professor in the Department of Mechanical and Industrial Engineering of Kuwait University. He worked for two years as an Assistant Professor in the Industrial Engineering Department of Marmara University, Istanbul, before coming to Kuwait. He received his B.S. in Petroleum Engineering from Istanbul Technical University and his M.Sc. and Ph.D. in Industrial Engineering from Rensselaer Polytechnic Institute. His current research interests include scheduling in both stochastic and deterministic environments. He has published in journals such as Computers & Operations Research, Naval Research Logistics, European Journal of Operational Research, The Journal of the Operational Research Society, OMEGA The International Journal of Management Science, International Transactions in Operational Research, Mathematical and Computer Modeling, and Computers and Industrial Engineering.