
Computers ind. Engng Vol. 14, No. 4, pp. 387-393, 1988. Printed in Great Britain. All rights reserved. 0360-8352/88 $3.00 + 0.00. Copyright © 1988 Pergamon Press plc

SINGLE FACILITY SCHEDULING WITH NONLINEAR PROCESSING TIMES

JATINDER N. D. GUPTA¹ and SUSHIL K. GUPTA²

¹Department of Management Science, Ball State University, Muncie, IN 47303 and ²Department of Decision Sciences, Florida International University, Miami, FL 33199, U.S.A.

(Received for publication 10 December 1987)

Abstract—This paper considers the static single facility scheduling problem in which the processing times of jobs are a monotonically increasing function of their starting (waiting) times and the objective is to minimize the total elapsed time (called the makespan) in which all jobs complete their processing. Based on a combinatorial analysis of the problem, an exact optimization algorithm is developed for the general processing time function, which is then specialized for the linear case. In view of the excessive computational burden of the exact optimization algorithm for nonlinear processing time functions, heuristic algorithms are proposed. The effectiveness of the proposed algorithms is evaluated empirically; the results indicate that these heuristic algorithms yield optimal or near optimal schedules in many cases.

INTRODUCTION

Consider the following static single facility problem: a set of n independent, single-operation jobs is ready for processing at time zero on a single machine. Neither job splitting nor machine idleness is allowed. The processing time of each job depends on its starting (or waiting) time in the sequence. It is desired to find the processing order (schedule) which minimizes the makespan, defined as the total elapsed time in which all jobs complete their processing. Such situations occur in many chemical and metallurgical processes. For example, in steel rolling mills, ingots are heated to the required temperature before rolling. In this case, the furnace is the single facility and the ingots are the independent jobs to be processed (heated). Heating time depends upon the ingot's current temperature, which depends upon the time it has been waiting. During the waiting period, the ingot cools down, thus requiring more heating time in the furnace. It is desired to minimize the total time spent by all available ingots in the heating shop.

For each job i, let p_i and t_i be its processing and starting times respectively. Since the processing time of job i depends on its starting time, the relationship between p_i and t_i can be represented as:

p_i = f(t_i).    (1)

The exact form of the function f in equation (1) depends upon the specific production process under consideration. Mathematically, the linear, quadratic, and general forms are expressed as follows:

p_i = a_i + b_i t_i    (linear)    (2)

p_i = a_i + b_i t_i + c_i t_i^2    (quadratic)    (3)

p_i = a_i + b_i t_i + c_i t_i^2 + ... + m_i t_i^m    (general)    (4)

where a_i, b_i, c_i, ..., m_i are non-negative constants. As stated before, the objective is to find the schedule (or order) S = ([1], [2], ..., [k], ..., [n]), where [k] is the job in the kth position, that minimizes the makespan T(S), computed as follows:

T(S) = Σ_{i=1}^{n} p_[i]    (5)
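Because each p_[k] depends on the start time of job [k], the makespan in equation (5) is obtained by a single accumulation pass over the sequence. A minimal Python sketch for the quadratic case of equation (3) (function and variable names are mine):

```python
def makespan(schedule, a, b, c):
    """Total elapsed time T(S) when processing times grow with start time:
    p_i = a[i] + b[i]*t + c[i]*t**2, where t is the start time of job i."""
    t = 0.0  # the first job starts at time zero
    for i in schedule:
        t += a[i] + b[i] * t + c[i] * t ** 2  # t becomes job i's completion time
    return t

# Coefficients of Table 1 (jobs 1..4 stored at indices 0..3)
a = [0.23, 0.27, 0.13, 0.29]
b = [0.80, 0.88, 0.09, 0.60]
c = [0.75, 0.85, 0.35, 0.98]
```

For the partial schedule 1-2 this gives T ≈ 0.747, matching the corresponding entry of Table 2.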


Minimizing makespan on a single machine is trivial if the processing times of jobs are independent of their starting times, since all schedules then give the same makespan. For the scenario presented above, however, finding a schedule that minimizes makespan is neither trivial nor easy. In fact, the problem appears to be NP-complete, implying that it is very unlikely that polynomially bounded solution techniques will be found for it [1]. Branch and bound procedures are not useful since it is very difficult (if not impossible) to develop general expressions for lower bounds on the makespan. Therefore, combinatorial approaches based on sequence dominance conditions are used to describe an exact optimization algorithm for the general processing time function and a specialized version for the linear case. In view of the relative inefficiency of combinatorial approaches, several heuristic approaches are discussed for finding approximate solutions to the problem.

COMBINATORIAL APPROACH

Consider two schedules S = PQ and S' = P'Q, where the partial schedules P and P' are different permutations of the same subset of jobs. Let the completion time of partial schedule P be represented as T(P). The developments in flowshop scheduling [2, 3] lead to the following result.

Theorem 1. For two partial schedules P and P' that are different permutations of the same subset of jobs, T(P) ≤ T(P') implies that T(PQ) ≤ T(P'Q).

The condition in Theorem 1 above is both necessary and sufficient for developing optimization algorithms for the general case considered here.

Combinatorial algorithm for the general case

Based on Gupta's [4] and Schild and Fredman's [5] algorithms for scheduling problems with deferral costs, the combinatorial algorithm for the general case of the present problem can be described as follows:

(1) Start the list with n partial schedules, each containing only one job. Compute their completion times.
(2) For each of these partial schedules, generate new partial schedules by appending each of the unscheduled jobs to the end of the partial schedule. Compute the completion times of all partial schedules thus obtained.
(3) Group the partial schedules so that partial schedules that are different permutations of the same subset of jobs belong to the same group. In each group, retain the partial schedule with minimum completion time.
(4) Repeat steps 2 and 3 until a complete schedule containing n jobs is retained in step 3. This is an optimal schedule since it minimizes makespan.

As an illustration of the proposed combinatorial algorithm, consider the four-job problem in which the processing time of a job is a quadratic function of its starting time, as in equation (3) above. Table 1 below depicts the parameter values for each of the four jobs. The steps of the combinatorial algorithm are best performed in tabular fashion, as shown in Table 2. To start the process, each job is considered at the first sequence position, as shown in Table 2 under the column Iteration 1. Then pairs are formed, and only the partial schedules with

Table 1. A four-job example problem

Job(i)   a_i    b_i    c_i
  1      0.23   0.80   0.75
  2      0.27   0.88   0.85
  3      0.13   0.09   0.35
  4      0.29   0.60   0.98

Table 2. Illustration of the combinatorial algorithm

Iteration 1      Iteration 2        Iteration 3          Iteration 4
P     T(P)       P      T(P)        P       T(P)         P         T(P)
1     0.23 *     1-2    0.747  *    1-2-3   1.1395 *     1-2-3-4   3.3856
2     0.27 *     1-3    0.399  *    1-2-4   2.0330       1-4-2-3   3.7852
3     0.13 *     1-4    0.709  *    1-3-2   1.1554       1-4-3-2   3.2844 *
4     0.29 *     2-1    0.770       1-3-4   1.0844       2-3-4-1   3.4988
                 2-3    0.4498 *    1-4-2   2.0301 *
                 2-4    0.7934 *    1-4-3   1.0788 *
                 3-1    0.4766      2-3-1   1.1919
                 3-2    0.5280      2-3-4   1.2085 *
                 3-4    0.5145      2-4-1   2.1290
                 4-1    0.8150      2-4-3   1.2145
                 4-2    0.8867      4-3-1   1.2567
                 4-3    0.4755 *    4-3-2   1.3574

* This shows the partial schedule retained.


least completion times are retained, as shown in the column labelled Iteration 2. This process is continued until complete schedules are obtained, as shown in Table 2. From the calculations in Table 2 above, it follows that the optimal schedule for the example problem of Table 1 is 1-4-3-2, with a makespan of 3.2844.
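Steps (1)-(4) amount to keeping, for every subset of jobs, only its best permutation. A sketch in Python for the quadratic case (names are mine; retained partial schedules are keyed by the frozenset of their jobs):

```python
def proc_time(i, t, a, b, c):
    # Quadratic processing time of job i when started at time t, equation (3)
    return a[i] + b[i] * t + c[i] * t ** 2

def combinatorial_schedule(a, b, c):
    """Exact algorithm: for each subset of jobs, retain only the permutation
    with minimum completion time (justified by Theorem 1)."""
    n = len(a)
    # best maps a subset of jobs to (completion time, retained partial schedule)
    best = {frozenset([i]): (proc_time(i, 0.0, a, b, c), (i,)) for i in range(n)}
    for _ in range(n - 1):
        nxt = {}
        for subset, (t, seq) in best.items():
            for j in set(range(n)) - subset:
                t2 = t + proc_time(j, t, a, b, c)  # append j to the partial schedule
                key = subset | {j}
                if key not in nxt or t2 < nxt[key][0]:
                    nxt[key] = (t2, seq + (j,))
        best = nxt
    return best[frozenset(range(n))]

a = [0.23, 0.27, 0.13, 0.29]
b = [0.80, 0.88, 0.09, 0.60]
c = [0.75, 0.85, 0.35, 0.98]
T, seq = combinatorial_schedule(a, b, c)  # seq is (0, 3, 2, 1), i.e. 1-4-3-2
```

On the Table 1 data this returns the schedule 1-4-3-2 found in Table 2 (its completion time differs slightly from the table's entry because the table rounds intermediate values). The number of retained states grows as 2^n, which is the "excessive computational burden" noted in the abstract.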

Linear processing time function case

The processing time of any job i, as a linear function of its starting (or waiting) time, is given by equation (2). For each job i, define h_i = a_i / b_i.

Theorem 2. For the linear case, the schedule obtained by arranging jobs in ascending order of the h_i values is optimal.

Proof. Consider two schedules S = PijQ and S' = PjiQ. It is obvious that:

T(Pij) = T(P) + a_i + b_i T(P) + a_j + b_j {T(P) + a_i + b_i T(P)}.

Similarly:

T(Pji) = T(P) + a_j + b_j T(P) + a_i + b_i {T(P) + a_j + b_j T(P)}.

From these two equations, it follows that T(Pij) ≤ T(Pji) if b_j a_i ≤ b_i a_j. Thus a_i/b_i ≤ a_j/b_j (or h_i ≤ h_j) implies that T(Pij) ≤ T(Pji). Using Theorem 1 then shows that T(S) ≤ T(S'). If P is empty, it follows that a job with least h_i will be the first job in the sequence, since it dominates all other jobs. Repeating this argument for the other sequence positions shows that the schedule obtained by arranging jobs in ascending order of the ratio h_i minimizes makespan.

The above theorem also shows that the linear processing time case can be solved in polynomial time, since the computational effort involved is of the order O(n log n).

HEURISTIC ALGORITHMS
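Theorem 2 can be sketched in Python and checked against exhaustive enumeration (names are mine; b_i > 0 is assumed so that the ratio h_i is defined):

```python
from itertools import permutations

def linear_makespan(schedule, a, b):
    # Accumulate completion times with p_i = a[i] + b[i]*t, equation (2)
    t = 0.0
    for i in schedule:
        t += a[i] + b[i] * t
    return t

def optimal_linear_order(a, b):
    """Theorem 2: arranging jobs in ascending a_i/b_i minimizes makespan."""
    return sorted(range(len(a)), key=lambda i: a[i] / b[i])

# a_i and b_i from Table 1 (ignoring the quadratic terms)
a = [0.23, 0.27, 0.13, 0.29]
b = [0.80, 0.88, 0.09, 0.60]
order = optimal_linear_order(a, b)  # [0, 1, 3, 2], i.e. 1-2-4-3
brute = min(permutations(range(len(a))),
            key=lambda s: linear_makespan(s, a, b))
```

The sort costs O(n log n), against n! schedules for the enumeration; both yield the same makespan here, as the theorem guarantees.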

In view of the excessive computational burden of the combinatorial approach, it is desirable to seek heuristic algorithms for this problem. Two heuristic approaches are explored here: static and dynamic. The main difference between the two algorithms is the manner in which the relative priority of a job is calculated. In the dynamic heuristic algorithm, job priority is calculated dynamically and hence may change with the sequence position being considered, whereas in the static case it does not change.

Static heuristic algorithm

Since the linear case can be optimized by arranging jobs in ascending order of their weighted processing times, it is natural to extend this approach to nonlinear processing times. To do so, for each job i, define:

h_i(a) = a_i
h_i(b) = a_i / b_i
h_i(c) = a_i / c_i
...
h_i(m) = a_i / m_i

Then the static heuristic algorithm has the following form:

(1) For each job i, calculate h_i(a), h_i(b), h_i(c), ..., h_i(m) using the above equations.
(2) Generate m schedules by arranging jobs in ascending order of the h_i(x) values, x = a, b, c, ..., m. Break ties by using x = b or c for a, x = a or c for b, and x = a or b for c, etc.
(3) Find the makespan of each schedule so obtained and accept the one with minimum makespan as an approximate solution.
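A sketch of the static heuristic for the quadratic case, where the three rules h_i(a), h_i(b), h_i(c) apply (names are mine; the tie-breaking of step (2) is omitted, and b_i, c_i > 0 is assumed):

```python
def quad_makespan(schedule, a, b, c):
    # Makespan with quadratic processing times, equation (3)
    t = 0.0
    for i in schedule:
        t += a[i] + b[i] * t + c[i] * t ** 2
    return t

def static_heuristic(a, b, c):
    """Steps (1)-(3): build one schedule per ratio rule, keep the best."""
    jobs = range(len(a))
    candidates = [
        sorted(jobs, key=lambda i: a[i]),         # ascending h_i(a)
        sorted(jobs, key=lambda i: a[i] / b[i]),  # ascending h_i(b)
        sorted(jobs, key=lambda i: a[i] / c[i]),  # ascending h_i(c)
    ]
    return min(candidates, key=lambda s: quad_makespan(s, a, b, c))

a = [0.23, 0.27, 0.13, 0.29]
b = [0.80, 0.88, 0.09, 0.60]
c = [0.75, 0.85, 0.35, 0.98]
best = static_heuristic(a, b, c)  # [0, 1, 3, 2], i.e. schedule 1-2-4-3
```

On the Table 1 data this reproduces the h_i(b) schedule 1-2-4-3 of Table 4, with makespan ≈ 3.793.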

Table 3. h_i(x) for the example problem

Job(i)   h_i(a)   h_i(b)   h_i(c)
  1      0.23     0.29     0.31
  2      0.27     0.31     0.32
  3      0.13     1.45     0.37
  4      0.29     0.48     0.30

Table 4. Schedules obtained by static heuristic algorithm

                          Processing times for jobs in positions
Rule      Schedule      1        2        3        4        Makespan
h_i(a)    3-1-2-4       0.13     0.3466   0.8825   2.9150   4.274
h_i(b)    1-2-4-3       0.23     0.5174   1.2858   1.7598   3.793
h_i(c)    4-1-2-3       0.29     0.5250   1.5518   2.3039   4.670

As an illustration of the static heuristic algorithm, consider the four-job problem of Table 1 above. The ratios h_i(x) are given in Table 3. The schedules obtained by arranging jobs in ascending order of these ratios are given in Table 4. Schedule 1-2-4-3 is the best schedule, with a makespan of 3.793.

Dynamic heuristic algorithm

Partition all jobs into two mutually exclusive subsets P and Q. The jobs in subset P have already been scheduled. Let the completion time of the last job in P be T(P). For each job i ∈ Q, find the ratio H_i(x), x = a, b, c, ..., m. For the quadratic processing time function given in equation (3), the H_i(x) are defined as follows:

H_i(a) = a_i / {b_i + c_i T(P)}
H_i(b) = {a_i + b_i T(P) + c_i T(P)^2} / {b_i + 2 c_i T(P)}
H_i(c) = a_i + b_i T(P) + c_i T(P)^2

Similar expressions for H_i(x) can be developed for the general nonlinear processing time function given in equation (4). The dynamic and static heuristic algorithms follow the same steps, except that in the dynamic case step 2 is implemented as follows:

(i) Find job j with minimum H_j(x), j ∈ Q.
(ii) Append job j to sequence P and find T(Pj).
(iii) Let P = Pj, Q = Q − {j}, and T(P) = T(Pj).
(iv) If Q is empty, go to step (v); otherwise, find H_j(x) for each j ∈ Q and go to step (i).
(v) P represents the heuristic schedule and T(P) is the makespan.
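A sketch of the dynamic heuristic for the quadratic case, using the H_i(b) rule alone (the paper builds one schedule per rule x = a, b, c; names are mine):

```python
def dynamic_heuristic(a, b, c):
    """Steps (i)-(v): repeatedly append the unscheduled job with minimum
    H_i(b) = {a_i + b_i*T(P) + c_i*T(P)^2} / {b_i + 2*c_i*T(P)}."""
    unscheduled = set(range(len(a)))
    seq, t = [], 0.0  # t is T(P), the completion time of the scheduled part
    while unscheduled:
        j = min(unscheduled,
                key=lambda i: (a[i] + b[i] * t + c[i] * t ** 2)
                              / (b[i] + 2.0 * c[i] * t))
        seq.append(j)
        t += a[j] + b[j] * t + c[j] * t ** 2  # T(Pj)
        unscheduled.remove(j)
    return seq, t

a = [0.23, 0.27, 0.13, 0.29]
b = [0.80, 0.88, 0.09, 0.60]
c = [0.75, 0.85, 0.35, 0.98]
seq, T = dynamic_heuristic(a, b, c)  # [0, 1, 3, 2] (1-2-4-3), T ≈ 3.793
```

On the Table 1 data this reproduces the x = b column of Table 5; the priorities are recomputed at every position, which is what distinguishes the dynamic rule from the static one.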

To illustrate the dynamic heuristic algorithm, consider the example problem of Table 1 again. The calculations are shown in Table 5 below and result in the same solution as the static heuristic algorithm.

Improving the solution

The above heuristic algorithms were found to be rather unpredictable in finding an optimal (or close to optimal) makespan schedule. Therefore, the following neighborhood search technique was used to improve their effectiveness:

(0) Let i = 1 and j = 2.
(1) Let the best schedule generated by the heuristic algorithm be S = ([1], [2], ..., [k], ..., [n]), where [k] is the job in the kth position. Let T(S) be the makespan for S.

Table 5. Schedules obtained by dynamic heuristic algorithm

Schedule for x = a:
P          T(P)     H1(x)    H2(x)   H3(x)    H4(x)    Job selected   Processing time for job selected
-          0.0000   0.29     0.310   1.4500   0.4800   1              0.2300
1          0.2300   *        0.251   0.7620   0.3513   2              0.5173
1-2        0.7473   *        *       0.3698   0.2176   4              1.2858
1-2-4      2.0331   *        *       0.1622   *        3              1.7598
1-2-4-3    3.7929

Schedule for x = b:
P          T(P)     H1(x)    H2(x)   H3(x)    H4(x)    Job selected   Processing time for job selected
-          0.0000   0.29     0.310   1.450    0.4800   1              0.2300
1          0.2300   *        0.407   0.674    0.4566   2              0.5173
1-2        0.7473   *        *       0.640    0.6227   4              1.2858
1-2-4      2.0331   *        *       1.163    *        3              1.7598
1-2-4-3    3.7929

Schedule for x = c:
P          T(P)     H1(x)    H2(x)   H3(x)    H4(x)    Job selected   Processing time for job selected
-          0.0000   0.230    0.270   0.13     0.290    3              0.130
3          0.130    0.347    0.399   *        0.385    1              0.347
3-1        0.477    *        0.883   *        0.799    4              0.799
3-1-4      1.276    *        2.775   *        *        2              2.775
3-1-4-2    4.051

* Not necessary to find these values.

Table 6. Comparative evaluation of the proposed heuristic algorithms

      Static heuristic algorithm         Dynamic heuristic algorithm
n     #OPT   MIN   AVR     MAX           #OPT   MIN   AVR      MAX
4     14     0.0   0.026   0.155         17     0.0   0.0196   0.153
5     7      0.0   0.112   1.440         9      0.0   0.1534   1.534
6     6      0.0   0.302   0.951         7      0.0   0.3712   2.952
7     3      0.0   1.444   5.618         2      0.0   12.0285  134.540

Table 7. Experimental results of using neighborhood search technique

      Single pass                        Double pass
n     #OPT   MIN   AVR     MAX           #OPT   MIN   AVR     MAX
4     18     0.0   0.003   0.048         19     0.0   0.002   0.048
5     18     0.0   0.004   0.050         20     0.0   0.000   0.000
6     17     0.0   0.016   0.230         19     0.0   0.000   0.004
7     10     0.0   0.385   5.618         15     0.0   0.324   5.618


(2) Let T(S') be the makespan of the schedule obtained by interchanging [i] and [j]. If T(S') < T(S), let S = S', i = j and j = j + 1. If j < n, return to step 1; otherwise enter step 3.
(3) Set i = i + 1. If i < n, set j = i + 1 and return to step 1; otherwise enter step 4.
(4) Accept schedule S with makespan T(S) as an approximate solution.

The schedule S may still be improved by applying the neighborhood search once more (i.e. repeating steps 1 through 3 above with the schedule obtained at step 4 as the seed).

COMPUTATIONAL EXPERIENCE
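Before turning to the experiments, the neighborhood search of the preceding section can be sketched for the quadratic case; the index bookkeeping of steps (0)-(4) is followed loosely as one pass over all pairwise interchanges (names are mine):

```python
def quad_makespan(schedule, a, b, c):
    # Makespan with quadratic processing times, equation (3)
    t = 0.0
    for i in schedule:
        t += a[i] + b[i] * t + c[i] * t ** 2
    return t

def interchange_pass(schedule, a, b, c):
    """One pass of neighborhood search: try every interchange [i] <-> [j]
    and keep any swap that lowers the makespan."""
    s = list(schedule)
    best = quad_makespan(s, a, b, c)
    for i in range(len(s) - 1):
        for j in range(i + 1, len(s)):
            s[i], s[j] = s[j], s[i]
            t = quad_makespan(s, a, b, c)
            if t < best:
                best = t                  # keep the improving interchange
            else:
                s[i], s[j] = s[j], s[i]   # undo a non-improving one
    return s, best

a = [0.23, 0.27, 0.13, 0.29]
b = [0.80, 0.88, 0.09, 0.60]
c = [0.75, 0.85, 0.35, 0.98]
# Seed with the static heuristic schedule 1-2-4-3 and apply two passes
s1, t1 = interchange_pass([0, 1, 3, 2], a, b, c)
s2, t2 = interchange_pass(s1, a, b, c)
```

On this seed the first pass lowers the makespan and the second pass reaches the optimal schedule 1-4-3-2 found by the exact algorithm, mirroring the single- versus double-pass comparison of Table 7.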

To test the effectiveness of the proposed heuristic algorithms in finding optimal or near optimal schedules, the proposed optimization algorithm, both heuristic algorithms, and the neighborhood search technique were used to solve problems varying in size from 4 to 7 jobs. A set of 20 problems was solved for each problem size. The a_i, b_i, and c_i values were generated from a uniform distribution in the range [0, 1]. For each problem, the effectiveness of the kth heuristic algorithm, q_k, is defined as:

q_k = (P_k − P_0) / P_0

where P_k and P_0 are the kth heuristic and optimal solutions respectively. Table 6 depicts the number of optimal solutions (#OPT) and the minimum (MIN), average (AVR) and maximum (MAX) values of q for each problem size and each algorithm. The difference in the effectiveness of the two heuristic algorithms is not appreciable. In fact, the static heuristic algorithm seems to be more effective than the dynamic one, especially for larger problems. Moreover, the static rules require less computational effort than the dynamic heuristic algorithm. Therefore, further experimentation was done with the static heuristic algorithm. The neighborhood search technique was used to improve the solution obtained by the static heuristic algorithm. Table 7 shows the results of these experiments with a single and a double pass of the neighborhood search procedure. The results in Table 7 show that the neighborhood search technique improves the quality of the heuristic solution considerably. However, the computational effort is much greater than for the static heuristic algorithm alone. The tradeoff depends on other managerial considerations and the value assigned to a reduction in makespan.

CONCLUSIONS

This paper has described exact and heuristic algorithms for finding a minimum makespan schedule for a single facility scheduling problem when the processing times of jobs are nonlinear functions of their starting times. While the problem appears to be in the NP-complete category, the proposed algorithms do provide a practical way to approach the problem. Further refinements and improvements may be possible to provide even more efficient and effective solution procedures. Nevertheless, the suggested approaches are useful in finding practical and workable schedules for these difficult problems. Even if branch and bound approaches for these problems do become available in the future, the proposed heuristic algorithms can be used to find an initial solution to act as an upper bound to curtail the domain of search.

REFERENCES

1. M. R. Garey and D. S. Johnson. Computers and Intractability: A Guide to the Theory of NP-Completeness. Freeman, San Francisco (1979).
2. J. N. D. Gupta. An improved combinatorial algorithm for the flowshop scheduling problem. Opns Res. 20, 1753-1758 (1971).
3. J. N. D. Gupta. A review of flowshop scheduling research. In Disaggregation: Problems in Manufacturing and Service Organizations (Edited by L. P. Ritzman et al.). Martinus Nijhoff, The Hague (1979).
4. J. N. D. Gupta. Optimal scheduling in a multi-stage flowshop. AIIE Trans. 4, 238-243 (1972).
5. A. Schild and I. J. Fredman. Scheduling tasks with deadlines and nonlinear loss functions. Mgmt Sci. 9, 73-81 (1962).