Performance guarantees for a scheduling problem with common stepwise job payoffs


Theoretical Computer Science. Contents lists available at ScienceDirect: www.elsevier.com/locate/tcs

Yasmina Seddik*, Christophe Gonzales, Safia Kedad-Sidhoum

Laboratoire d'Informatique de Paris 6, Université Paris 6 Pierre et Marie Curie, 4 place Jussieu, 75005 Paris, France

Article info

Article history: Received 26 April 2013; Received in revised form 15 September 2014; Accepted 1 October 2014; Available online xxxx. Communicated by G. Ausiello.

Keywords: Absolute approximation guarantee; Relative approximation guarantee; Scheduling; Stepwise payoffs; Release dates

Abstract

We consider a single machine scheduling problem with unequal release dates and a common stepwise payoff function for the jobs. The goal is to maximize the sum of the jobs' payoffs, which are defined with regard to some common delivery dates. The problem is strongly NP-hard. In this paper, we propose a polynomial time approximation algorithm with both absolute and relative performance guarantees. The time complexity of the approximation algorithm is O(K N log N), N being the number of jobs and K the number of delivery dates. © 2014 Elsevier B.V. All rights reserved.

1. Introduction

We consider a single machine scheduling problem with unequal release dates and a common stepwise payoff function for the jobs. The problem addressed in this paper originates from a real issue in book digitization, where a manufacturer digitizes the collection of the French National Library. The books to be digitized arrive at the manufacturer over time. The client wishes to receive the digitized books as soon as possible. For this reason, some delivery dates are set by the client. Each digitized book yields a payoff to the manufacturer: the earlier its delivery, the greater the payoff. Each book has the same significance regarding the payoffs. We therefore consider a single machine scheduling problem where a set J_all of N jobs J_1, ..., J_N must be scheduled nonpreemptively on a single machine. Each job J_i has a processing time p_i > 0 and a release date r_i ≥ 0. K delivery dates are given, 0 < D_1 < D_2 < ··· < D_K, as well as K payoff values γ_1 > γ_2 > ··· > γ_K > 0. We assume that all parameters are integer. A schedule S is defined as a vector of job completion times. We denote by C_i(S) the completion time of job J_i in S. The payoff v_i(S) of job J_i in schedule S is defined by the following decreasing stepwise payoff function.

* Corresponding author. E-mail addresses: [email protected] (Y. Seddik), [email protected] (C. Gonzales), safi[email protected] (S. Kedad-Sidhoum).

http://dx.doi.org/10.1016/j.tcs.2014.10.010 0304-3975/© 2014 Elsevier B.V. All rights reserved.


v_i(S) =
    γ_1  if 0 < C_i(S) ≤ D_1
    γ_2  if D_1 < C_i(S) ≤ D_2
    ...
    γ_K  if D_{K−1} < C_i(S) ≤ D_K
    0    if D_K < C_i(S)
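This stepwise function can be written down directly; a small illustrative sketch (the delivery dates and payoff values used below are invented for the example):

```python
def job_payoff(c, D, gamma):
    # v_i(S) for a job completing at time c > 0, with delivery dates
    # D_1 < ... < D_K and payoffs gamma_1 > ... > gamma_K > 0
    for d, g in zip(D, gamma):
        if c <= d:
            return g
    return 0  # completing after D_K yields no payoff
```

A job completing within the first interval earns the full payoff γ_1, and the earned payoff only decreases as the completion time crosses each delivery date.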

The payoff v(S) of a schedule S is computed as v(S) = Σ_{i=1}^{N} v_i(S). The objective is to maximize v(S). Extending the three-field notation of Graham et al. [1], the addressed problem is denoted as 1|r_i|Σv_i. The particular case of 1|r_i|Σv_i where the values γ_1, ..., γ_K are fixed as γ_k = K − k + 1 for k = 1, ..., K has been proven to be strongly NP-hard [2]. Hence 1|r_i|Σv_i is strongly NP-hard.

The contribution of this paper is to provide a polynomial time approximation algorithm for 1|r_i|Σv_i, with an absolute performance guarantee of Σ_{k=2}^{K} γ_k and an approximation ratio equal to 1/2. To the best of our knowledge, only a few works (detailed below) consider scheduling problems with stepwise payoff/cost functions, and none of them proposes approximation algorithms.

The criterion Σv_i is a special case of the stepwise job payoff function criterion, since all the jobs have the same stepwise payoff function. The most general case is the one where each job has a distinct stepwise payoff function, while an intermediate case is the one where the breakpoints (where the function value changes) are common but the payoff values are job related. Detienne et al. [3] consider the single machine problem without release dates and with job related stepwise cost functions. They provide an exact method dealing with a graph representation of the scheduling problem and Lagrangian bounds. The problem with release dates is also considered, but the proposed method is much less efficient in this case. Curry and Peters [4] deal with an online problem on parallel machines with reassignments, where stepwise increasing job cost functions must be minimized. At each job arrival a new schedule is computed with the available jobs, using a Branch and Price method. Janiak and Krysiak [5] consider the single machine problem without release dates and with job related stepwise payoff functions.
They provide some list heuristics for this problem, based on processing times and payoff values of the jobs, and two strategies based on modifications of the Moore–Hodgson algorithm for 1||ΣU_i [6]. The Moore–Hodgson algorithm is also exploited by Tseng et al. [7] for the same problem, as part of a constructive heuristic algorithm. Moreover, Tseng et al. [7] define some neighborhood structures and a Variable Neighborhood Search method. An adaptation of the Moore–Hodgson algorithm [6] is also used in this paper, and is presented as the SDD-algorithm in Section 2.2. The special case where the breakpoints are common to all the jobs is considered by Yang [8], who provides a Branch and Bound method. For the same problem, where additionally the number of breakpoints is fixed, Janiak and Krysiak [5] provide a pseudopolynomial time algorithm. Janiak and Krysiak [9] consider the variant of the objective function where jobs have the same breakpoints, but the stepwise payoff function is nondecreasing, which implies that some breakpoints are not significant for some of the jobs. They propose several heuristics to solve the problem with unrelated parallel processors. The works cited above do not consider release dates. Notice that the problem addressed in this paper becomes polynomial if release dates are not considered [2]. As for problems considering job related stepwise cost functions and release dates, Detienne et al. [10] propose a column generation approach for solving the parallel machines problem, while Sahin and Ahuja [11] propose mathematical programming and heuristic approaches for the single machine problem. Both these works consider job related stepwise cost functions. Seddik et al. [2] consider a single machine problem with a particular case of common stepwise payoff function for the jobs: complexity results are established, and a pseudopolynomial time algorithm for the problem with two delivery dates is proposed.
Finally, two other kinds of criteria are related to Σv_i, regarding the presence of a set of common delivery or due dates. First, Hall et al. [12] study the class of problems with fixed delivery dates. They consider several classical scheduling criteria, always including the following variant: the cost of a job J_i depends on the earliest delivery date occurring after the completion of J_i. In addition, complexity results are established for several problems, with different criteria and machine configurations. Second, there exists another class of problems related to common due dates: the generalized due date problem [13], where due dates are not related to the jobs. Instead, global due dates are defined, and before each of them, one job must complete. Then, given a schedule, the i-th scheduled job is related to the i-th due date, and its cost is computed in relation to that due date, as for classical due dates. Complexity results have been established by Hall et al. [13] for this class of problems.

The rest of this paper is organized as follows. Section 2 states some preliminaries. In Section 3 the polynomial time approximation algorithm is presented, as well as a sketch of proof of its absolute performance guarantee and some lemmas; the complete proof is given in Sections 4 and 5. In Section 6 we prove the approximation ratio of the algorithm, and in Section 7 we present some experimental results. Finally, we draw some conclusions in Section 8.

2. Preliminaries

In this section, we first give some general definitions and notations; then we propose the polynomial time SDD-algorithm (which is an extension of the algorithm given in [2]) for solving a special case of 1|r_i|Σv_i, which is needed for the design of the approximation algorithm.


In order to refer more easily to the notations and definitions during the reading, we grouped them as much as possible (except for local notations) in Sections 2.1, 3 and 4.1.

2.1. Notations and definitions

Definitions.

• A schedule is a vector of completion times, one for each job of J_all: (C_1, ..., C_N), while a sequence is an ordered set of jobs.

• A partial schedule is a vector of completion times, one for each job of a given set J ⊂ J_all.
• A feasible (partial) schedule is a (partial) schedule where each scheduled job starts at or after its release date, and where jobs do not overlap.

• A subschedule of a given schedule S is a partial schedule S′ where the completion times of the jobs of S′ are the same in both S and S′.

• A block is a (partial) schedule where there are no idle times, i.e. each job of the block (except the first one) starts at the completion time of the preceding job.

• In a schedule S, a straddling job J_i is a job such that C_i(S) − p_i < D_k < C_i(S), for some k ∈ {1, ..., K}.

Notations.

• We set D_0 = 0 and D_{K+1} = max{D_K, max_{i=1,...,N} r_i} + Σ_{i=1}^{N} p_i.
• We set γ_{K+1} = 0.
• For any (partial) schedule S, we denote by V_k(S) the number of jobs completing into [0, D_k], k = 1, ..., K. Notice that v(S) = Σ_{k=1}^{K} (γ_k − γ_{k+1}) × V_k(S) (cf. Property 5, Section 2.3).
• For any (partial) schedule S (resp. sequence of jobs Γ), we denote by J(S) (resp. J(Γ)) the set of jobs of S (resp. Γ).
• For any (partial) schedule S, we denote by C_max(S) (resp. B(S)) the completion time of the last job (resp. the starting time of the first job) of S.
• For any (partial) schedule S (or sequence of jobs Γ, or set of jobs J), we denote p(S) = Σ_{J_i ∈ J(S)} p_i (respectively, p(Γ) = Σ_{J_i ∈ J(Γ)} p_i, p(J) = Σ_{J_i ∈ J} p_i).
• S_i . S_j denotes the concatenation of partial schedule S_i with partial schedule S_j (assuming that J(S_i) ∩ J(S_j) = ∅).
• Any (partial) schedule S can be partitioned into K + 1 subschedules S^1, ..., S^{K+1}, S^k being the subschedule of the jobs completing into ]D_{k−1}, D_k], for k = 1, ..., K + 1. Thus, S can be expressed as S = S^1 . S^2 . ... . S^{K+1}. Moreover, if S^k is empty, we set C_max(S^k) = B(S^k) = D_{k−1}, for k = 1, ..., K + 1.

Definition. Given two jobs J_i and J_j of J_all, we denote by J_i ≺ERD J_j the assertion that J_i precedes J_j in the ERD order (for Earliest Release Date and shortest processing time):

• r_i < r_j ⇒ J_i ≺ERD J_j;
• (r_i = r_j and p_i < p_j) ⇒ J_i ≺ERD J_j;
• (r_i = r_j and p_i = p_j and i < j) ⇒ J_i ≺ERD J_j.

The third condition is clearly arbitrary, in order to break ties and to ensure that ERD is a total order.

2.2. SDD-algorithm

In this section, we introduce a polynomial time algorithm called the Single Delivery Date algorithm (SDD-algorithm) that solves the problem SDD_I, defined below.

Problem SDD_I. Given an interval I = [I_low, I_up] and a set of jobs J, find a feasible (partial) schedule of the jobs of J into the interval I, with the maximum number of jobs.

Remark. Notice that the single delivery date problem 1|r_i, K = 1|Σv_i is equivalent to SDD_I when I_low = 0, I_up = D (with D the unique delivery date of 1|r_i, K = 1|Σv_i) and J = J_all.
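As a small illustration of the ERD order defined in Section 2.1 (the job tuples below are invented for the example), the three conditions amount to a lexicographic sort key on (r_i, p_i, i):

```python
def erd_key(job):
    # ERD order: earliest release date first, then shortest processing
    # time, then smallest index (the arbitrary tie-break)
    r, p, idx = job
    return (r, p, idx)

# jobs as (r_i, p_i, i) tuples; sorting by erd_key yields the ERD order
jobs = [(2, 3, 1), (0, 5, 2), (0, 5, 3), (0, 2, 4)]
erd_sorted = sorted(jobs, key=erd_key)
```

Because the key is a total order, any two distinct jobs are always comparable, which the third condition guarantees.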

Notations. For any sequence Γ of jobs, and any job J_q:

• ∀ J_q ∉ J(Γ), J_q.Γ denotes the sequence obtained by prepending J_q to Γ.
• ∀ J_q ∈ J(Γ), Γ \ J_q denotes the sequence obtained by removing J_q from Γ, leaving all the other jobs in the order given by Γ.
• S(Γ) denotes the (partial) schedule that schedules the jobs of Γ, in the order given by Γ, without idle times and with the last job completing at I_up.


In the SDD-algorithm, depicted in Algorithm 1, the jobs of J (input of SDD_I) are ordered w.r.t. ERD order, i.e. J_{i_1} ≺ERD ··· ≺ERD J_{i_|J|}. At each iteration m (m = |J|, ..., 1), the SDD-algorithm constructs a sequence Γ_{i_m} with the maximal number of jobs and the shortest processing time, such that J(Γ_{i_m}) ⊆ {J_{i_m}, ..., J_{i_|J|}} and that S(Γ_{i_m}) is feasible (i.e. each job J_{i_j} ∈ Γ_{i_m} is scheduled into [max(r_{i_j}, I_low), I_up]) [2]. Moreover, in each sequence Γ_{i_m}, m = |J|, ..., 1, the jobs are ordered w.r.t. ERD order. It is worth noting that the SDD-algorithm is close to the Moore–Hodgson algorithm [6] for solving 1||ΣU_i.
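The iteration just described can be sketched in Python; this is a non-authoritative sketch (the job tuples (r_i, p_i, job_id) and the heap-based removal are our own framing), assuming the input list is already sorted in ERD order:

```python
import heapq

def sdd(jobs, i_low, i_up):
    # Scan the jobs from last to first in ERD order, tentatively prepending
    # each one; whenever the tentative start time t falls before
    # max(r_i, i_low), drop the kept job with the longest processing time
    # (ties broken by the highest ERD rank), mirroring Algorithm 1.
    kept = set()   # ERD ranks of the jobs currently kept
    heap = []      # max-heap on (p_i, ERD rank), via negated keys
    t = i_up       # tentative start time of the first kept job
    for m in range(len(jobs) - 1, -1, -1):
        r, p, _ = jobs[m]
        kept.add(m)
        heapq.heappush(heap, (-p, -m))
        t -= p
        if t < max(r, i_low):
            neg_p, neg_m = heapq.heappop(heap)
            kept.discard(-neg_m)
            t += -neg_p
    return [jobs[m][2] for m in sorted(kept)]
```

With a binary heap the removal step costs O(log N), which is consistent with the O(N log N) bound stated after Algorithm 1.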

Input: J = {J_{i_1}, ..., J_{i_|J|}}, I_low, I_up
Output: S

1:  Γ_{i_{|J|+1}} ← ∅; t ← I_up
2:  for m = |J| down to 1 do
3:      Γ_{i_m} ← J_{i_m}.Γ_{i_{m+1}}
4:      t ← t − p_{i_m}
5:      if t < max{r_{i_m}, I_low} then
6:          q ← max{l | p_{i_l} = max{p_{i_j} | J_{i_j} ∈ J(Γ_{i_m})}}
7:          Γ_{i_m} ← Γ_{i_m} \ J_{i_q}
8:          t ← t + p_{i_q}
9:  return S ← S(Γ_{i_1})

Algorithm 1: SDD-algorithm.

The SDD-algorithm runs in O(N log N) time if a heap is used for the search of the job with the largest processing time. Some useful properties of the SDD-algorithm are given below.

Property 1. Given a set J of jobs and an interval [I_low, I_up], the SDD-algorithm produces a partial schedule that schedules the maximal number N(J, I_low, I_up) of jobs of J between I_low and I_up.

Property 2. Given a set J of jobs and an interval [I_low, I_up], the SDD-algorithm produces a partial schedule with the shortest processing time among all the feasible partial schedules of N(J, I_low, I_up) jobs of J into [I_low, I_up].

For the arguments of the proofs of Properties 1 and 2, refer to the proof of optimality of the SDD-algorithm detailed in [2].

Property 3. The SDD-algorithm produces a partial schedule where the jobs are ordered w.r.t. ERD order.

Property 4. When, at a given iteration of the SDD-algorithm, a job is removed from the current sequence, it is the job with the longest processing time, or the job with the highest ranking in ERD order in case of a tie.

Properties 3 and 4 hold by construction.

2.3. An upper bound on the optimal payoff

Property 5. The payoff v(S) can also be computed as: v(S) = Σ_{k=1}^{K} (γ_k − γ_{k+1}) × V_k(S).

Property 5 is true since the payoff can be computed by counting the payoff earned at each delivery date, instead of counting it per job. Hence, an upper bound on the optimal payoff can be computed as follows:

u_bound = Σ_{k=1}^{K} (γ_k − γ_{k+1}) × U_k

where U_k is the number of jobs scheduled by SDD(J_all, 0, D_k), k = 1, ..., K.

3. The approximation algorithm

In this section, we present the polynomial time algorithm APPROX yielding a feasible schedule, then we formulate the main result, i.e. that APPROX is an approximation algorithm with absolute performance guarantee. For any feasible (partial) schedule S, we denote by ls(S) the (partial) schedule obtained after left-shifting all the jobs of S, i.e. where each job starts as soon as possible, either at its release date or at the completion time of the preceding job in the schedule. Algorithm APPROX, described in Algorithm 2, yields a feasible schedule S^A = S^A_1 . S^A_2 . ... . S^A_{K+1}.


Input: J_all, D_1, ..., D_K
Output: S^A, l_bound

1:  J ← J_all
2:  S^A_1 ← SDD(J, 0, D_1)
3:  S^A_1 ← ls(S^A_1)
4:  J ← J \ J(S^A_1)
5:  for k = 2 to K do
6:      S^A_k ← SDD(J, C_max(S^A_{k−1}), D_k)
7:      S^A_k ← ls(S^A_k)
8:      J ← J \ J(S^A_k)
9:  S^A_{K+1} ← schedule obtained by scheduling the jobs of J in any order after D_K
10: l_bound ← Σ_{k=1}^{K} γ_k × |J(S^A_k)|
11: return S^A = S^A_1 . S^A_2 . ... . S^A_{K+1}, l_bound

Algorithm 2: Algorithm APPROX.
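Algorithm 2 can be mirrored in Python. The following is a hedged sketch, not the authors' implementation: the simplified O(N²) `sdd_simple` below stands in for the heap-based Algorithm 1, the left-shifting loop plays the role of ls(), and the job tuples (r_i, p_i, id) are invented for the example:

```python
def sdd_simple(pool, i_low, i_up):
    # simplified variant of Algorithm 1 (linear scan instead of a heap);
    # pool is assumed to be sorted in ERD order
    seq, t = [], i_up
    for job in reversed(pool):
        seq.insert(0, job)
        t -= job[1]
        if t < max(job[0], i_low):
            # drop the kept job with the longest p_i (ties: highest ERD rank)
            i_rm = max(range(len(seq)), key=lambda i: (seq[i][1], i))
            t += seq.pop(i_rm)[1]
    return seq

def approx(jobs, D):
    # Sketch of Algorithm 2 (APPROX): fill [0, D_1], then each interval up
    # to D_k, left-shifting every block as it is placed.
    pool = sorted(jobs)            # ERD order on (r_i, p_i, id) tuples
    schedule, c_max = [], 0
    for d in D:
        block = sdd_simple(pool, c_max, d)
        for r, p, jid in block:    # ls(): start each job as early as possible
            c_max = max(c_max, r) + p
            schedule.append((jid, c_max))
        block_set = set(block)
        pool = [j for j in pool if j not in block_set]
    return schedule, pool          # pool: jobs left to run after D_K
```

The returned `schedule` lists (job id, completion time) pairs for the jobs completing by D_K; the leftover `pool` corresponds to S^A_{K+1}, whose jobs earn no payoff.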

Fig. 1. Illustration of y_k = B(S^A_k) − C_max(S^A_{k−1}).

Since the complexity of the SDD-algorithm at each iteration is O(N log N), and the left-shifting operation is performed in O(N) time, the complexity of algorithm APPROX is O(K N log N).

Assumption. We assume in the sequel that the jobs of J_all are reindexed in ERD order: J_1 ≺ERD ··· ≺ERD J_N.

Before stating the main result, let us introduce some notations that we will frequently use in the following.

• Let S^A be the feasible schedule obtained with algorithm APPROX. Let S^A_{1:k} = S^A_1 . ... . S^A_k be the partial schedule equal to S^A between 0 and C_max(S^A_k), k = 2, ..., K.
• We denote by S^{D_k} the (partial) schedule obtained by applying the SDD-algorithm on the jobs of J_all, on the interval [0, D_k]: S^{D_k} = SDD(J_all, 0, D_k), k = 1, ..., K.
• We call U_k the number of jobs of S^{D_k}: U_k = |J(S^{D_k})|, k = 1, ..., K. We recall that u_bound = Σ_{k=1}^{K} (γ_k − γ_{k+1}) × U_k is an upper bound on the optimal payoff (cf. Section 2.3).
• For any job J_i ∈ J_all, we denote by J_i^+ (resp. J_i^−) the set of jobs of J_all that follow J_i in ERD order, i.e. with greater indices (resp. that precede J_i in ERD order, i.e. with smaller indices).
• We set: y_k = D_k − p(S^A_k) − C_max(S^A_{k−1}), k = 2, ..., K. If S^A_k is a block completing at D_k, y_k = B(S^A_k) − C_max(S^A_{k−1}), k = 2, ..., K (see Fig. 1).

Notice that v(S^A_{1:K}) = v(S^A), because the jobs completing after D_K do not provide any payoff.

Theorem 1. The payoff of schedule S^A yielded by algorithm APPROX is at a distance at most Σ_{k=2}^{K} γ_k from the optimal payoff: v(S^A) ≥ OPT − Σ_{k=2}^{K} γ_k.

Sketch of proof. The proof of Theorem 1 mainly relies on Proposition 2 (Section 5), which states the following: for any k ∈ {1, ..., K}, we have U_k − V_k(S^A) ≤ k − 1. By introducing the values γ_1, ..., γ_K into these inequalities, and by summing the K resulting inequalities over all k ∈ {1, ..., K}, we obtain that u_bound − v(S^A) ≤ Σ_{k=2}^{K} γ_k, which implies Theorem 1.
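The summation step in this sketch rests on the identity Σ_{k=1}^{K} (γ_k − γ_{k+1})(k − 1) = Σ_{k=2}^{K} γ_k (with γ_{K+1} = 0), which telescopes. It can be checked numerically; a small sketch with invented payoff values:

```python
def weighted_gap_bound(gamma):
    # sum_{k=1}^{K} (gamma_k - gamma_{k+1}) * (k - 1), with gamma_{K+1} = 0;
    # the bound obtained by summing (U_k - V_k(S^A)) <= k - 1 with
    # weights (gamma_k - gamma_{k+1})
    g = list(gamma) + [0]
    # 0-based index k here corresponds to the 1-based factor (k - 1)
    return sum((g[k] - g[k + 1]) * k for k in range(len(gamma)))

gamma = [7, 5, 2]              # gamma_1 > gamma_2 > gamma_3 > 0
bound = weighted_gap_bound(gamma)
```

For these values the bound equals γ_2 + γ_3 = 7, matching the absolute guarantee of Theorem 1.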

The proof of Proposition 2 is done by contradiction: we assume that there exists some k̂ ∈ {1, ..., K} such that U_k̂ − V_k̂(S^A) > k̂ − 1 (Hypothesis Ĥ). To prove Proposition 2, we construct a schedule called S^m. S^m is constructed starting from S^A_{1:k̂} and is such that J(S^A_{1:k̂}) ∩ J(S^{D_k̂}) ⊆ J(S^m) ⊆ J(S^{D_k̂}). The proof of Proposition 2 relies on schedule S^m and on Lemmas 1, 2 and 3.

Next, we introduce three lemmas, each of them stating a property that concerns schedules S^{D_k} and S^A_{1:k}, and which holds for any k ∈ {2, ..., K}. For a given k ∈ {2, ..., K}, we therefore only focus on the interval [0, D_k].


Lemma 1. For any k ∈ {2, ..., K}, for any pair of jobs J_d, J_l such that:

• J_d ∈ J(S^{D_k}) \ J(S^A_{1:k}), and
• J_l ∈ J(S^A_k), and
• J_l^+ ∩ J(S^A_k) ⊆ J(S^{D_k}),

we have: p_d ≥ p_l.

Proof. The proof is given in Appendix A. □

Lemma 2 states that all the jobs of S^A_k are scheduled in S^{D_k}, k ∈ {2, ..., K}.

Lemma 2. J(S^A_k) ⊆ J(S^{D_k}), k ∈ {2, ..., K}.

Proof. The proof is given in Appendix A. □

We now introduce Lemma 3, which refers to y_k, defined in Section 3.

Lemma 3. For any job J_i of J(S^{D_k}) \ J(S^A_{1:k}): p_i > y_k, k = 2, ..., K.

Proof. The proof is given in Appendix A. □

In Section 4, we assume hypothesis Ĥ and define schedule S^m and its features. In Section 5, we prove Proposition 2 and Theorem 1.

4. Partial schedule S^m

To set the approximation ratio, we have to bound U_k − V_k(S^A), where U_k = |J(S^{D_k})| and V_k(S^A) = |J(S^A_{1:k})|, k = 1, ..., K. To this end, we introduce a partial schedule S^m, which we define here under the following assumption: there exists some k ∈ {1, ..., K} such that U_k − V_k(S^A) > k − 1. Let k̂ be the smallest k satisfying this condition. Since U_1 = V_1(S^A) = |J(SDD(J_all, 0, D_1))|, we have k̂ ∈ {2, ..., K}.

In the sequel, we only handle schedules S^{D_k̂} and S^A_{1:k̂}, on the interval [0, D_k̂]; we can thus assume, without loss of generality, that S^A_k̂ is a block completing at D_k̂.

We introduce some notations in Section 4.1, then define partial schedule S^m in Section 4.2.

4.1. Preliminaries

Notations. Let ā be the number of jobs of J(S^{D_k̂}) \ J(S^A_{1:k̂}) = {J_{e_1}, ..., J_{e_ā}} (J_{e_1} ≺ERD ··· ≺ERD J_{e_ā}, i.e. e_1 < ··· < e_ā). Let d be the number of jobs of J(S^A_{1:k̂}) \ J(S^{D_k̂}).

Note that, by Lemma 2, d is also equal to the number of jobs in J(S^A_{1:k̂−1}) \ J(S^{D_k̂}). Moreover, since U_k̂ ≥ V_k̂(S^A) + k̂, i.e. |J(S^{D_k̂})| ≥ |J(S^A_{1:k̂})| + k̂, we have: 0 ≤ d ≤ ā − k̂.

Notation. Let n̂ = |J(S^A_k̂)|, and J_{f_1}, ..., J_{f_n̂} be the jobs of S^A_k̂ (J_{f_1} ≺ERD ··· ≺ERD J_{f_n̂}). From Lemma 2, J_{f_1}, ..., J_{f_n̂} ∈ J(S^{D_k̂}).

In Fig. 2 an example is given, where ā = 6, d = 3 and n̂ = 5. The same example will be used to illustrate the rest of the proof.

Since J(S^A_{1:k̂−1}) and J(S^A_k̂) constitute a partition of J(S^A_{1:k̂}), a partition of J(S^{D_k̂}) is constituted of the three sets of jobs:

• J(S^{D_k̂}) ∩ J(S^A_{1:k̂−1}),
• J(S^{D_k̂}) ∩ J(S^A_k̂) = J(S^A_k̂) = {J_{f_1}, ..., J_{f_n̂}} (by Lemma 2),
• J(S^{D_k̂}) \ J(S^A_{1:k̂}) = {J_{e_1}, ..., J_{e_ā}}.


Fig. 2. On top, the schedule represents S^A_{1:3}, while the schedule at the bottom represents S^{D_3}, for the same instance. In both schedules, the striped jobs belong to J(S^A_{1:2}) ∩ J(S^{D_3}). In S^A_{1:2}, the white jobs belong to J(S^A_{1:2}) \ J(S^{D_3}). In this example, we have: ā = 6 (the number of J_{e_i} jobs), d = 3 (the number of white jobs in S^A_{1:2}), n̂ = 5 (the number of J_{f_j} jobs).

Hence, {J_{f_1}, ..., J_{f_n̂}} and {J_{e_1}, ..., J_{e_ā}} constitute a partition of J(S^{D_k̂}) \ J(S^A_{1:k̂−1}). Thus, |J(S^{D_k̂}) \ J(S^A_{1:k̂−1})| = ā + n̂ ≥ d + k̂.

Notation. Let G be the set of the d + k̂ − 1 first jobs of J(S^{D_k̂}) \ J(S^A_{1:k̂−1}) in ERD order.

Equivalently, G is also the subset of jobs of J(S^{D_k̂}) \ J(S^A_{1:k̂−1}) with the d + k̂ − 1 smallest indices. Moreover, the jobs of G are also the d + k̂ − 1 earliest scheduled jobs of J(S^{D_k̂}) \ J(S^A_{1:k̂−1}) in S^{D_k̂}, from Property 3 on S^{D_k̂}. On the example of Fig. 2, G = {J_{e_1}, J_{f_1}, J_{e_2}, J_{e_3}, J_{f_2}}.

Notations. Let J_g be the job of G that has the greatest index (i.e. the last job of G in ERD order). Let G_e = G ∩ {J_{e_1}, ..., J_{e_ā}} and G_f = G ∩ {J_{f_1}, ..., J_{f_n̂}}. On the example of Fig. 2, J_g = J_{f_2}, G_e = {J_{e_1}, J_{e_2}, J_{e_3}} and G_f = {J_{f_1}, J_{f_2}}.

G_e and G_f constitute a partition of G, since G ⊆ J(S^{D_k̂}) \ J(S^A_{1:k̂−1}), and the two sets {J_{e_1}, ..., J_{e_ā}} and {J_{f_1}, ..., J_{f_n̂}} are a partition of J(S^{D_k̂}) \ J(S^A_{1:k̂−1}). Therefore, |G_e| + |G_f| = |G| = d + k̂ − 1. Since the jobs of G have the smallest indices among the jobs of J(S^{D_k̂}) \ J(S^A_{1:k̂−1}), we deduce that {J_{e_1}, ..., J_{e_ā}} ∩ G = {J_{e_1}, ..., J_{e_{|G_e|}}} (since e_1 < ··· < e_{|G_e|} < ··· < e_ā), and {J_{f_1}, ..., J_{f_n̂}} ∩ G = {J_{f_1}, ..., J_{f_{|G_f|}}} (since f_1 < ··· < f_{|G_f|} < ··· < f_n̂).

4.2. Partial schedule S^m

In the current section, we construct an intermediate feasible partial schedule S^m in between S^A_{1:k̂} and S^{D_k̂}, as S^m will contain all the jobs of J(S^A_{1:k̂}) ∩ J(S^{D_k̂}) and some other jobs of S^{D_k̂}. By means of S^m, we will show in Proposition 2 (Section 5) that, if |J(S^{D_k̂})| ≥ |J(S^A_{1:k̂})| + k̂, i.e. if U_k̂ − V_k̂(S^A) ≥ k̂, then there must be a job of J(S^{D_k̂}) \ J(S^A_{1:k̂}) whose processing time is shorter than y_k̂ = B(S^A_k̂) − C_max(S^A_{k̂−1}), thus leading to a contradiction with Lemma 3.

We first show how S^m is constructed, then prove some properties of S^m and S^{D_k̂}, and finally show that S^m is feasible.

Partial schedule S^m is constructed in the following way, starting from S^A_{1:k̂} (see Fig. 3). S^m is composed of two subschedules S^m_α and S^m_β. S^m_α is a left-shifted subschedule obtained from S^A_{1:k̂−1} by removing the d jobs of J(S^A_{1:k̂−1}) \ J(S^{D_k̂}), by adding the d + k̂ − 1 jobs of G to S^A_{1:k̂−1} and by rescheduling all these jobs in the order of increasing indices (ERD order). S^m_β is the partial schedule obtained by removing J_{f_1}, ..., J_{f_{|G_f|}} from S^A_k̂. We now introduce some features of partial schedule S^m.

Feature 1. S^m_α is left-shifted.

Proof. By construction. □

Feature 2. S^m_β is a right-shifted block that completes at D_k̂.

Proof. By construction: S^m_β is the second part of the schedule S^A_k̂, which is also a block that completes at D_k̂. More precisely, S^m_β is the block that corresponds to the sequence (J_{f_{|G_f|+1}}, ..., J_{f_n̂}) and that completes at D_k̂ (see Fig. 3). □


Fig. 3. Starting from S^A_{1:3}, we obtain S^m: the d = 3 jobs of J(S^A_{1:2}) \ J(S^{D_3}) are removed from S^A_{1:2}; the |G_e| = 3 jobs of G ∩ (J(S^{D_3}) \ J(S^A_{1:3})) are added to S^A_{1:2}, and the |G_f| = 2 jobs of J(S^A_3) ∩ G are moved from S^A_3 to S^m_α. Finally, the jobs of S^m_α are ordered w.r.t. ERD order.

Feature 3. |J(S^m_α)| = |J(S^A_{1:k̂−1})| + k̂ − 1.

Proof. By construction: to obtain S^m_α, d jobs are removed from S^A_{1:k̂−1}, while d + k̂ − 1 are added (i.e. the d + k̂ − 1 jobs of G). □

Feature 4. J(S^m) ⊆ J(S^{D_k̂}).

Proof. By construction: to construct S^m we remove from S^A_{1:k̂} the jobs that are not in S^{D_k̂} and we add the jobs of G ⊆ J(S^{D_k̂}). □

Feature 5. C_max(S^m_α) > D_{k̂−1}.

Proof. We have |J(S^m_α)| = |J(S^A_{1:k̂−1})| + k̂ − 1 (cf. Feature 3). Moreover, |J(S^{D_{k̂−1}})| ≤ k̂ − 2 + |J(S^A_{1:k̂−1})|, since k̂ is the smallest k ∈ {1, ..., K} satisfying the condition U_k − V_k(S^A) > k − 1. Hence, |J(S^m_α)| > |J(S^{D_{k̂−1}})|, which implies the result since no more than |J(S^{D_{k̂−1}})| jobs can complete into [0, D_{k̂−1}], by Property 1 on schedule S^{D_{k̂−1}}. □

Feature 6. The jobs that are scheduled later than J_g in S^m_α all belong to J(S^A_{1:k̂−1}): J_g^+ ∩ J(S^m_α) ⊆ J(S^A_{1:k̂−1}).

Proof. By construction, J(S^m_α) \ J(G) ⊆ J(S^A_{1:k̂−1}). Since J_g is the last scheduled job of G in S^m_α, the jobs scheduled later than J_g in S^m_α all belong to J(S^A_{1:k̂−1}). Moreover, since by construction the jobs in S^m_α are ordered by increasing indices, J_g^+ ∩ J(S^m_α) is actually the set of jobs scheduled after J_g in S^m_α. Hence, the result holds. □

Feature 7. The sequence of jobs scheduled before J_g is identical in S^{D_k̂} and S^m_α.

Proof. First, note that J_g ∈ G ⊆ J(S^{D_k̂}). Since in both S^{D_k̂} and S^m_α the jobs are ordered by increasing indices, we need to show: J_g^− ∩ J(S^{D_k̂}) = J_g^− ∩ J(S^m_α). By construction of S^m_α, the two sets J(S^A_{1:k̂−1}) ∩ J(S^{D_k̂}) and G constitute a partition of J(S^m_α). Therefore, J(S^A_{1:k̂−1}) ∩ J(S^{D_k̂}) ∩ J_g^− and G ∩ J_g^− constitute a partition of J(S^m_α) ∩ J_g^−. Notice that these two sets are also a partition of J_g^− ∩ J(S^{D_k̂}), since G ∩ J_g^− = (J_g^− ∩ J(S^{D_k̂})) \ J(S^A_{1:k̂−1}). Hence, J_g^− ∩ J(S^{D_k̂}) = J_g^− ∩ J(S^m_α). □


+ Feature 8. J ( S m β ) ⊆ Jg .

Proof. We have J ( S m ) ⊆ J ( S D kˆ ) (cf. Feature 4). Then, since the jobs that precede J g in S m are the same that the ones preceding J g in S D kˆ (cf. Feature 7), we deduce that the jobs that follow J g in S m are a subset of the jobs that follow J g in

S D kˆ . From Property 3 on S D kˆ , the jobs that follow J g in S D kˆ are those of J g+ ∩ J ( S D kˆ ). Then, since all the jobs of S m β are D kˆ m m m + + scheduled after J g in S (because J g ∈ J ( S α )), we deduce that J ( S β ) ⊆ J g ∩ J ( S ) ⊆ J g . 2 m Feature 9. In S D kˆ , all the jobs of J g+ ∩ J ( S D kˆ ) are scheduled after the completion time of J g in S m α , C g ( S α ).

Proof. From Property 3 on S D kˆ , all the jobs of J g+ ∩ J ( S D kˆ ) are scheduled after J g . Dˆ m Let us show that J g cannot complete earlier than C g ( S m α ) in S k . From Feature 7, the sequence of jobs scheduled in S α D kˆ D kˆ m between 0 and C g ( S m ) is the same as that of the jobs scheduled in S between 0 and C ( S ) . Therefore, since S is g α α Dˆ Dˆ left-shifted (cf. Feature 1), J g cannot complete earlier than C g ( S m α ) in S k . Consequently, all the jobs that follow J g in S k Dˆ are scheduled after C g ( S m α ) in S k .

2

Feature 9a. In particular, the result of Feature 9 is true for any subset of jobs of J g+ ∩ J ( S D kˆ ), and thus for J g+ ∩ J ( S m ) (cf. Feature 4): Dˆ all the jobs of J g+ ∩ J ( S m ) are scheduled after C g ( S m α ) in S k .

Feature 10. p (J g+ ∩ J ( S D kˆ )) ≤ D kˆ − C g ( S m α ). Dˆ Proof. Since, by Feature 9, all the jobs of J g+ ∩ J ( S D kˆ ) are scheduled between C g ( S m α ) and D kˆ in S k , their total processing m time must be less than or equal to D kˆ − C g ( S α ). 2

Feature 10a. In particular, the sum of the processing times of the jobs of J g+ ∩ J ( S m ) is also less than or equal to D kˆ − C g ( S m α ). Notice that even though J ( S m ) ∩ J g+ ⊆ J ( S D kˆ ) ∩ J g+ , the sequence of the jobs of J ( S m ) ∩ J g+ in S m is not a sub-

sequence of the sequence of jobs of J ( S D kˆ ) ∩ J g+ in S D kˆ . Indeed, the jobs scheduled after J g in S D kˆ are scheduled in the order of their increasing indices, while the jobs scheduled after J g in S m are partitioned in two groups: the jobs of m + J (Sm α ) ∩ J g , which are ordered by increasing indices, and the jobs of S β , which are also ordered by increasing indices; but m + m there is no guarantee that the jobs of (J ( S m α ∩ J g )) ∪ J ( S β ) are scheduled in the increasing order of their indices in S .

Feature 11. In $S^m_\alpha$, there are no idle times between any pair of jobs of $\{J_g\} \cup (\mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha))$.

Proof. By contradiction, suppose that there is an idle time in $S^m_\alpha$ between a pair of jobs of $\{J_g\} \cup (\mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha))$. Then, since $S^m_\alpha$ is left-shifted (cf. Feature 1), there is at least one job $J_q$ of $\mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha)$ that starts exactly at its release date $r_q$. Moreover, $\{J_q\} \cup (\mathcal{J}_q^+ \cap \mathcal{J}(S^m_\alpha)) \subseteq \mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha) \subseteq \mathcal{J}(S^A_{1:\hat k-1})$ (cf. Feature 6). Furthermore, the jobs of $\{J_q\} \cup (\mathcal{J}_q^+ \cap \mathcal{J}(S^m_\alpha))$ are all scheduled after $r_q$ in $S^m_\alpha$ and in $S^A_{1:\hat k-1}$. Since $S^m_\alpha$ is left-shifted and its jobs are in ERD order, which minimizes the $C_{\max}$ criterion [14], we deduce that $S^A_{1:\hat k-1}$ cannot complete earlier than $C_{\max}(S^m_\alpha) > D_{\hat k-1}$ (cf. Feature 5), which is clearly a contradiction with the definition of $S^A_{1:\hat k-1}$. $\Box$

Proposition 1. $S^m$ is feasible.

Proof. By construction, each job starts at or after its release date in $S^m$. Therefore, schedule $S^m$ is infeasible if and only if $C_{\max}(S^m_\alpha) > B(S^m_\beta)$. The proof is by contradiction: assume that $C_{\max}(S^m_\alpha) > B(S^m_\beta)$.

If we examine $S^m$ between $C_g(S^m_\alpha)$ and $D_{\hat k}$ (see Fig. 4), we see that the jobs of $\mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha)$ are left-shifted, starting from $C_g(S^m_\alpha)$ in a unique block (cf. Features 9a and 11), hence $C_{\max}(S^m_\alpha) = C_g(S^m_\alpha) + p(\mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha))$. On the other hand, the jobs of $S^m_\beta$ are right-shifted, completing at $D_{\hat k}$ in a unique block (cf. Feature 2), hence $B(S^m_\beta) = D_{\hat k} - p(S^m_\beta)$.

$C_{\max}(S^m_\alpha) > B(S^m_\beta)$ implies that $p((\mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha)) \cup \mathcal{J}(S^m_\beta)) = p(\mathcal{J}_g^+ \cap \mathcal{J}(S^m_\alpha)) + p(S^m_\beta) > D_{\hat k} - C_g(S^m_\alpha)$. Hence, $p(\mathcal{J}_g^+ \cap \mathcal{J}(S^m)) > D_{\hat k} - C_g(S^m_\alpha)$ (cf. Feature 8), which is in contradiction with Feature 10a. Consequently, $S^m$ is feasible. $\Box$

5. Absolute performance guarantee

Proposition 2. For any $k \in \{1,\dots,K\}$, we have $U_k - V_k(S^A) \le k - 1$.


Fig. 4. Illustration of schedule $S^m$, where $z < y_3 + p_{f_1} + p_{f_2}$.

Proof. By contradiction, we assume that $\hat k$ is the smallest $k \in \{2,\dots,K\}$ such that $U_k - V_k(S^A) \ge k$. Hence, a feasible partial schedule $S^m$ can be obtained as described in Section 4. We have the following partitions of $\mathcal{J}(S^{D_{\hat k}})$:

$$\mathcal{J}(S^{D_{\hat k}}) = \big(\mathcal{J}(S^{D_{\hat k}}) \cap \mathcal{J}(S^A_{1:\hat k})\big) \cup \big(\mathcal{J}(S^{D_{\hat k}}) \setminus \mathcal{J}(S^A_{1:\hat k})\big)$$
$$= \big(\mathcal{J}(S^{D_{\hat k}}) \cap \mathcal{J}(S^A_{1:\hat k})\big) \cup \{J_{e_1},\dots,J_{e_{\bar a}}\}$$
$$= \big(\mathcal{J}(S^{D_{\hat k}}) \cap \mathcal{J}(S^A_{1:\hat k})\big) \cup G_e \cup \{J_{e_{|G_e|+1}},\dots,J_{e_{\bar a}}\}$$
$$= \mathcal{J}(S^m) \cup \{J_{e_{|G_e|+1}},\dots,J_{e_{\bar a}}\}.$$

Hence, $\mathcal{J}(S^{D_{\hat k}}) \cap \mathcal{J}_g^+ = (\mathcal{J}(S^m) \cap \mathcal{J}_g^+) \cup \{J_{e_{|G_e|+1}},\dots,J_{e_{\bar a}}\}$, since $\{J_{e_{|G_e|+1}},\dots,J_{e_{\bar a}}\} \subseteq \mathcal{J}_g^+$, because $g < e_{|G_e|+1}$, by construction of $G$ and by definition of $g$.

Recall that $\mathcal{J}(S^m) \cap \mathcal{J}_g^+ = (\mathcal{J}(S^m_\alpha) \cap \mathcal{J}_g^+) \cup \mathcal{J}(S^m_\beta)$, since $\mathcal{J}(S^m_\beta) \subseteq \mathcal{J}_g^+$ (cf. Feature 8). Then, since the three sets $\mathcal{J}(S^m_\alpha) \cap \mathcal{J}_g^+$, $\mathcal{J}(S^m_\beta)$ and $\{J_{e_{|G_e|+1}},\dots,J_{e_{\bar a}}\}$ are disjoint, they constitute a partition of $\mathcal{J}(S^{D_{\hat k}}) \cap \mathcal{J}_g^+$. Consequently, $p(\mathcal{J}(S^{D_{\hat k}}) \cap \mathcal{J}_g^+) = p(\mathcal{J}(S^m_\alpha) \cap \mathcal{J}_g^+) + p(S^m_\beta) + p(\{J_{e_{|G_e|+1}},\dots,J_{e_{\bar a}}\}) \le D_{\hat k} - C_g(S^m_\alpha)$ (cf. Feature 10).

We have $p(\mathcal{J}(S^m_\alpha) \cap \mathcal{J}_g^+) = C_{\max}(S^m_\alpha) - C_g(S^m_\alpha)$, since the jobs of $\mathcal{J}(S^m_\alpha) \cap \mathcal{J}_g^+$ are scheduled without idle times between $C_g(S^m_\alpha)$ and $C_{\max}(S^m_\alpha)$ in $S^m$ (cf. Feature 11). Moreover, $p(S^m_\beta) = D_{\hat k} - B(S^m_\beta)$, since the jobs of $\mathcal{J}(S^m_\beta)$ are scheduled without idle times between $B(S^m_\beta)$ and $D_{\hat k}$ in $S^m$ (cf. Feature 2).

We deduce that $p(\{J_{e_{|G_e|+1}},\dots,J_{e_{\bar a}}\}) \le D_{\hat k} - C_g(S^m_\alpha) - (C_{\max}(S^m_\alpha) - C_g(S^m_\alpha)) - (D_{\hat k} - B(S^m_\beta)) = B(S^m_\beta) - C_{\max}(S^m_\alpha)$, which can be written as:

$$\sum_{i=|G_e|+1}^{\bar a} p_{e_i} \le B(S^m_\beta) - C_{\max}(S^m_\alpha).$$

Moreover, we have $B(S^m_\beta) - C_{\max}(S^m_\alpha) < B(S^m_\beta) - C_{\max}(S^A_{1:\hat k-1})$ because $C_{\max}(S^A_{1:\hat k-1}) \le D_{\hat k-1} < C_{\max}(S^m_\alpha)$ (cf. Feature 5). Additionally, $B(S^m_\beta) = B(S^A_{\hat k}) + \sum_{j=1}^{|G_f|} p_{f_j}$, by construction of $S^m$ (see Fig. 3).

Hence, $B(S^m_\beta) - C_{\max}(S^A_{1:\hat k-1}) = B(S^A_{\hat k}) - C_{\max}(S^A_{1:\hat k-1}) + \sum_{j=1}^{|G_f|} p_{f_j} = y_{\hat k} + \sum_{j=1}^{|G_f|} p_{f_j}$, by definition of $y_{\hat k}$.

From what precedes, $\sum_{i=|G_e|+1}^{\bar a} p_{e_i} < y_{\hat k} + \sum_{j=1}^{|G_f|} p_{f_j}$, which can be written as:

$$\sum_{i=1}^{\bar a - |G_e|} p_{e_{i+|G_e|}} < y_{\hat k} + \sum_{j=1}^{|G_f|} p_{f_j}.$$

Notice that $\bar a - |G_e| > |G_f|$, since $|G_f| = |G| - |G_e| = d + \hat k - 1 - |G_e|$ and $d \le \bar a - \hat k$ (cf. Section 4.1). Moreover, for all $i \in \{1,\dots,|G_f|\}$:

• $J_{e_{i+|G_e|}} \in \mathcal{J}(S^{D_{\hat k}}) \setminus \mathcal{J}(S^A_{1:\hat k})$,
• $J_{f_i} \in \mathcal{J}(S^A_{\hat k})$, and
• $\mathcal{J}_{f_i}^+ \cap \mathcal{J}(S^A_{\hat k}) \subseteq \mathcal{J}(S^{D_{\hat k}})$ (since $\mathcal{J}(S^A_{\hat k}) \subseteq \mathcal{J}(S^{D_{\hat k}})$ by Lemma 2).

Therefore, by Lemma 1: $\forall i \in \{1,\dots,|G_f|\}$, $p_{e_{i+|G_e|}} \ge p_{f_i}$. Hence, $\sum_{i=|G_f|+1}^{\bar a - |G_e|} p_{e_{i+|G_e|}} < y_{\hat k}$; and this sum has at least one term, since $\bar a - |G_e| > |G_f|$. Therefore, $p_{e_{\bar a}} < y_{\hat k}$, while, from Lemma 3, $p_{e_{\bar a}} > y_{\hat k}$, which is a contradiction. $\Box$

Theorem 1. $v(S^A) \ge OPT - \sum_{k=2}^{K} \gamma_k$.


Fig. 5. An instance for which $h = 5$ and $OPT - v(S^A) = \sum_{k=2}^{4} \gamma_k$. On top, an optimal schedule where all the jobs are executed in the order $J_1, J_2, J_3, J_4, J_5, J_6, J_7$. Below, the schedule provided by the approximation algorithm, which contains jobs $J_2, J_4, J_6, J_7$.

Proof. By Proposition 2, we have (with the convention $\gamma_{K+1} = 0$):

$$U_k - V_k(S^A) \le k - 1, \quad \forall k \in \{1,\dots,K\}$$
$$\Rightarrow\ (\gamma_k - \gamma_{k+1})\big(U_k - V_k(S^A)\big) \le (k-1)(\gamma_k - \gamma_{k+1}), \quad \forall k \in \{1,\dots,K\}$$
$$\Rightarrow\ \sum_{k=1}^{K} (\gamma_k - \gamma_{k+1}) U_k - \sum_{k=1}^{K} (\gamma_k - \gamma_{k+1}) V_k(S^A) \le \sum_{k=1}^{K} (k-1)(\gamma_k - \gamma_{k+1})$$
$$\Rightarrow\ u\_bound - v(S^A) \le \gamma_2 - \gamma_3 + 2\gamma_3 - 2\gamma_4 + 3\gamma_4 - 3\gamma_5 + \dots + (K-2)\gamma_{K-1} - (K-2)\gamma_K + (K-1)\gamma_K$$
$$\Rightarrow\ u\_bound - v(S^A) \le \sum_{k=2}^{K} \gamma_k.$$

Since $OPT \le u\_bound$, the result follows. $\Box$
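The telescoping step above can be checked numerically. The payoff values in the sketch below are hypothetical, chosen only to satisfy $\gamma_1 > \dots > \gamma_K > 0$:

```python
# Hypothetical strictly decreasing payoffs gamma_1 > ... > gamma_K > 0.
gammas = [40, 25, 17, 9, 4]
K = len(gammas)
g = gammas + [0]  # convention gamma_{K+1} = 0

# sum_{k=1}^{K} (k-1)(gamma_k - gamma_{k+1}) telescopes to sum_{k=2}^{K} gamma_k.
telescoped = sum((k - 1) * (g[k - 1] - g[k]) for k in range(1, K + 1))
direct = sum(gammas[1:])
print(telescoped, direct)  # both equal 55

# Special case gamma_k = K - k + 1 (cf. [2]): the bound equals K(K-1)/2.
special = list(range(K, 0, -1))
print(sum(special[1:]), K * (K - 1) // 2)  # both equal 10
```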

Some special cases. Algorithm APPROX guarantees that at each delivery date, at most one job is "missed" in comparison with an optimal solution. As a consequence, the smaller the values $\gamma_k$, for $k = 2,\dots,K$, the better the approximation guarantee. In particular, for the case studied in [2], where $\gamma_k = K - k + 1$, $k = 1,\dots,K$, the performance guarantee equals $K(K-1)/2$. In the latter case, if $K = 2$, we even have an absolute performance guarantee equal to 1, which is the best possible guarantee, since all payoffs are integer. More generally, an absolute performance guarantee equal to 1 is obtained for the two-delivery-date problem whenever $\gamma_2 = 1$, whatever the value of $\gamma_1$.

Tightness property. The approximation guarantee stated in Theorem 1 is tight, as shown by the following family of instances, with $h+2$ jobs and $h-1$ delivery dates, $h > 2$, and:

• $p_i = 2^{h-i+1}$, $i = 1,\dots,h+1$, and $p_{h+2} = 2^0$;
• $r_1 = 0$ and $r_i = \sum_{j=1}^{i-1} p_j$, for $i$ odd in $\{3,\dots,h+2\}$;
• $r_i = r_{i-1} + p_i$, for $i$ even in $\{2,\dots,h+2\}$;
• $D_1 = p_1$ and $D_i = D_{i-1} + p_{2(i-1)} + p_{2(i-1)+1}$, for $i > 1$;
• $\gamma_1,\dots,\gamma_{h-1}$ are arbitrary.

For each of these instances, we have $OPT = \gamma_1 + 2\sum_{k=2}^{K} \gamma_k$, while $v(S^A) = \sum_{k=1}^{K} \gamma_k$, where $K = h - 1$. An example for $h = 5$ is given in Fig. 5.
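This family can be generated programmatically. The sketch below (the function name is ours) reproduces the instance of Fig. 5 for $h = 5$:

```python
def theorem1_tight_instance(h):
    """Tightness instance of Theorem 1 for a given h > 2 (function name ours).
    Returns 1-based instance data as 0-indexed lists: processing times p,
    release dates r, and the K = h - 1 delivery dates D."""
    p = [2 ** (h - i + 1) for i in range(1, h + 2)] + [1]  # p_{h+2} = 2^0
    n = h + 2
    r = [0] * n
    for i in range(2, n + 1):                # job index i (1-based)
        if i % 2 == 0:
            r[i - 1] = r[i - 2] + p[i - 1]   # r_i = r_{i-1} + p_i (i even)
        else:
            r[i - 1] = sum(p[:i - 1])        # r_i = p_1 + ... + p_{i-1} (i odd)
    D = [p[0]]                               # D_1 = p_1
    for i in range(2, h):                    # D_i = D_{i-1} + p_{2(i-1)} + p_{2(i-1)+1}
        D.append(D[-1] + p[2 * (i - 1) - 1] + p[2 * (i - 1)])
    return p, r, D

p, r, D = theorem1_tight_instance(5)
print(p)  # [32, 16, 8, 4, 2, 1, 1]
print(r)  # [0, 16, 48, 52, 60, 61, 63]
print(D)  # [32, 56, 62, 64]
```

With these data, the optimal schedule of Fig. 5 completes one job by $D_1$ and two jobs in each subsequent interval, matching $OPT = \gamma_1 + 2\sum_{k=2}^{K}\gamma_k$.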

6. Relative performance guarantee

In this section, we show that algorithm APPROX has an approximation ratio equal to $1/2$ and that this bound is tight.

Notations. We extend the notations $U_k$ and $S^A$ to $U_k(\pi)$ and $S^A(\pi)$, for a given instance $\pi$.

Definition 1. Given an instance $\pi$, we construct instances $\pi_k$, $k = 1,\dots,K$, as follows. The jobs of $\pi_k$ are the jobs of $\pi$. Let $\Phi_k = \{k\} \cup \{f \in \{1,\dots,k-1\} \mid V_f(S^A(\pi)) - V_{f-1}(S^A(\pi)) > 0\}$ (we set $V_0(S^A(\pi)) = 0$). The delivery dates of $\pi_k$ are $\{D_f \mid f \in \Phi_k\}$. In other words, to construct $\pi_k$ we remove from $\pi$ each delivery date $D_f$ such that $f > k$ or no job completes in $]D_{f-1}, D_f]$ in $S^A(\pi)$, except for $D_k$, which is not removed even if it satisfies the latter condition. We denote by $V_f(S^A(\pi_k))$ the number of jobs scheduled by APPROX in the interval $[0, D_f]$ for instance $\pi_k$, where $D_f$ is the $f$-th delivery date of $\pi$, $f \in \Phi_k$, $k = 1,\dots,K$.
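The construction of the retained index set in Definition 1 can be sketched as follows (names are ours; `V[f]` stands for $V_f(S^A(\pi))$, with `V[0] = 0`):

```python
def reduced_date_indices(k, V):
    """Indices f of the delivery dates D_f kept in the reduced instance pi_k,
    given V[f] = number of jobs completed by APPROX within [0, D_f] on the
    original instance (V[0] = 0). Names are ours."""
    kept = {f for f in range(1, k) if V[f] - V[f - 1] > 0}
    kept.add(k)                        # D_k is kept in any case
    return sorted(kept)

# Hypothetical run with K = 5: APPROX completes jobs only by D_1 and D_4.
V = [0, 2, 2, 2, 3, 3]
print(reduced_date_indices(4, V))      # [1, 4]
print(reduced_date_indices(5, V))      # [1, 4, 5]
```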


Lemma 4. The following equalities hold, for any $k \in \{1,\dots,K\}$:

• $U_f(S^A(\pi_k)) = U_f(S^A(\pi))$, $f \in \Phi_k$.
• $V_f(S^A(\pi_k)) = V_f(S^A(\pi))$, $f \in \Phi_k$.

Proof. For each equality of the first set, the quantity on each side is the number of jobs scheduled by SDD-algorithm taking as input the same set of jobs (the jobs of $\pi$) and the same interval $[0, D_f]$. Hence, the two values are equal.

Regarding the second set of equalities, we claim that schedules $S^A(\pi_k)$ and $S^A(\pi)$ are identical on the interval $[0, D_f]$. For both $\pi$ and $\pi_k$, we call step $m$ of the algorithm the step consisting in applying SDD-algorithm on the interval between the completion time of the last job scheduled at the previous step (completing at or before $D_{m-1}$) and $D_m$. Some steps performed for $\pi$ are not performed for $\pi_k$. By induction, suppose that the same schedule is constructed for both instances until step $m-1$ (which is true for $m = 1$). Let us show that the two schedules are also equal right after step $m$. If no job is scheduled at step $m$ for $\pi$, no step $m$ is performed for $\pi_k$. In both schedules, no job is added, and the schedules are equal. Otherwise, since the schedules are equal until the previous step, SDD-algorithm is applied in both cases on the same set of jobs and on the same interval. Hence, the same sequence of jobs is added to the schedule. $\Box$

Proposition 3. For any instance $\pi$, we have $V_k(S^A(\pi)) \ge U_k(\pi)/2$, $k = 1,\dots,K$.

Proof. The assertion is true for $k = 1$, since by construction $V_1(S^A) = U_1$: both values are obtained by applying SDD-algorithm on the same interval for the same set of jobs. For a given $k \in \{2,\dots,K\}$, if $U_k(\pi) = 0$, then $V_k(S^A(\pi)) = 0$ as well, and the proposition holds. Now, suppose that $U_k(\pi) \ge 1$ and let us prove the proposition by contradiction. Suppose that there exists $k \in \{2,\dots,K\}$ such that $U_k(\pi) > 2V_k(S^A(\pi))$, or equivalently that $U_k(\pi) - V_k(S^A(\pi)) > V_k(S^A(\pi))$. By Proposition 2, $U_k(\pi) - V_k(S^A(\pi)) \le k - 1$, hence $V_k(S^A(\pi)) \le k - 2$.
We have $V_k(S^A(\pi)) > 0$ (otherwise it can easily be shown that $U_k(\pi) = 0$), so let $q \in \{2,\dots,k-1\}$ be such that $V_k(S^A(\pi)) = k - q$. Then, by hypothesis, $U_k(\pi) - V_k(S^A(\pi)) > k - q$. $V_k(S^A(\pi)) = k - q$ also implies that, in $S^A(\pi)$, there are at least $q$ intervals in which no job completes. Let us consider the instance $\pi_k$ obtained from $\pi$ as described in Definition 1. In $\pi_k$, there are at most $k - q + 1$ delivery dates, hence, by Proposition 2, $U_k(\pi_k) - V_k(S^A(\pi_k)) \le k - q$. Finally, by Lemma 4, $U_k(\pi_k) = U_k(\pi)$ and $V_k(S^A(\pi_k)) = V_k(S^A(\pi))$, hence $U_k(\pi) - V_k(S^A(\pi)) \le k - q$, which is a contradiction. $\Box$

Theorem 2. If at least one job can be completed before $D_K$, then $v(S^A) > OPT/2$.

Proof. We have:

$$OPT \le U_1\gamma_1 + \sum_{k=2}^{K} (U_k - U_{k-1})\gamma_k = U_1(\gamma_1 - \gamma_2) + \sum_{k=2}^{K-1} U_k(\gamma_k - \gamma_{k+1}) + U_K\gamma_K$$

and:

$$v(S^A) = V_1\gamma_1 + \sum_{k=2}^{K} (V_k - V_{k-1})\gamma_k = V_1(\gamma_1 - \gamma_2) + \sum_{k=2}^{K-1} V_k(\gamma_k - \gamma_{k+1}) + V_K\gamma_K.$$

By Proposition 3 we have:

$$\sum_{k=2}^{K-1} V_k(\gamma_k - \gamma_{k+1}) + V_K\gamma_K \ge \frac{1}{2}\left(\sum_{k=2}^{K-1} U_k(\gamma_k - \gamma_{k+1}) + U_K\gamma_K\right).$$

Moreover, by hypothesis, there exists an index $k \in \{1,\dots,K\}$ such that $U_k > 0$. Let $m$ be the smallest such index. Then, by construction, $V_m = U_m \ge 1$, since both values result from the application of SDD-algorithm on the same interval for the same set of jobs. Therefore, as the payoffs are strictly decreasing (so that $\gamma_m - \gamma_{m+1} > 0$, with the convention $\gamma_{K+1} = 0$), $V_m(\gamma_m - \gamma_{m+1}) = U_m(\gamma_m - \gamma_{m+1}) > \frac{1}{2} U_m(\gamma_m - \gamma_{m+1})$. Hence the result holds. $\Box$

Tightness property. The approximation ratio stated in Theorem 2 is tight, as shown by the following family of instances, with $2h$ jobs and $2h$ delivery dates, for $h \ge 1$, and:

• $D_1 = 2^{h+1}$ and $D_m = D_{m-1} + 2^{h-\lceil m/2 \rceil + 1}$, for $m = 2,\dots,2h$;
• $r_1 = 0$ and $r_{2i+1} = D_{2i}$, for $i = 1,\dots,h-1$;
• $r_{2i} = r_{2i-1} + 2^{h-i+1}$, for $i = 1,\dots,h$;
• $p_m = 2^{h-\lceil m/2 \rceil + 1}$, $m = 1,\dots,2h$;
• $\gamma_1,\dots,\gamma_{2h}$ arbitrary.
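This family can also be generated programmatically. In the sketch below the function name is ours, and we read the exponent $h - m/2 + 1$ as $h - \lceil m/2 \rceil + 1$, which makes the increment of $D_m$ equal to $p_m$:

```python
import math

def theorem2_tight_instance(h):
    """Tightness instance of Theorem 2 for h >= 1 (function name ours; the
    exponent h - m/2 + 1 is read here as h - ceil(m/2) + 1)."""
    p = [2 ** (h - math.ceil(m / 2) + 1) for m in range(1, 2 * h + 1)]
    D = [2 ** (h + 1)]                       # D_1
    for m in range(2, 2 * h + 1):
        D.append(D[-1] + p[m - 1])           # the increment equals p_m
    r = [0] * (2 * h)
    for i in range(1, h + 1):
        r[2 * i - 1] = r[2 * i - 2] + 2 ** (h - i + 1)   # r_{2i}
        if i <= h - 1:
            r[2 * i] = D[2 * i - 1]                      # r_{2i+1} = D_{2i}
    return p, r, D

p, r, D = theorem2_tight_instance(2)
print(p)  # [4, 4, 2, 2]
print(r)  # [0, 4, 12, 14]
print(D)  # [8, 12, 14, 16]
```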

JID:TCS AID:9907 /FLA

Doctopic: Algorithms, automata, complexity and games

[m3G; v1.141; Prn:17/10/2014; 11:25] P.13 (1-18)

Y. Seddik et al. / Theoretical Computer Science ••• (••••) •••–•••

13

Fig. 6. An instance for which $v(S^A)/OPT$ tends to $1/2$, where $h = 3$.

For each of these instances, we have $OPT = \sum_{i=1}^{h} (\gamma_{2i-1} + \gamma_{2i})$, while $v(S^A) = \sum_{i=1}^{h} \gamma_{2i-1}$. When the values of $\gamma_1,\dots,\gamma_{2h}$ are very large, and the values $\gamma_{2i-1}$ and $\gamma_{2i}$ are very close for all $i$, the approximation ratio tends to $1/2$. An example for $h = 3$ is given in Fig. 6.

7. Experiments

In this section, we provide experimental results in order to evaluate the average behavior of the algorithm on randomly generated instances.

7.1. Instance generator

The instance generator takes the following inputs: the number of jobs $N$, the number of delivery dates $K$, a parameter $A \in\ ]0,1]$, a parameter $R \in\ ]0,1]$, and a parameter $G \in \mathbb{N}$. The processing times of the jobs are generated first, each being picked from a uniform distribution over $\{10,\dots,99\}$.

The delivery dates are generated as follows. First, a value $d_k$ is picked from a uniform distribution over $\{1,\dots,1000\}$. Then, we set $\Delta_1 = d_1$ and $\Delta_k = \Delta_{k-1} + d_k$, $k = 2,\dots,K$. Finally, we obtain each delivery date as $D_k = (\Delta_k/\Delta_K) \times A \times \sum_{i=1}^{N} p_i$, $k = 1,\dots,K$.

The release date $r_i$ associated with processing time $p_i$ is chosen in one of the intervals $R_k = [D_{k-1}, D_{k-1} + R \times D_1]$, $k = 1,\dots,K$ (with $D_0 = 0$). To avoid $J_i$ being trivially late (i.e. $r_i + p_i > D_K$), $r_i$ is chosen as follows. First, determine in which intervals $R_k$ the value $r_i$ can be chosen: let $k_i = \max_{k=1,\dots,K} \{k \mid D_{k-1} + p_i \le D_K\}$. An interval $R_k$ is chosen randomly among $R_1,\dots,R_{k_i}$ (each with a probability of $1/k_i$). Finally, $r_i$ is picked from $R_k$ (uniform distribution), however avoiding the possibly forbidden dates of $R_{k_i}$.

Values $\gamma_1,\dots,\gamma_K$ are generated from parameter $G$: first, $\gamma_K$ is picked from a uniform distribution over $\{1,\dots,G\}$. Then, $\gamma_k = \gamma_{k+1} + g_k$, where $g_k$ is picked from a uniform distribution over $\{1,\dots,G\}$, $k = K-1,\dots,1$.
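The generator above can be sketched as follows. Helper names are ours; we use floating-point release dates for brevity, whereas the paper assumes integer data, and we take $D_0 = 0$:

```python
import random

def generate_instance(N, K, A, R, G, seed=None):
    """Sketch of the instance generator described above (names are ours)."""
    rng = random.Random(seed)
    p = [rng.randint(10, 99) for _ in range(N)]            # processing times
    # Delivery dates: cumulative random offsets rescaled to A * sum(p).
    deltas, acc = [], 0
    for _ in range(K):
        acc += rng.randint(1, 1000)
        deltas.append(acc)
    D = [d / deltas[-1] * A * sum(p) for d in deltas]
    # Release dates: pick an interval R_k = [D_{k-1}, D_{k-1} + R * D_1]
    # among those keeping the job non-trivially late (D_0 = 0).
    r = []
    for pi in p:
        ki = max(k for k in range(1, K + 1)
                 if (0 if k == 1 else D[k - 2]) + pi <= D[-1])
        k = rng.randint(1, ki)
        lo = 0 if k == 1 else D[k - 2]
        hi = lo + R * D[0]
        if k == ki:                    # avoid the forbidden dates of R_{k_i}
            hi = min(hi, D[-1] - pi)
        r.append(rng.uniform(lo, hi))
    # Strictly decreasing payoffs from random increments in {1, ..., G}.
    gammas = [rng.randint(1, G)]
    for _ in range(K - 1):
        gammas.append(gammas[-1] + rng.randint(1, G))
    gammas.reverse()                   # gamma_1 > ... > gamma_K
    return p, r, D, gammas
```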
The parameters were chosen as follows: $K \in \{3, 5, 8, 10, 20\}$; $N = n \times K$, where $n \in \{10, 20, 30, 50, 100\}$; $A \in \{0.3, 0.5, 0.8, 1, 1.2\}$; $R \in \{0.1, 0.3, 0.5, 0.7, 1\}$; $G \in \{10, 20, 50, 100\}$. For each of these 2500 configurations, 10 instances were generated.

7.2. Numerical results

In Table 1, mean and worst values of both approximation measures are provided with regard to the number of delivery dates and the number of jobs. In Table 2, results are presented with regard to the values of parameters $A$ and $R$. We use the following notations:

• UB: upper bound on the payoff of an optimal solution, computed as described in Section 2.3.
• RG: ratio on the absolute guarantee, i.e. $(UB - v(S^A))/\sum_{k=2}^{K} \gamma_k$. RG is the ratio between the measured absolute optimality gap and the theoretical absolute guarantee. RG is at most equal to 1, since the absolute guarantee is an upper bound on the absolute optimality gap.
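The RG measure is a one-line computation; the values below are made up purely for illustration:

```python
def rg(ub, v_apx, gammas):
    """Measured absolute gap divided by the theoretical guarantee
    sum_{k>=2} gamma_k (helper name ours)."""
    return (ub - v_apx) / sum(gammas[1:])

# Made-up values: UB = 100, v(S^A) = 97, K = 5 payoffs.
print(rg(100, 97, [40, 25, 17, 9, 4]))   # 3/55, about 0.0545
```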

As can be seen from the mean and worst approximation ratios in Tables 1 and 2, the algorithm very often provides an optimal solution. Regarding the absolute approximation guarantee, the values in the fifth column of each table show that the algorithm very often yields solutions that are much better than the worst case (average RG: 0.054). Even though RG is very low for the majority of our instances, there exist a few instances for which RG is larger (worst RG: 0.63), which is in accordance with the above tightness analysis. However, the solutions provided by the algorithm remain good, as the approximation ratio is at least 0.89 over all instances. The approximation ratio guarantee of 1/2 occurs for pathological instances, in which few jobs can be executed per delivery date, while in real-world instances several jobs can be executed per delivery date.

8. Conclusion

We have presented a polynomial time approximation algorithm for a strongly NP-hard scheduling problem with release dates, common delivery dates and a common stepwise job payoff function. The proposed algorithm has an absolute


Table 1
Results by number of delivery dates and number of jobs.

Number of        Number    Mean ratio     Min ratio      Mean RG   Max RG
delivery dates   of jobs   (v(S^A)/UB)    (v(S^A)/UB)
3                30        1              0.9            0.028     0.48
3                60        1              0.95           0.033     0.45
3                90        1              0.95           0.033     0.58
3                150       1              0.98           0.035     0.6
3                300       1              0.99           0.025     0.62
5                50        0.99           0.89           0.046     0.47
5                100       1              0.94           0.048     0.63
5                150       1              0.97           0.045     0.44
5                250       1              0.98           0.053     0.63
5                500       1              0.99           0.047     0.54
8                80        0.99           0.92           0.058     0.39
8                160       1              0.96           0.059     0.4
8                240       1              0.97           0.057     0.42
8                400       1              0.98           0.057     0.44
8                800       1              0.99           0.061     0.43
10               100       0.99           0.93           0.061     0.35
10               200       1              0.96           0.062     0.35
10               300       1              0.97           0.064     0.42
10               500       1              0.99           0.063     0.4
10               1000      1              0.99           0.064     0.45
20               200       0.99           0.94           0.07      0.31
20               400       0.99           0.96           0.069     0.34
20               600       1              0.98           0.069     0.29
20               1000      1              0.98           0.07      0.43
20               2000      1              0.99           0.066     0.37
Mean values                1                             0.054

Table 2
Results by values of parameters A and R.

A     R     Mean ratio     Min ratio      Mean RG   Max RG
            (v(S^A)/UB)    (v(S^A)/UB)
0.3   0.1   1              0.94           0.047     0.27
0.3   0.3   0.99           0.92           0.054     0.33
0.3   0.5   0.99           0.93           0.067     0.4
0.3   0.7   0.99           0.92           0.084     0.42
0.3   1     0.99           0.89           0.16      0.63
0.5   0.1   1              0.96           0.041     0.27
0.5   0.3   1              0.96           0.053     0.39
0.5   0.5   1              0.94           0.059     0.39
0.5   0.7   0.99           0.93           0.08      0.44
0.5   1     0.99           0.92           0.16      0.63
0.8   0.1   1              0.97           0.031     0.24
0.8   0.3   1              0.98           0.036     0.34
0.8   0.5   1              0.96           0.046     0.38
0.8   0.7   1              0.95           0.056     0.37
0.8   1     0.99           0.95           0.13      0.6
1     0.1   1              0.98           0.014     0.18
1     0.3   1              0.97           0.019     0.27
1     0.5   1              0.96           0.024     0.34
1     0.7   1              0.97           0.031     0.29
1     1     1              0.96           0.077     0.45
1.2   0.1   1              0.99           0.005     0.14
1.2   0.3   1              0.99           0.0075    0.27
1.2   0.5   1              0.97           0.0093    0.28
1.2   0.7   1              0.98           0.014     0.31
1.2   1     1              0.96           0.033     0.42
Mean values 1                             0.054

performance guarantee, which can be interpreted as follows: at each delivery date, the difference between the number of jobs delivered in the optimal solution and in the approximate solution is at most 1. Moreover, the approximation algorithm is simple and has a time complexity of $O(KN \log N)$, $N$ being the number of jobs and $K$ the number of delivery dates. We also showed that this algorithm has a worst-case approximation ratio equal to $1/2$. These two approximation guarantees


are tight. Finally, experimental results on randomly generated instances show that in practice the approximation algorithm yields much better results than its worst cases.

Future research might investigate the existence of a PTAS for this problem, which remains an open question. Further research could also be conducted on problems with stepwise payoff functions where the breakpoints are common to all jobs and the number of breakpoints is fixed. For the single machine problem with common stepwise payoff functions and release dates, a pseudopolynomial time algorithm is given in [2]. Janiak and Krysiak [5] provide a pseudopolynomial time algorithm for the single machine problem where the stepwise payoff functions are job-dependent, except for the breakpoints, which are common to all jobs, and without release dates. For these problems, it would be interesting to investigate whether an FPTAS can be derived from the pseudopolynomial time algorithms. Moreover, we are not aware of other approximation algorithms for scheduling problems with stepwise job payoff/cost functions. The most general stepwise cost criterion is the one where each job has its own stepwise cost function. For scheduling problems with this objective function or its special cases, with or without release dates, on single or parallel machines, or in shop configurations, it would be interesting to design approximation algorithms with performance guarantees.

Acknowledgements

This work was supported by FUI project Dem@tFactory, financed by DGCIS, Conseil général de Seine-et-Marne, Conseil général de Seine-Saint-Denis, Ville de Paris, Pôle de compétitivité Cap Digital.

Appendix A. Proofs

Proof of Lemma 1. By contradiction, suppose that there exist two jobs $J_d$ and $J_l$ satisfying the hypotheses of Lemma 1, and such that $p_d < p_l$. By Property 3, the jobs of $\mathcal{J}_l^+ \cap \mathcal{J}(S^A_k)$ are exactly the jobs scheduled after $J_l$ in $S^A_k$. Let $b_l$ be the

starting time of $J_l$ in $S^A_k$. Since we only focus on the interval $[0, D_k]$, we can assume without loss of generality that $S^A_k$ is a right-shifted block completing at $D_k$. We have $b_l = D_k - p(\mathcal{J}_l^+ \cap \mathcal{J}(S^A_k)) - p_l$ (see Fig. 7(a)). Let us now consider $S^{D_k}$, and suppose that we remove from it all the jobs, except $J_d$ and the jobs of $\mathcal{J}_l^+ \cap \mathcal{J}(S^A_k)$. We obtain a feasible (partial) schedule (see Fig. 7(b)). If the jobs of the obtained schedule are then right-shifted so that the last one completes at $D_k$, the schedule remains feasible. Moreover, the obtained schedule $S$ includes $J_d$ and the jobs of $\mathcal{J}_l^+ \cap \mathcal{J}(S^A_k)$, and starts at $b_l + p_l - p_d > b_l$, since $p_d < p_l$. Therefore, there exists a feasible partial schedule $\tilde S^A_k$ (see Fig. 7(c)) without idle times and completing at $D_k$, scheduling the following jobs: first the jobs of $\mathcal{J}_l^- \cap \mathcal{J}(S^A_k)$ in the same order as in $S^A_k$, followed by $S$. Moreover, $\tilde S^A_k$ schedules only jobs of $\mathcal{J}^{all} \setminus \mathcal{J}(S^A_{1:k-1})$, into $[C_{\max}(S^A_{k-1}), D_k]$; and since $\mathcal{J}(\tilde S^A_k) = (\mathcal{J}(S^A_k) \setminus \{J_l\}) \cup \{J_d\}$, we have $|\mathcal{J}(\tilde S^A_k)| = |\mathcal{J}(S^A_k)|$ and $p(\tilde S^A_k) < p(S^A_k)$, since $p_d < p_l$. This is in contradiction with Property 2 on $S^A_k$. $\Box$

Proof of Lemma 2. We assume by contradiction that there exists at least one job in $\mathcal{J}(S^A_k) \setminus \mathcal{J}(S^{D_k})$. Let $J_l$ be the last scheduled job in $S^A_k$ among the jobs of $\mathcal{J}(S^A_k) \setminus \mathcal{J}(S^{D_k})$. We prove the following three claims.

Claim 1. For all $J_d \in \mathcal{J}(S^{D_k}) \setminus \mathcal{J}(S^A_{1:k})$:

• $d < l$ ($\alpha$);
• $p_d > p_l$ ($\beta$).

Claim 2. $\tilde S^{D_k} = \mathrm{SDD}(\mathcal{J}(S^{D_k}) \cup \{J_l\}, 0, D_k)$ is equal to $S^{D_k}$.

Claim 3. $J_l$ belongs to $\tilde S^{D_k} = \mathrm{SDD}(\mathcal{J}(S^{D_k}) \cup \{J_l\}, 0, D_k)$.

Claims 2 and 3 imply that $J_l$ belongs to $S^{D_k}$, which is in contradiction with the initial assumption. Let us prove Claims 1, 2 and 3. We assume, without loss of generality, that $S^A_k$ is a right-shifted block completing at $D_k$.

Claim 1. For all $J_d \in \mathcal{J}(S^{D_k}) \setminus \mathcal{J}(S^A_{1:k})$:

• $d < l$ ($\alpha$);
• $p_d > p_l$ ($\beta$).

Proof. By hypothesis, there exists at least one job in $\mathcal{J}(S^A_k) \setminus \mathcal{J}(S^{D_k}) \subseteq \mathcal{J}(S^A_{1:k}) \setminus \mathcal{J}(S^{D_k})$. Then, there exists at least one job in $\mathcal{J}(S^{D_k}) \setminus \mathcal{J}(S^A_{1:k})$, since $|\mathcal{J}(S^{D_k})| \ge |\mathcal{J}(S^A_{1:k})|$, from Property 1 on $S^{D_k}$.


Fig. 7. Illustrating the proof of Lemma 1 on an example where $k = 3$. The squared jobs are the jobs of $\mathcal{J}_l^+ \cap \mathcal{J}(S^A_k)$.

Recall that $J_l$ is the last scheduled job in $S^A_k$ among the jobs of $\mathcal{J}(S^A_k) \setminus \mathcal{J}(S^{D_k})$. We have $J_l \in \mathcal{J}(S^A_k)$, and all the jobs scheduled after $J_l$ in $S^A_k$ belong to $S^{D_k}$: $\mathcal{J}_l^+ \cap \mathcal{J}(S^A_k) \subseteq \mathcal{J}(S^{D_k})$. Then, by Lemma 1, for any $d$ such that $J_d \in \mathcal{J}(S^{D_k}) \setminus \mathcal{J}(S^A_{1:k})$ we have $p_d \ge p_l$.

Let us show that $d < l$. By contradiction, suppose that $d \ge l$. Since $J_l \in \mathcal{J}(S^A_{1:k})$ while $J_d \notin \mathcal{J}(S^A_{1:k})$, $d \ne l$. Therefore, $d > l$, which implies $r_d \ge r_l$. Hence, and since $p_d \ge p_l$, $J_d$ can be replaced by $J_l$ in $S^{D_k}$, which yields a schedule $S'^{D_k}$ with as many jobs as in $S^{D_k}$. There are two cases. If $p_d > p_l$, then $p(S'^{D_k}) < p(S^{D_k})$, which is in contradiction with Property 2 on $S^{D_k}$. Otherwise, $p_d = p_l$, and then $p(S'^{D_k}) = p(S^{D_k})$, but $S'^{D_k}$ contains a job with a smaller index ($J_l$, instead of $J_d$ in $S^{D_k}$), which is in contradiction with Property 4 on $S^{D_k}$. Hence, $d < l$ (which implies $r_d \le r_l$).

Let us show that $p_d > p_l$. Since $p_d \ge p_l$, we suppose by contradiction that $p_d = p_l$. Then, $J_l$ can be replaced by $J_d$ in $S^A_k$, which yields a schedule with as many jobs as $S^A_k$, whose total processing time is $p(S^A_k)$, and containing a job with a smaller index. This is in contradiction with Property 4 on $S^A_k$. Thus, $p_d > p_l$. $\Box$

Claim 2. $\tilde S^{D_k} = \mathrm{SDD}(\mathcal{J}(S^{D_k}) \cup \{J_l\}, 0, D_k)$ is equal to $S^{D_k}$.

Proof. Fig. 8 shows the distribution of the jobs in $S^A_{1:k}$. From Property 3 on $S^A_j$, the jobs of $S^A_j$ are scheduled in ERD order, $j = 1,\dots,k$. We call $\tilde S^{D_k}$ the partial schedule obtained by applying the SDD-algorithm on the jobs of $\mathcal{J}(S^{D_k}) \cup \{J_l\}$ between $0$ and $D_k$: $\tilde S^{D_k} = \mathrm{SDD}(\mathcal{J}(S^{D_k}) \cup \{J_l\}, 0, D_k)$. We show next that $\tilde S^{D_k} = S^{D_k}$.


Fig. 8. The position of $J_l$ and the other jobs in $S^A_{1:k}$, on an example where $k = 3$.

Since the jobs of $\mathcal{J}(S^{D_k}) \cup \{J_l\}$ form a subset of the total set of jobs $\mathcal{J}^{all}$, no more than $|\mathcal{J}(S^{D_k})|$ of them can be scheduled between $0$ and $D_k$ in $\tilde S^{D_k}$, from Property 1 on $S^{D_k}$. Moreover, at least $|\mathcal{J}(S^{D_k})|$ of them are scheduled in $\tilde S^{D_k}$ (since the jobs of $\mathcal{J}(S^{D_k})$ can be scheduled between $0$ and $D_k$). Hence, $|\mathcal{J}(\tilde S^{D_k})| = |\mathcal{J}(S^{D_k})|$.

Let us prove that $\mathcal{J}(\tilde S^{D_k}) = \mathcal{J}(S^{D_k})$. This will imply $\tilde S^{D_k} = S^{D_k}$, since both schedules complete at $D_k$ without idle times, and they both schedule the same set of jobs in ERD order (i.e. by increasing indices). Suppose by contradiction that $\mathcal{J}(\tilde S^{D_k}) \ne \mathcal{J}(S^{D_k})$. Since $J_l$ is the only job among those of $\mathcal{J}(S^{D_k}) \cup \{J_l\}$ that is not in $\mathcal{J}(S^{D_k})$, and since $|\mathcal{J}(\tilde S^{D_k})| = |\mathcal{J}(S^{D_k})|$, we deduce that $J_l \in \mathcal{J}(\tilde S^{D_k})$ and that there exists one job in $\mathcal{J}(S^{D_k}) \setminus \mathcal{J}(\tilde S^{D_k})$. Moreover, from Properties 2 and 4 on $\tilde S^{D_k}$, one of the following assertions holds:

• $p(\tilde S^{D_k}) < p(S^{D_k})$ (which is in contradiction with Property 2 on $S^{D_k}$), or
• $p(\tilde S^{D_k}) = p(S^{D_k})$ and $l$ is smaller than the index of the job of $\mathcal{J}(S^{D_k}) \setminus \mathcal{J}(\tilde S^{D_k})$ (which is in contradiction with Property 4 on $S^{D_k}$).

Therefore, $\tilde S^{D_k} = S^{D_k}$. $\Box$

Claim 3. $J_l$ belongs to $\tilde S^{D_k} = \mathrm{SDD}(\mathcal{J}(S^{D_k}) \cup \{J_l\}, 0, D_k)$.

Proof. By examining the construction of $\tilde S^{D_k}$, we show that $J_l \in \mathcal{J}(\tilde S^{D_k})$. We first show that $J_l$ is added to the current sequence in SDD-algorithm when constructing $\tilde S^{D_k}$. Then, we show that it is never removed from the current sequence.

Notation. Let $J_{l^+}$ be the job of $\mathcal{J}(S^{D_k}) \cup \{J_l\}$ that immediately follows $J_l$ in ERD order.

When constructing $\tilde S^{D_k}$ with the SDD-algorithm on the set of jobs $\mathcal{J}(S^{D_k}) \cup \{J_l\}$ on the interval $[0, D_k]$, the jobs are considered in reverse ERD order (i.e. by decreasing indices), and are added (or not) to the current sequence, possibly by removing another job. Any job $J_d \in \mathcal{J}(S^{D_k}) \setminus \mathcal{J}(S^A_{1:k})$ is such that $d < l$ from inequality ($\alpha$) of Claim 1, and is thus considered after $J_l$ when constructing $\tilde S^{D_k}$. Therefore, $\mathcal{J}_l^+ \cap \mathcal{J}(S^{D_k}) \subseteq \mathcal{J}_l^+ \cap \mathcal{J}(S^A_{1:k})$. Thus, when $J_l$ is considered, the jobs of the current sequence $\Gamma_{l^+}$ all belong to $\mathcal{J}_l^+ \cap \mathcal{J}(S^A_{1:k})$. We now show that $J_l$ is added to $\Gamma_{l^+}$ without removing any other job.

Notations. Let $B_{l^+}$ be the starting time of $S(\Gamma_{l^+})$, and let $b_1^+$ denote the starting time of the first scheduled job in $S^A_{1:k}$ among the jobs of $\mathcal{J}_l^+$. The starting time of $J_l$ in $S^A_k$ is denoted by $b_l$ (cf. Fig. 8).

We have $B_{l^+} = D_k - p(\Gamma_{l^+}) \ge D_k - p(\mathcal{J}_l^+ \cap \mathcal{J}(S^A_{1:k}))$, since $\mathcal{J}(\Gamma_{l^+}) \subseteq \mathcal{J}_l^+ \cap \mathcal{J}(S^A_{1:k})$. Let us show that $J_l$ can be scheduled at the beginning of $S(\Gamma_{l^+})$, i.e. $r_l \le B_{l^+} - p_l$. Two cases have to be considered.

If $b_1^+ > b_l$, i.e. no jobs of $\mathcal{J}_l^+$ are scheduled before $J_l$ in $S^A_{1:k}$, then we have $B_{l^+} - p_l \ge D_k - p(\mathcal{J}_l^+ \cap \mathcal{J}(S^A_k)) - p_l = b_l \ge r_l$, otherwise $S^A_{1:k}$ would not be feasible.

Otherwise, if $b_1^+ < b_l$ (as in the example of Fig. 8), we have $B_{l^+} - p_l \ge D_k - p_l - \sum_{j=1}^{k} p(\mathcal{J}_l^+ \cap \mathcal{J}(S^A_j)) \ge b_1^+$, since $J_l$ and all the jobs of $\mathcal{J}_l^+ \cap \mathcal{J}(S^A_{1:k})$ are scheduled between $b_1^+$ and $D_k$ in $S^A_{1:k}$, by definition of $b_1^+$. And $b_1^+ \ge r_l$, since a job of $\mathcal{J}_l^+$ starts at $b_1^+$ in the feasible solution $S^A_{1:k}$ and, by ERD order, the jobs of $\mathcal{J}_l^+$ have release dates greater than or equal to $r_l$. Hence, $r_l \le B_{l^+} - p_l$. Therefore, $J_l$ is added to $\Gamma_{l^+}$ without removing any job.

Let us show that $J_l$ is never removed from $\Gamma_l = J_l \cdot \Gamma_{l^+}$. By contradiction, suppose that $J_l$ is removed from $\Gamma_l$. Then, by the SDD-algorithm, there exists $J_x \in \mathcal{J}(S^{D_k})$, $x < l$, such that $J_l$ belongs to $\Gamma_{x^+}$ but not to $\Gamma_x$ (i.e. $J_l$ is removed from the current partial schedule in order to add $J_x$).

Suppose that $J_x$ does not belong to $S^A_{1:k}$: $J_x \in \mathcal{J}(S^{D_k}) \setminus \mathcal{J}(S^A_{1:k})$. Then, by inequality ($\beta$) of Claim 1, $p_x > p_l$, which implies that $J_l$ is not removed in order to insert $J_x$ in $S^{D_k}$. Thus, $J_x \in \mathcal{J}(S^A_{1:k})$.

Let us show that there exists in $\Gamma_{x^+}$ at least one job that does not belong to $\mathcal{J}(S^A_{1:k})$. By contradiction, suppose that $\mathcal{J}(\Gamma_{x^+}) \subseteq \mathcal{J}(S^A_{1:k})$. Moreover, $J_x \in \mathcal{J}(S^A_{1:k})$. Let us consider $S^A_{1:k}$, and remove from it all the jobs except those of $\mathcal{J}(\Gamma_{x^+}) \cup \{J_x\}$: we obtain a feasible schedule (see Fig. 9). If we reorder the jobs by increasing indices (i.e. in ERD order), and right-shift them in order to obtain a unique block that completes at $D_k$, we still have a feasible schedule (see Fig. 9). Notice that this schedule is exactly the one obtained by adding $J_x$ at the beginning of $S(\Gamma_{x^+})$. Hence, no job of $\Gamma_{x^+}$ is removed in order


Fig. 9. Illustrating the proof of Claim 3 of Lemma 2, on an example where $k = 3$: the striped jobs are the jobs of $\mathcal{J}(\Gamma_{x^+})$.

to add $J_x$, and in particular $J_l$ is not removed from $\Gamma_{x^+}$, which is in contradiction with the assumed hypothesis. Therefore, there exists at least one job $J_d \in \mathcal{J}(\Gamma_{x^+}) \setminus \mathcal{J}(S^A_{1:k})$. From inequality ($\beta$) of Claim 1, we have $p_d > p_l$. Hence, there exists at least one job in $\Gamma_{x^+}$ with a processing time longer than $p_l$. Therefore, $J_l$ is not removed in order to add $J_x$. $\Box$

Claims 2 and 3 imply that $J_l \in \mathcal{J}(\tilde S^{D_k}) = \mathcal{J}(S^{D_k})$, which is in contradiction with the hypothesis $J_l \notin \mathcal{J}(S^{D_k})$. Hence, the result of Lemma 2 holds. $\Box$

Proof of Lemma 3. By contradiction, suppose that there exists a job $J_i \in \mathcal{J}(S^{D_k}) \setminus \mathcal{J}(S^A_{1:k})$ such that $p_i \le y_k$. We assume without loss of generality that $S^A_k$ is a block completing at $D_k$. Hence, $y_k = B(S^A_k) - C_{\max}(S^A_{k-1})$.

Let us consider $S^{D_k}$, and suppose that we remove from it all the jobs, except $J_i$ and the jobs of $S^A_k$ ($\mathcal{J}(S^A_k) \subseteq \mathcal{J}(S^{D_k})$ by Lemma 2). We obtain a feasible (partial) schedule. If the remaining jobs are then right-shifted so that the last one completes at $D_k$, the schedule still remains feasible. Moreover, the obtained schedule $S$ includes $J_i$ and the jobs of $S^A_k$, and starts at $B(S^A_k) - p_i \ge C_{\max}(S^A_{k-1})$, because $p_i \le y_k$. Therefore, $S^A_k$ can be replaced by $S$ in $S^A_{1:k}$, and $|\mathcal{J}(S)| > |\mathcal{J}(S^A_k)|$. This is in contradiction with Property 1 on $S^A_k$. $\Box$

References

[1] R.L. Graham, E.L. Lawler, J.K. Lenstra, A.H.G. Rinnooy Kan, Optimization and approximation in deterministic sequencing and scheduling: a survey, Ann. Discrete Math. 5 (2) (1979) 287–326.
[2] Y. Seddik, C. Gonzales, S. Kedad-Sidhoum, Single machine scheduling with delivery dates and cumulative payoffs, J. Sched. 16 (3) (2013) 313–329.
[3] B. Detienne, S. Dauzère-Pérès, C. Yugma, An exact approach for scheduling jobs with regular step cost functions on a single machine, Comput. Oper. Res. 39 (5) (2012) 1033–1043.
[4] J. Curry, B. Peters, Rescheduling parallel machines with stepwise increasing tardiness and machine assignment stability objectives, Int. J. Prod. Res. 43 (15) (2005) 3231–3246.
[5] A. Janiak, T. Krysiak, Single processor scheduling with job values depending on their completion times, J. Sched. 10 (2) (2007) 129–138.
[6] J.M. Moore, An n job, one machine sequencing algorithm for minimizing the number of late jobs, Manag. Sci. 15 (1) (1968) 102–109.
[7] C.-T. Tseng, Y.-C. Chou, W.-Y. Chen, A variable neighborhood search for the single machine total stepwise tardiness problem, in: Proceedings of the 2010 International Conference on Engineering, Project and Production Management, Pingtung, Taiwan, 2010, pp. 101–108.
[8] W.-H.
Yang, Sequencing jobs subject to multiple common due dates, in: Proceedings of the 10th Asia Pacific Industrial Engineering & Management Systems Conference, 2009, pp. 1041–1050.
[9] A. Janiak, T. Krysiak, Scheduling jobs with values dependent on their completion times, Int. J. Prod. Econ. 135 (1) (2012) 231–241.
[10] B. Detienne, S. Dauzère-Pérès, C. Yugma, Scheduling jobs on parallel machines to minimize a regular step total cost function, J. Sched. 14 (6) (2011) 523–538.
[11] G. Şahin, R.K. Ahuja, Single-machine scheduling with stepwise tardiness costs and release times, J. Ind. Manag. Optim. 7 (2011) 825–848.
[12] N.G. Hall, M. Lesaoana, C.N. Potts, Scheduling with fixed delivery dates, Oper. Res. 49 (1) (2001) 134–144.
[13] N.G. Hall, S.P. Sethi, C. Sriskandarajah, On the complexity of generalized due date scheduling problems, European J. Oper. Res. 51 (1) (1991) 100–109.
[14] E.L. Lawler, Optimal sequencing of a single machine subject to precedence constraints, Manag. Sci. 19 (1973) 544–546.