Expert Systems with Applications 38 (2011) 8990–8998
A self-adaptive local search algorithm for the classical vehicle routing problem

Cigdem Alabas-Uslu (corresponding author), TC Okan University, Department of Industrial Engineering, 34959 Istanbul, Turkey
Berna Dengiz, Baskent University, Department of Industrial Engineering, 06530 Ankara, Turkey
Keywords: Vehicle routing; Metaheuristics; Parameter tuning; Self-adaptation
Abstract

The purpose of this study is to introduce a local search heuristic, free from parameter tuning, for solving the classical vehicle routing problem (VRP). The VRP can be described as the problem of designing optimal delivery routes from one depot to a number of customers, under side constraints, so as to minimize the total traveling cost. The problem is important from a practical as well as a theoretical point of view. The proposed heuristic, self-adaptive local search (SALS), has one generic parameter which is learnt throughout the search process. Computational experiments confirm that SALS produces high-quality solutions to the VRP and achieves at least average performance, in terms of efficiency and effectiveness, when compared with recent and sophisticated approaches from the literature. The most important advantage of the proposed heuristic is its convenience for end-users. SALS is also flexible and can easily be applied to variations of the VRP.
1. Introduction

The classical VRP can be described as the problem of designing optimal delivery routes from one depot to a number of customers, under side constraints, so as to minimize the total traveling cost. A graph-theoretic definition of the problem is as follows: Let G = (V, A) be a complete graph, where V = {1, . . . , n + 1} is the vertex set and A is the arc set. Vertices i = 2, . . . , n + 1 correspond to the customers, whereas vertex 1 corresponds to the depot. A nonnegative cost cij, associated with each arc (i, j) ∈ A, represents the travel cost between vertex i and vertex j. Each customer i has a known nonnegative demand, di, to be delivered. The total demand assigned to any route may not exceed the vehicle capacity, Q. A fleet of m identical vehicles is located at the depot. Another constraint that is sometimes included in the VRP is that the total duration of each route may not exceed a distance limit, L. In the capacity- and/or distance-constrained VRP, each of the m routes starts and terminates at the depot, and each customer is served exactly once by exactly one vehicle. Since Dantzig and Ramser (1959) introduced it, the VRP has been one of the most important, and most studied, NP-hard problems. The VRP also plays an important role in distribution management and logistics because the delivery (collection) cost accounts for a significant portion of the total logistics cost. Many complications of the VRP, such as a heterogeneous fleet, multiple depots, different time windows for
each customer, and multiple objectives, make the problem more difficult to solve. The literature on the VRP is vast, owing to its practical as well as theoretical importance. Proposed solution methods for the VRP can be classified into exact and approximate algorithms. A detailed survey of exact methods, such as branch-and-bound, branch-and-cut, and dynamic programming, can be found in Toth and Vigo (2002). As for approximate methods, there are classical heuristics, surveyed by Laporte and Semet (2002), and metaheuristics, recently surveyed by Cordeau, Gendreau, Hertz, Laporte, and Sormany (2005). In particular, effective and efficient metaheuristic algorithms have been proposed in recent years for the VRP, such as adaptive memory programming (Tarantilis, 2005; Tarantilis & Kiranoudis, 2002), threshold accepting (Tarantilis, Kiranoudis, & Vassiliadis, 2002a, 2002b), granular tabu search (Toth & Vigo, 2003), evolutionary algorithms (Prins, 2004), ant colony optimization (Reimann, Doerner, & Hartl, 2004), the record-to-record travel algorithm (Li, Golden, & Wasil, 2005), adaptive large neighborhood search (Pisinger & Ropke, 2007), variable neighborhood search (Kytöjoki, Nuortio, Bräysy, & Gendreau, 2007), and active-guided evolution strategies (Mester & Bräysy, 2007). Although metaheuristics are effective and efficient for the problem, they still suffer from parameter optimization: a deep knowledge of the problem structure, or a lengthy trial-and-error process, is needed to select the parameter set carefully. The best parameter set is usually re-determined for each problem instance, considering the application area, size, or input data of the problem. Adenso-Diaz and Laguna (2006) state that about 10% of the total time dedicated to designing and testing a new heuristic is spent on development, and the remaining 90% is consumed by fine-tuning of parameters. An alternative to tuning parameters is to control them throughout the run. Heuristics that work this way are generally called adaptive, reactive, or self-tuning heuristics. They utilize different forms of feedback information to learn the parameter combination during the search. Self-adaptive heuristics were achieved for evolutionary algorithms earlier than for local search based algorithms: Eiben, Hinterding, and Michalewicz (1999) present a comprehensive study classifying parameter control methods for evolutionary algorithms and survey various forms of control. The pioneering attempt to develop a self-adaptive mechanism for local search based metaheuristics is the reactive tabu search by Battiti and Tecchiolli (1994). Recently, Birattari (2009) adopted a machine learning perspective on the tuning problem of metaheuristics and developed a generic tuning algorithm. There are numerous proposed heuristics for the VRP (Li et al., 2005; Prins, 2004; Reimann et al., 2004; Russell & Chiang, 2006; Tarantilis et al., 2002a, 2002b) which use dynamic parameter structures. In this paper, a heuristic algorithm with a self-adaptive mechanism is developed for solving the classical VRP, escaping the difficulty of parameter optimization. The proposed heuristic, named self-adaptive local search (SALS), has only one generic parameter, the acceptance parameter H. Parameter H is calculated and updated self-adaptively throughout the search process, to improve the effectiveness of the algorithm, using response surface information that comes from the problem and from the performance measure of the algorithm. If the SALS algorithm is allowed to search only feasible solutions of the VRP, it is completely free from parameter tuning.
The main difference between the SALS algorithm and the above-mentioned algorithms with dynamic parameter structures is that the parameter H of SALS never requires tuning, as the others do. Allowing the search to visit infeasible solutions introduces a set of parameters for SALS, which is explained in Section 2.1. However, the main idea behind the SALS algorithm is its simplicity of application, relieving users from the burden of optimizing generic parameters. In this study, the performance of SALS is demonstrated by comparison with the most recent and sophisticated algorithms proposed in the classical VRP literature. The rest of this paper is organized as follows. The implementation of SALS on the VRP is explained in Section 2. Experimental results comparing the performance of SALS with recent metaheuristics for the VRP are provided in Section 3. Finally, the last section presents the conclusions.
2. Implementation of SALS on the VRP

In this study, a solution point X for the VRP is represented as a vector (x1, x2, . . . , xD) of dimension D, which corresponds to a sequence of nodes (D = n + m, where n is the number of customers and m is the number of vehicles). Node 1 corresponds to the depot; the other nodes are customers. For example, let a solution to
a problem with n = 9 customers and m = 2 vehicles be X = [1 2 4 6 1 3 7 9 5 8 10 1]. The first route (i.e., vehicle 1) starts from depot 1, visits the customers indexed 2, 4, and 6, in that order, and returns to depot 1. Similarly, the second vehicle visits customers 3, 7, 9, 5, 8, and 10, starting and terminating at depot 1. The neighborhood of a solution is obtained using five different move types: adjacent swap (MAS), general swap (MGS), single insertion (MSI), block insertion (MBI), and reverse location (MRL). These move types are among the most commonly used perturbation schemes for generating neighbor solutions; an analysis of them, for a simulated annealing algorithm, can be found in Tian, Ma, and Zhang (1999). The definitions of the moves and the neighborhood sizes they induce are given in Table 1. MAS exchanges the positions of two adjacent nodes, while MGS swaps the positions of any node pair. MSI selects a node and inserts it between two adjacent nodes, whereas MBI selects a subsequence of nodes to insert. MRL selects a subsequence of nodes and relocates these nodes in reverse order. All of the selections needed to perform these moves are random. Fig. 1 shows network-representation examples of the moves for a VRP with n = 15 and m = 4. As mentioned before, SALS is a modified local search algorithm. For a minimization problem, local search starts from an initial solution, Xz, as its current solution, X. It then explores the neighborhood of this solution, N(X), using a certain move mechanism to determine the best neighbor solution, X′. The neighbor X′ which improves the objective value of the current solution, satisfying f(X′) < f(X), is accepted to replace X. The search continues until no improvement can be made, and the finally obtained solution is a local optimum. SALS likewise initiates the current solution X with Xz and iteratively searches a subset of the whole neighborhood of X. At iteration i, a subset N′(X: X(s), s = 1, . . . , 5), which consists of five different neighbors from N(X), is randomly generated using each of the move types described above. The best neighbor X′ among the X(s) solutions (s = 1, . . . , 5), i.e., the one with the best objective value, is accepted as the new current solution if the following acceptance condition is satisfied:
If f(X′) ≤ H·f(X), then X ← X′.
For the VRP, the function f(X) corresponds to the total traveling cost of the solution X at iteration i, and H, which is explained below, is the self-adaptive parameter of SALS. When a trial solution is accepted as the current solution, the algorithm proceeds to the next iteration. Otherwise, a new subset N′(X) is generated randomly, in the expectation that the acceptance condition will be satisfied. If the total number of rejected neighbors reaches the neighborhood size of the current solution, |N(X)|, with respect to the move mechanism under consideration, the value of H is increased using the equation H = H + a1·a2. Thus, the search region around the current solution X is expanded in order to obtain an acceptable neighbor solution. The sampling of neighbors is implemented with replacement, to avoid excessive consumption of computer memory. The SALS algorithm modifies local search with two contributions. Firstly, SALS relaxes the hard acceptance condition of local search, f(X′) < f(X), by introducing the new condition f(X′) ≤ H·f(X). The
Table 1
Moving types and neighborhood sizes.

MAS: Nodes xi and xj are interchanged, for i, j = 1, . . . , n and |i − j| = 1.
  Neighborhood size: |NAS(X)| = n − 1.
MGS: Nodes xi and xj are interchanged, for i, j = 1, . . . , n and |i − j| > 1.
  Neighborhood size: |NGS(X)| = (n − 1)(n − 2)/2.
MSI: Node xi is inserted between nodes xj and xj+1, for i = 1, . . . , n, j = 1, . . . , n − 1 and |i − j| > 1.
  Neighborhood size: |NSI(X)| = (n − 1)(n − 2).
MBI: A subsequence of nodes from xi to xi+b is inserted between nodes xj and xj+1, for i = 1, . . . , n − 1 − b, j = i + b + 1, . . . , n − 1 and b = 1, . . . , n − 2.
  Neighborhood size: |NBI(X)| = sum over i = 1, . . . , (n − 2)/2 of (n − 2i)^2 if n is even; sum over i = 1, . . . , (n − 3)/2 of (n − 2i)^2, plus 1, if n is odd.
MRL: A subsequence of nodes from xi to xj is re-sequenced in reverse order, for i, j = 1, . . . , n and |i − j| > 1.
  Neighborhood size: |NRL(X)| = (n − 1)(n − 2)/2.
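The five move types of Table 1 can be sketched on the vector representation as follows. This is an illustrative sketch, not the authors' Pascal code; for brevity it ignores the index restrictions of Table 1 (e.g. |i − j| > 1) and the fixed depot positions:

```python
import random

def adjacent_swap(x):
    """MAS: exchange two adjacent nodes."""
    y = x[:]
    i = random.randrange(len(y) - 1)
    y[i], y[i + 1] = y[i + 1], y[i]
    return y

def general_swap(x):
    """MGS: exchange any two nodes."""
    y = x[:]
    i, j = random.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def single_insertion(x):
    """MSI: remove one node and reinsert it at another position."""
    y = x[:]
    node = y.pop(random.randrange(len(y)))
    y.insert(random.randrange(len(y) + 1), node)
    return y

def block_insertion(x, b=2):
    """MBI: move a block of b + 1 consecutive nodes to another position."""
    i = random.randrange(len(x) - b)
    block, rest = x[i:i + b + 1], x[:i] + x[i + b + 1:]
    j = random.randrange(len(rest) + 1)
    return rest[:j] + block + rest[j:]

def reverse_location(x):
    """MRL: reverse the order of a subsequence of nodes."""
    i, j = sorted(random.sample(range(len(x)), 2))
    return x[:i] + x[i:j + 1][::-1] + x[j + 1:]
```

Each move returns a permutation of the input vector; in SALS, one neighbor is drawn with each of the five moves to form the subset N′(X).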
Fig. 1. Network representations of the moves for the example with n = 15 and m = 4 (figure omitted; the corresponding solution vectors are):
(a) Current solution X = [1 16 15 14 13 1 12 11 10 9 1 8 7 6 1 5 4 3 2 1]
(b) MAS(X: x16/x17): X(1) = [1 16 15 14 13 1 12 11 10 9 1 8 7 6 1 4 5 3 2 1]
(c) MGS(X: x5/x7): X(2) = [1 16 15 14 12 1 13 11 10 9 1 8 7 6 1 5 4 3 2 1]
(d) MSI(X: x5/x7–x8): X(3) = [1 16 15 14 1 12 13 11 10 9 1 8 7 6 1 5 4 3 2 1]
(e) MBI(X: x2–x3/x19–x20): X(4) = [1 14 13 1 12 11 10 9 1 8 7 6 1 5 4 3 2 16 15 1]
(f) MRL(X: x7–x9): X(5) = [1 16 15 14 13 1 10 11 12 9 1 8 7 6 1 5 4 3 2 1]
second contribution is that the algorithm considers a small subset of the neighborhood instead of searching the whole neighborhood. The first contribution of SALS is reminiscent of the record-to-record travel (RRT) deterministic annealing algorithm (Dueck, 1993). However, the acceptance parameter of RRT, say b, is fixed, and the algorithm accepts a randomly selected trial solution X″ if it satisfies f(X″) ≤ b·f(Xb^(i)), where Xb^(i) is the best solution observed up to iteration i. Parameter b must be tuned before running the algorithm. In contrast, the only parameter of SALS, H, is learnt throughout the search process, taking into account two criteria: the number of improved solutions obtained during the search and the quality of these solutions. Hence, the measures given by Eqs. (1) and (2) are introduced, where C(L^(i)) is the number of improved solutions obtained up to iteration i; C(L^(i)) is incremented, C(L^(i)) = C(L^(i)) + 1, whenever f(X′) < f(Xb^(i)) during the search. The calculations of a1 and a2 rely on observations obtained from pre-examinations. SALS assumes that f(X) > 0 over the whole solution space. Naturally, a1 and a2 take values between 0 and 1. When a new improved solution is obtained, a1 decreases while a2 increases. High values of C(L^(i)) can be regarded as an indicator of a search space with many local optima; conversely, for a smooth search space, the counter C(L^(i)) is expected to take low values.
a1 = f(Xb^(i)) / f(Xz)    (1)

a2 = C(L^(i)) / i    (2)
Parameter H is calculated during the search by Eq. (3):

H = 1 + a1·a2    (3)

In this way, the parameter defines the border of the search region around the current solution X in terms of the number of improved solutions and the solution quality. The basic steps of SALS are given in Fig. 2. At iteration i = 1, f(Xb^(i)) = f(Xz) and C(L^(i)) = i, hence H equals 2. During the initial iterations of the algorithm, the search is almost random because of the relatively high values of H. In subsequent iterations, H decreases depending on the values of a1 and a2; however, the decrease is nonmonotonic. When the search process encounters a region with many local optima, H may increase depending on C(L^(i)), f(Xb^(i)) and i. By this means, H occasionally provides a diversification effect to climb out of these local optima. If, on the other hand, the region is smooth, C(L^(i)) takes low values and H tends to become smaller than its previous value; the smaller H value then has an intensification effect on the search. As an example, let f(Xz) = 1000, f(Xb^(50)) = 700, and C(L^(50)) = 10 at i = 50. According to these observations, H is calculated as 1.14. Suppose the search process then finds 30 improved solutions throughout the next 50 iterations, so that f(Xb^(100)) = 500 and C(L^(100)) = 40 at i = 100; consequently, the value of H increases from 1.14 to 1.20. Over the following 50 iterations, let f(Xb^(150)) = 400 and C(L^(150)) = 45 (i.e., only 5 improved solutions were obtained); the resulting value of H decreases from 1.20 to 1.12. It is expected that as H approaches 1, the search converges to a set of high-quality solutions. For instance, Fig. 3 demonstrates the convergence of SALS for a VRP with n = 200 by Christofides and Elion (1969), where DBS is the percentage deviation from the best-known solution. As parameter H approaches 1, the search process is forced to find high-quality solutions. Furthermore, the change of parameter H with the number of iterations is shown in Fig. 4 (the figures are plotted without the initial iterations of the search to provide clear visibility of the remaining iterations). As seen from the figure, H decreases exponentially as the number of iterations increases.

Fig. 2. Steps of SALS algorithm.
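Under the notation above, the core of SALS can be sketched as follows. This is a simplified skeleton under our reading of the text, not the authors' Pascal implementation: the rejection-driven expansion H = H + a1·a2 and the five-move neighbor sampling are omitted, and `neighbor_of` stands for any neighbor generator:

```python
def sals(f, x0, neighbor_of, max_iter=10000, h_stop=1.001):
    """Skeleton of SALS: accept X' when f(X') <= H * f(X), with
    H = 1 + a1 * a2 learnt from the search history (Eqs. (1)-(3))."""
    x, best = x0, x0
    f_init = f(x0)            # f(X_z), assumed > 0
    f_best = f_init           # f(X_b^(i))
    improved = 0              # counter C(L^(i))
    H = 2.0                   # value implied by Eqs. (1)-(3) at i = 1
    for i in range(1, max_iter + 1):
        # sample a small subset of neighbors and take the best of them
        candidates = [neighbor_of(x) for _ in range(5)]
        x_new = min(candidates, key=f)
        if f(x_new) <= H * f(x):          # relaxed acceptance condition
            x = x_new
            if f(x_new) < f_best:         # new improved solution found
                f_best, best = f(x_new), x_new
                improved += 1
        a1 = f_best / f_init              # Eq. (1)
        a2 = improved / i                 # Eq. (2)
        H = 1.0 + a1 * a2                 # Eq. (3)
        if H <= h_stop:                   # termination used in Section 3
            break
    return best
```

With the worked numbers above, H = 1 + (700/1000)(10/50) = 1.14, matching the example in the text.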
An experimental study was undertaken to compare the effectiveness of self-adaptive H with fixed values of H. Three fixed levels, 1.0015, 1.0025, and 1.0035, which generated reasonably good results in preliminary experiments, are selected
Fig. 3. Changing of H according to DBS for a 200-size problem (figure omitted; axes: H, from 1 to 1.2, vs. DBS).
Fig. 4. Changing of H according to the number of iterations for a 51-size problem (figure omitted; axes: number of iterations, up to 10000, vs. H, from 1 to 1.5).
Table 2
Comparison of self-adaptive H with the fixed levels (average DBS).

Problem size   H = 1.0015   H = 1.0025   H = 1.0035   Self-adaptive
Small          1.15         0.63         0.12         0.0
Moderate       1.02         0.47         2.53         0.48
Large          1.62         4.34         13.47        1.58
Average        1.26         1.81         5.37         0.69
for H. These fixed levels and the dynamic calculation of H are tested on the problems of Christofides and Elion (1969). The test problems are classified as small, moderate, and large. Table 2 shows the average percentage deviations from the best-known solutions (DBS) over 10 runs for the classified problem sets. As seen from the table, self-adaptive H generates higher-quality solutions than the fixed H values for each of the problem sizes, except for the fixed value 1.0025 on the moderate-sized problems. Moreover, the self-adaptive H outperforms the deterministic control of H according to
Fig. 5. Pseudo-code fragment for the systematic control of the penalty coefficient d.
Table 3
The recent metaheuristics for the VRP and their parameters.

Study                        Abbreviation   Number of parameters
Tarantilis (2005)            T-AMP          8
Tarantilis et al. (2002a)    TKV-TA1        7
Tarantilis et al. (2002b)    TKV-TA2        1
Toth and Vigo (2003)         TV-TS          7
Prins (2004)                 P-EA           7
Reimann et al. (2004)        RDH-AC         9
Li et al. (2005)             LGW-RRT        5
Pisinger and Ropke (2007)    PR-ALNS        14
Kytöjoki et al. (2007)       KNBG-VNS       3
Mester and Bräysy (2007)     MB-AGES        9
the overall average results over all problem sizes. Also, if the fixed H levels are compared with each other, it is seen that each level generates good solutions for a different size of problem.

2.1. Solution feasibility

SALS is run under two different settings: allowing only feasible solutions (SALS-F) and also allowing infeasible solutions (SALS-INF). Searching only feasible solutions is straightforward: the algorithm uses any feasible solution as an initial solution, and moves are implemented under the capacity and/or distance constraints. In this study, a feasible initial solution, generated by assigning one vehicle to each customer location, is used. SALS-INF, on the other hand, modifies the costs of infeasible solutions by the equation f′(X) = f(X) + d(OC(X) + OD(X)) for a solution point X, where OC(X) is the sum of the capacity violations and, similarly, OD(X) is the sum of the distance violations. The value of the penalty coefficient, d, is systematically increased and decreased during the search within the range [dmin, dmax] using a step size, e, as explained in the pseudo-code given in Fig. 5. SALS needs no parameter setting when it searches only feasible solution regions. Allowing infeasible solutions necessarily introduces several parameters to direct the search between
feasible and infeasible solution regions. In this application, the parameters dmin, dmax, and e are determined by a short trial-and-error process, since a detailed analysis of the tuning of these parameters would be incongruous with the core of this paper. The results of the trial-and-error process show that dmax = 3.0 and e = 5 × 10^−4 generate good solutions over the benchmark problem sets of Christofides and Elion (1969) (set-1) and Golden, Wasil, Kelly, and Chao (1998) (set-2). While dmin = 0.3 yields high-quality solutions for all of the problems in set-1, dmin = 0.1 is used for 6 problems from set-2, and the remaining 14 problems of set-2 are solved effectively using dmin = 0.03. It should be noted that the SALS algorithm, when it also searches infeasible solutions with a set of carefully tuned penalty parameters, may generate still better solutions for the test problems.
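The penalized objective f′(X) = f(X) + d(OC(X) + OD(X)) used by SALS-INF can be sketched as follows, here with the capacity term only (the route-length term OD(X) is analogous). The helper names, cost matrix, and demand map are illustrative, not the authors' code:

```python
def split_routes(x, depot=1):
    """Split the solution vector into routes at each depot visit."""
    routes, current = [], []
    for node in x[1:]:                    # skip the leading depot
        if node == depot:
            if current:
                routes.append(current)
            current = []
        else:
            current.append(node)
    if current:
        routes.append(current)
    return routes

def penalized_cost(x, cost, demand, Q, d, depot=1):
    """f'(X) = f(X) + d * OC(X): total travel cost plus the penalized
    sum of capacity violations over the routes."""
    total, over_capacity = 0.0, 0.0
    for route in split_routes(x, depot):
        load = sum(demand[c] for c in route)
        over_capacity += max(0.0, load - Q)   # capacity violation of this route
        stops = [depot] + route + [depot]
        total += sum(cost[a][b] for a, b in zip(stops, stops[1:]))
    return total + d * over_capacity
```

During the search, d oscillates within [dmin, dmax] in steps of e (Fig. 5), so infeasibility is penalized more or less strongly as the search proceeds.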
3. Experimental results

Table 3 lists the recent metaheuristics proposed for the VRP, with their abbreviations and numbers of parameters. As seen from the table, all of the algorithms except TKV-TA2 have multiple parameters to be optimized. To the best of our knowledge, there is no metaheuristic algorithm for the VRP that does not need parameter tuning. SALS is coded in Pascal and executed on a Pentium IV 1000 MHz computer with 512 MB RAM. The performance of the algorithm is first tested on set-1 and then compared with the most recent algorithms on set-2. Set-1 contains 7 problems selected from the benchmark problems of Christofides and Elion (1969). These problems can be divided into random (problems 1, 2, 3, 6, 7) and clustered (problems 4, 5) instances. Set-2 includes problems larger than those of set-1, containing 200–483 customers. The first eight instances of set-2 have route-length restrictions, and each problem of set-2 has a geometric structure (see Golden et al., 1998 for details). Table 4 compares the best and average results of SALS-F and SALS-INF over 10 runs on set-1. Each run of the algorithm is terminated when the parameter H approaches 1 (1.001 is found to be close enough to reach good solutions). As seen from the B-DBS columns of the table, SALS-INF and SALS-F generate the best-known solutions to problems 1, 3, 4, and 5, and solutions to
Table 4
Performance of SALS on set-1.

SALS-INF
Pr.      n    B-COST(a)  B-DBS(b)  B-CPU(c)  Cost(d)   DBS(e)   CPU(f)  Best known
1        51   524.61     0         0.2       526.41    0.3      1.02    524.61
2        76   840.47     0.6       0.22      845.95    1.3      8.68    835.26
3        101  826.14     0         1.27      830.83    0.6      12.21   826.14
4        101  819.56     0         0.45      819.56    0        0.76    819.56
5        121  1042.11    0         4.67      1043.56   0.1      18.77   1042.11
6        151  1032.19    0.4       20.3      1043.14   1.4      29.12   1028.42
7        200  1309.72    1.4       16.62     1323.63   2.5      49.69   1291.29
Average                  0.34      6.25                0.89     17.18

SALS-F
Pr.      n    B-COST(a)  B-DBS(b)  B-CPU(c)  Cost(d)   DBS(e)   CPU(f)  Best known
1        51   524.61     0         0.17      524.65    0.0008   1.58    524.61
2        76   835.45     0.02      3.5       845.34    1.2      9.32    835.26
3        101  830.15     0.5       7.1       832.47    0.8      13.72   826.14
4        101  819.56     0         0.54      819.56    0        2.06    819.56
5        121  1042.11    0         5.37      1046.17   0.4      18.74   1042.11
6        151  1039.96    1.1       15.67     1045.29   1.6      28.5    1028.42
7        200  1328.8     2.9       42.7      1342.71   3.98     50.19   1291.29
Average                  0.65      10.72               1.14     17.73

(a) Best solution observed over 10 runs. (b) Best deviation from the best-known solution. (c) CPU time until the best solution is obtained, in minutes. (d) Average solution cost. (e) Average deviation from the best-known solution. (f) CPU time until the termination condition, in minutes.
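The deviation measure used throughout the tables, DBS (percentage deviation from the best-known solution), can be computed as follows; the function name is ours:

```python
def dbs(cost, best_known):
    """Percentage deviation of a solution cost from the best-known cost."""
    return 100.0 * (cost - best_known) / best_known
```

For example, for problem 2 of set-1 under SALS-INF, dbs(840.47, 835.26) rounds to 0.6, matching the B-DBS column of Table 4.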
Table 5
Best solution results of the metaheuristics on set-2 (bold values in the original indicate best-known solutions; "—" denotes a result not reported).

Pr.  n    TKV-TA1    TKV-TA2    TV-TS      P-EA       RDH-AC     T-AMP
1    241  5683.63    5680.16    5736.15    5646.63    5644.02    5676.97
2    321  8528.80    8512.64    8553.03    8447.92    8449.12    8459.91
3    401  11199.72   11190.38   11402.75   11036.22   11036.22   11036.22
4    481  13661.16   13706.78   14910.62   13624.52   13699.11   13637.53
5    201  6466.68    6460.98    6697.53    6460.98    6460.98    6460.98
6    281  8429.28    8427.72    8963.32    8412.80    8412.90    8414.28
7    361  10297.27   10274.19   10547.44   10195.59   10195.59   10216.50
8    441  11953.93   11968.93   12036.24   11828.78   11828.78   11936.16
9    256  596.92     595.35     593.35     591.54     586.87     585.43
10   324  765.03     764.88     751.66     751.41     750.77     746.56
11   400  945.20     945.09     936.04     933.04     927.27     923.17
12   484  1143.39    1143.74    1147.14    1133.79    1140.87    1130.40
13   253  872.66     871.97     868.80     875.16     865.07     865.01
14   321  1102.40    1102.66    1096.18    1086.24    1093.77    1086.07
15   397  1384.04    1385.59    1369.44    1367.37    1358.21    1353.91
16   481  1679.50    1677.25    1652.32    1650.94    1635.16    1634.74
17   241  718.16     717.40     711.07     710.42     708.76     708.74
18   301  1030.54    1032.07    1016.83    1014.80    998.83     1006.90
19   361  1408.62    1406.47    1400.96    1376.49    1367.20    1371.01
20   421  1872.23    1872.87    1915.83    1846.55    1822.94    1837.67
DBS       19.88      19.56      28.84      9.23       6.08       6.09

Pr.  n    LGW-RRT    KNBG-VNS   PR-ALNS    MB-AGES    SALS-INF   SALS-F
1    241  5639.36    5867.84    5650.91    5627.54    5638.42    5645.90
2    321  —          8476.26    8469.32    8447.92    8457.04    8468.85
3    401  —          11043.41   11047.01   11036.22   11098.93   11113.53
4    481  —          13631.72   13635.31   13624.52   13816.35   13753.12
5    201  —          6460.98    6466.68    6460.98    6460.98    6460.98
6    281  —          8415.67    8416.13    8412.88    8430.66    8427.72
7    361  —          10297.66   10181.75   10195.56   10209.64   10299.08
8    441  11696.55   11872.64   11713.62   11663.55   11785.11   11865.69
9    256  585.54     620.67     585.14     583.39     585.29     585.91
10   324  743.17     784.77     748.89     741.56     745.25     746.37
11   400  923.11     986.80     922.70     918.45     924.74     925.26
12   484  1116.22    1209.02    1119.06    1107.19    1123.29    1116.94
13   253  862.38     925.81     864.68     859.11     861.94     877.54
14   321  1083.65    1155.19    1095.40    1081.31    1097.49    1102.93
15   397  1351.35    1461.49    1359.94    1345.23    1356.34    1362.32
16   481  1632.47    1742.86    1639.11    1622.69    1643.74    1650.34
17   241  708.63     726.01     708.90     707.79     709.84     713.33
18   301  1007.86    1077.53    1002.42    998.73     1005.97    1015.11
19   361  1378.20    1444.51    1374.24    1366.86    1387.93    1377.62
20   421  1840.66    1938.12    1830.80    1820.09    1872.45    1849.40
DBS       5.03       44.91      4.97       0.07       8.07       9.96
the remaining problems are very close to the best-known solutions in the best case. SALS-INF also generates higher-quality solutions than SALS-F, on average, in both the best and average cases. SALS-INF obtains its best solutions more quickly than SALS-F, while the total execution times of the INF and F cases are not very different from each other. A one-way ANOVA test is performed to examine the statistical difference between SALS-INF and SALS-F, and the results show that the solution quality of the two algorithms is similar at the 95% confidence level. To show the effectiveness of SALS on real-life applications of the VRP, the algorithm is tested on the large-scale routing problems of set-2. Ten different randomly generated initial solutions are used for each problem of set-2 in the experiments with SALS. In Table 5, the best-known metaheuristics listed in Table 3 and SALS (both the INF and F cases) are compared on set-2, taking into account the best solution quality of each algorithm. As seen in the table, allowing infeasible solutions in the SALS algorithm generates higher-quality solutions than considering only feasible solutions. While five of the algorithms have better solution quality than SALS-INF, the five remaining algorithms are outperformed by SALS-INF. In addition, SALS-F has higher solution quality than four of the metaheuristics on average. The MB-AGES algorithm is the winner in terms of solution quality. Fig. 6 displays the percentage solution quality performances of the metaheuristics, calculated as (100 − DBS), to summarize the information given in Table 5. The figure shows visually that the SALS algorithm (in both the INF and F cases) exhibits a moderate overall solution quality relative to the other heuristics. We cannot compare the performances of the heuristics statistically, since the values reported in Table 5 are the best experimental results of the corresponding algorithms and we do not have access to their entire sets of experiments. Computational time results of the algorithms are also reported in Table 6. The CPU results in the table are taken from the corresponding studies. Since the algorithms were run on different computer systems, a fair comparison of CPU results is difficult. To overcome this difficulty, the CPU times of the algorithms are transformed using appropriate scaling factors, calculated from the CPU speeds of the computers in Mflop/s as reported by Dongarra (2010). The scaled runtime results roughly show that the efficiency of the SALS algorithm (in both the INF and F cases) is also moderate relative to the other heuristics under consideration. KNBG-VNS is the fastest algorithm on set-2. On the other hand, it should be noted that all of the algorithms except SALS-F require time to tune their generic parameters. In our opinion, the most important advantage of the SALS-F algorithm is its conceptual simplicity, which makes it flexible enough to be applied to different problem instances without parameter tuning.
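The runtime normalization used for Table 6 can be sketched as follows. This reflects our reading of the procedure (the scaling factor as a ratio of machine speeds in Mflop/s, multiplied into the reported CPU minutes); the function names are illustrative:

```python
def scaling_factor(machine_mflops, reference_mflops):
    """Speed of the algorithm's computer relative to the reference machine,
    from LINPACK-style Mflop/s ratings (cf. Dongarra, 2010)."""
    return machine_mflops / reference_mflops

def scaled_cpu(cpu_minutes, factor):
    """CPU time normalized to the reference machine."""
    return cpu_minutes * factor
```

For instance, the MB-AGES row of Table 6 scales 24.4 min by a factor of 2.8 to 68.32 min, and the PR-ALNS row scales 10.8 min by 3.0 to 32.4 min.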
Fig. 6. Solution quality performances of the recent metaheuristics on the VRP (figure omitted; bar chart of overall solution quality performance, in percent, per algorithm).
Table 6
Comparison of the best-known heuristics on set-2.

Study                        Abbreviation   Computer used (MHz)   CPU (min)   Scaling factor   Scaled CPU (min)   DBS
Tarantilis (2005)            T-AMP          Pentium 400           45.48       0.4              18.19              6.09
Tarantilis et al. (2002a)    TKV-TA1        Pentium 400           18.4        0.4              7.36               19.88
Tarantilis et al. (2002b)    TKV-TA2        Pentium 400           17.8        0.4              7.12               19.56
Toth and Vigo (2003)         TV-TS          Pentium 200           17.5        0.2              3.5                28.84
Prins (2004)                 P-EA           Pentium 1000          66.6        1.0              66.6               9.23
Reimann et al. (2004)        RDH-AC         Pentium 900           49.3        0.9              44.37              6.08
Li et al. (2005)             LGW-RRT        Athlon 1000           N/A         N/A              N/A                5.03
Pisinger and Ropke (2007)    PR-ALNS        Pentium 3000          10.8        3.0              32.4               4.97
Kytöjoki et al. (2007)       KNBG-VNS       Athlon64 3000+        0.02        3.0              0.06               44.91
Mester and Bräysy (2007)     MB-AGES        Pentium 2800          24.4        2.8              68.32              0.07
This study                   SALS-INF       Pentium 1000          72.5        1.0              72.5               8.07
This study                   SALS-F         Pentium 1000          85.74       1.0              85.74              9.96
4. Conclusions This paper presents a modified local search algorithm, called SALS, to solve the classical VRP. SALS has only one generic parameter which is obtained and updated self-adaptively taking into account the response surface information comes from the problem itself and the performance measure of the algorithm throughout the search process. The most important advantage of SALS is that the algorithm does not need any time and talent to manage parameter optimization. This advantage especially is important in the real-word applications of VRP when the users have neither the time nor the experience to fine-tune the parameters of complicated algorithms. Besides its simplicity, SALS also provides qualified solutions to well-known benchmark problems from the VRP literature within reasonable amount of computation times. Comparison of the proposed algorithm with the recent literature of VRP shows that SALS algorithm has moderate performance in terms of effectiveness and efficiency respect to the other heuristics. However, it should be noted that best solution quality performance of SALS is not far from the solution quality of best performing heuristic from the literature (the difference is 8% in terms of DBS). The users of vehicle routing packages should consider SALS algorithm
as the simplest approach to the VRP, despite a small deterioration in solution quality.

References

Adenso-Diaz, B., & Laguna, M. (2006). Fine-tuning of algorithms using fractional experimental designs and local search. Operations Research, 54(1), 99–114.
Battiti, R., & Tecchiolli, G. (1994). The reactive tabu search. ORSA Journal on Computing, 6(2), 126–140.
Birattari, M. (2009). Tuning metaheuristics: A machine learning perspective. Berlin: Springer.
Christofides, N., & Eilon, S. (1969). An algorithm for the vehicle dispatching problem. Operational Research Quarterly, 20, 309–318.
Cordeau, J.-F., Gendreau, M., Hertz, A., Laporte, G., & Sormany, J.-S. (2005). New heuristics for the vehicle routing problem. In A. Langevin & D. Riopel (Eds.), Logistics systems: Design and optimization (pp. 279–297). New York: Springer.
Dantzig, G. B., & Ramser, J. H. (1959). The truck dispatching problem. Management Science, 6, 80–91.
Dongarra, J. J. (2010). Performance of various computers using standard linear equations software. Computer Science Department, University of Tennessee, Knoxville, CS-89-85. Available from: http://www.netlib.org/benchmark/performance.ps.
Dueck, G. (1993). New optimization heuristics: The great deluge algorithm and the record-to-record travel. Journal of Computational Physics, 104, 86–92.
Eiben, A. E., Hinterding, R., & Michalewicz, Z. (1999). Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3(2), 124–141.
Golden, B. L., Wasil, E. A., Kelly, J. P., & Chao, I.-M. (1998). The impact of metaheuristics on solving the vehicle routing problem: Algorithms, problem sets, and computational results. In T. G. Crainic & G. Laporte (Eds.), Fleet management and logistics. Boston: Kluwer.
Kytöjoki, J., Nuortio, T., Bräysy, O., & Gendreau, M. (2007). An efficient variable neighbourhood search heuristic for very large scale vehicle routing problems. Computers & Operations Research, 34, 2743–2757.
Laporte, G., & Semet, F. (2002). Classical heuristics for the capacitated VRP. In P. Toth & D. Vigo (Eds.), The vehicle routing problem (pp. 109–128). Philadelphia: SIAM.
Li, F., Golden, B., & Wasil, E. (2005). Very large-scale vehicle routing: New test problems, algorithms, and results. Computers & Operations Research, 32, 1165–1179.
Mester, D., & Bräysy, O. (2007). Active guided evolution strategies for large scale capacitated vehicle routing problems. Computers & Operations Research, 34(10), 2964–2975.
Pisinger, D., & Ropke, S. (2007). A general heuristic for vehicle routing problems. Computers & Operations Research, 34(8), 2403–2435.
Prins, C. (2004). A simple and effective evolutionary algorithm for the vehicle routing problem. Computers & Operations Research, 31, 1985–2002.
Reimann, M., Doerner, K., & Hartl, R. F. (2004). D-ants: Savings based ants divide and conquer the vehicle routing problem. Computers & Operations Research, 31(4), 563–591.
Russell, R. A., & Chiang, W.-C. (2006). Scatter search for the vehicle routing problem with time windows. European Journal of Operational Research, 169(2), 606–622.
Tarantilis, C. D. (2005). Solving the vehicle routing problem with adaptive memory programming methodology. Computers & Operations Research, 32(9), 2309–2327.
Tarantilis, C. D., & Kiranoudis, C. T. (2002). BoneRoute: An adaptive memory-based method for effective fleet management. Annals of Operations Research, 115(1), 227–241.
Tarantilis, C. D., Kiranoudis, C. T., & Vassiliadis, V. S. (2002a). A backtracking adaptive threshold accepting metaheuristic method for the vehicle routing problem. Systems Analysis Modelling Simulation (SAMS), 42(5), 631–644.
Tarantilis, C. D., Kiranoudis, C. T., & Vassiliadis, V. S. (2002b). A list based threshold accepting algorithm for the capacitated vehicle routing problem. International Journal of Computer Mathematics, 79(5), 537–553.
Tian, P., Ma, J., & Zhang, D.-M. (1999). Application of the simulated annealing algorithm to the combinatorial optimization problem with permutation property: An investigation of generation mechanism. European Journal of Operational Research, 118, 81–94.
Toth, P., & Vigo, D. (2002). The vehicle routing problem. SIAM Monographs on Discrete Mathematics and Applications. Philadelphia: SIAM.
Toth, P., & Vigo, D. (2003). The granular tabu search (and its application to the vehicle routing problem). INFORMS Journal on Computing, 15(4), 333–348.