0 – 1 Quadratic Knapsack Problem solved with VNS algorithm

0 – 1 Quadratic Knapsack Problem solved with VNS algorithm

Available online at www.sciencedirect.com Electronic Notes in Discrete Mathematics 47 (2015) 269–276 www.elsevier.com/locate/endm 0 − 1 Quadratic Kn...

215KB Sizes 2 Downloads 61 Views

Available online at www.sciencedirect.com

Electronic Notes in Discrete Mathematics 47 (2015) 269–276 www.elsevier.com/locate/endm

0 − 1 Quadratic Knapsack Problem solved with VNS algorithm Said Toumi 1 Mohamed Cheikh 2 Bassem Jarboui 3 MODILS Lab, FSEGS, Sfax University, Sfax, Tunisia

Abstract In this paper we propose two variants of the variable neighborhood search heuristics to solve the (0 − 1) Quadratic Knapsack Problem. The mixed variable neighborhood descent (Mixed-VND) and the Skewed general variable neighborhood search (SGVNS). The proposed algorithms were tested over a set of large-sized instances with 1000 and 2000 binary variables and the results were compared to those reported from the literature. Encouraging results are obtained. Keywords: Variable neighborhood search, 0 − 1 quadratic knapsack problem, MIXED-VND, SGVNS.

1

Introduction

The binary Quadratic Knapsack Problem was introduced for the first time by Gallo et al.([2]). In general the problem can be formulated as follows: 1 2 3

Email: [email protected] Email: [email protected] Email: bassem [email protected]

http://dx.doi.org/10.1016/j.endm.2014.11.035 1571-0653/© 2014 Elsevier B.V. All rights reserved.

270

S. Toumi et al. / Electronic Notes in Discrete Mathematics 47 (2015) 269–276

max st

n i i=1

n

j=1 Pij yi yj

j=1 wj yj

≤C

yj ∈ {0, 1} , j = 1, . . . , n Where Pij is a symmetric profit matrix, wj (j = 1, . . . , n) is the weight of item j and C is the capacity of knapsack. yj is a binary decision variable taking a value 1 if item j is selected and 0 otherwise. The Quadratic Knapsack Problem calls for selecting an item subset whose overall weight does not surpass a given knapsack capacity C, so as to maximize the overall profit. Many studies have been performed using exact and heuristic methods for solving this problem. The first branch and bound algorithm was developed in [2]. Later, many works using a branch and bound and exact algorithms were developed [1,7]. This problem is known to be NP-hard and the exact methods are only suitable for small-sized instances. Therefore, to solve large-sized instances, different heuristics were proposed in litterature such as: linearization and exchange heuristic [4], mini-swarm approach [8], GRASP and tabu search algorithm [9]. In this paper we propose a mixed variable neighborhood descent [6] method using sequential variable neighborhood descent [5] nested with one neighborhood structure and a Skewed general variable neighborhood search that mix a stochastic approach, when performing K random moves in the neighborhood to ensure the diversification of the search space, and a deterministic approach using a sequential variable neighborhood descent to ensure the intensification of the search space. A comparative study was performed, on large-sized instances, with the previous works in the literature. Based on experimentation, our proposed approach provides better results than the recent works. The outline of this paper is as follows. The next section provides the main aspects of the proposed algorithms. In the third section, computational results are reported, analyzed and compared to state-of-the-art heuristic [9]. This is followed by the conclusions.

2

Variable neighborhood search to solve the QKP

The variable neighborhood search (VNS) [3], based on local search methods, uses a set of neighborhood structures and browse through the distant neighborhoods of the current solution looking for a better solution and to leave the local optima if reached. In this paper, at first, a Mixed variable neighborhood descent (Mixed-VND) combining (Nested-VND) and (Seq-VND) are

S. Toumi et al. / Electronic Notes in Discrete Mathematics 47 (2015) 269–276

271

proposed. Next, we propose a Skewed general variable neighborhood search (SGVNS) that uses a sequential variable neighborhood descent (Seq-VND) as a local search. More details are provided in the following sections.

2.1 Solution representation and initialization In the proposed algorithms, a feasible solution y is represented by a binary vector, wherein (yi = 1) if the ith item is taken in the knapsack, 0 if not. To construct a good initial solution we use a greedy function (Algorithm 1) similar to proposed  that  by [9]. We select at first, the item i maximising the profit n Pii + j=1 Pij per unit of its weight (wi ). Next, item is selected, when its weight does not exceed the remaining capacity of the knapsack and it has the maximum profit per unit of weight considering the already selected items. Algorithm 1 Initial solution 2: 3:

Input: The set of items O = {1, . . . , n}, Set of items inside knapsack I = {i ∈ O|yi = 1}, Weight of items inside knapsack WI Initialization: O  ← O, I ← ,   WI ← 0 n /w ; i ← argmax P ij i j=1

4: 5: 6: 7:

yi ← 1; WI ← WI + wi ; O  ← O  \ {i }; I ← I ∪ {i }; while (∃i ∈ O  |W I + wi ≤ C) do     Pii + j∈I Pij /wi |WI + wi ≤ C ; i ← argmax

1:

8: 9: 10: 11:

i∈O 

i∈O 

yi ← 1; WI ← WI + wi ; O  ← O  \ {i }; I ← I ∪ {i }; end while return I;

2.2 Variable neighborhood descent The variable neighborhood descent (VND)([6,5,3]) is a deterministic variant of VNS. Our proposed VND includes two neighborhood structures looking for the best set of items to put in the knapsack. The first neighborhood structure N1 means swapping the situation (i.e. exists or not in the knapsack) of two items. We switch an item i from the knapsack by an external item j. The second neighborhood structure N2 corresponds to a movement of insertion or removal. We randomly select an item i and we perform (yi ← 1 − yi ), i.e. if the selected item is not in the knapsack, it will be put in, otherwise it will be removed.

272

S. Toumi et al. / Electronic Notes in Discrete Mathematics 47 (2015) 269–276

2.2.1 Seq-VND In the sequential VND, the local search is performed by considering the used neighborhood structures sequentially in accordance to their order. The algorithm goes from one neighborhood to the next if there is no improvement in the solution. As soon as an improvement is found, the algorithm returns to the first neighborhood in the list. If the last neighborhood provides no improvement, the algorithm stops. Steps of the proposed Seq-VND method are given in Algorithm 2. Algorithm 2 Seq-VND 1: Input: The set of neighborhood Nh , h = 1, 2 2: Initialization: Find an initial solution y; h ← 1 3: while h ≤ 2 do 4: if y is not a local optimum for the neighborhood structure Nh 5: y  ← first solution in Nh (y) such that f (y  ) > f (y); 6: y ← y ; 7: h ← 1; 8: else 9: h ← h + 1; 10: end if 11: end while 12: return y;

then

2.2.2 Nested-VND In the pure nested strategy ([5]), a cardinality of visited solution is much larger than in Seq-VND. In fact, if we consider h = 2 neighborhood structure, a Nested-VND applies a local search using the first neighborhood N1 in any point of the second N2, which involves the exploration ofa number of solu  h=2 tions ≤ h=2 |N (y)| , while in the Seq-VND we explore ≤ |N (y)| h h h=1 h=1 solutions. 2.2.3 Mixed-VND In the proposed Mixed-VND a Seq-VND will be applied in each point y  of the second neighborhood structure N2 of the current solution y(y  ∈ N2 (y)). The steps of the proposed Mixed-VND method are given in Algorithm 3. 2.2.4 SGVNS In this section we present the proposed Skewed general variable neighborhood search (SGVNS) algorithms. The main steps of this algorithm are Initialization, Shaking, Seq-VND as local search and Evaluation. These steps are detailed in Algorithm 4.

S. Toumi et al. / Electronic Notes in Discrete Mathematics 47 (2015) 269–276

273

Algorithm 3 Mixed-VND 1: Input: The set of neighborhood Nh , h = 1, 2 2: Initialization: Find an initial solution y; Let L ← N2 (y) 3: repeat 4: Select y  ∈ L; 5: h ← 1; 6: while h ≤ 2 do 7: if y  is not a local optimum for the neighborhood structure Nh 8: y  ← first solution in Nh (y  ) such that f (y  ) > f (y  ); 9: y  ← y  ; 10: h ← 1; 11: else 12: h ← h + 1; 13: end if 14: if f (y  ) > f (y) then 15: y ← y ; 16: L ← N2 (y); 17: else 18: L ← L − {y  }; 19: end if 20: end while 21: until L = ; 22: return y;

then

Algorithm 4 SGVNS 1: Input: The set of neighborhood N2h , h = 1, . . . , hmax = 5; 2: Initialization: Find an initial solution y; ybest ← y; 3: repeat 4: h ← 1; 5: while h ≤ hmax do 6: y  ← a random neighbor in N2h (y); 7: y  ← Seq-VND(y  ); 8: if f (y  ) > f (ybest ) then 9: ybest ← y  ; 10: end if 11: if f (y  ) > (1 − α · dist (y, y  )) · f (y) then 12: y ← y  ; 13: h ← 1; 14: else 15: h ← h + 1; 16: end if 17: end while 18: until A termination condition is met 19: return ybest ;

First, we initialize the algorithm. Then we apply a shaking that performs h random moves in order to generate a solution that will be used as a basis to the Seq-VND presented above. Finally, the evaluation step will tell us about possible improvements and if we may move or not. 2.2.5 Shaking The shaking step helps diversify the search. In our case, it generates a random solution y  by performing h moves in the second neighborhood of solution y

274

S. Toumi et al. / Electronic Notes in Discrete Mathematics 47 (2015) 269–276

   y ∈ N2h (y) . i.e. it performs (yi ← 1 − yi ) which removes or inserts h items in the knapsack. 2.2.6 Evaluation In this step, SGVNS uses a function dist (y, y ) to measure the distance between y  resulted from Seq-VND and the incumbent solution y.  |yi −y | dist (y, y ) = ni=1 n i As soon as f (y ) > (1 − α · dist (y, y )) · f (y), (α > 0), the local search continues by updating (h ← 1).

3

Computational results

Our SGVNS and Mixed-VND algorithms were tested using Large-sized instances with 1000 and 2000 binary variables (these instances are available from [9]). The associated algorithms where coded in C++ and executed on a dell laptop [email protected],3Go RAM. A comparative study is carried out with the results found by the algorithm (Grasp + T abu)r proposed in [9]. When comparing the results we will refer to the following performance measures proposed in [9]: (SR) success ratio, the number of experiments when the best objective value of algorithm achieves (Best) in 100 trials; (RP D) relative percentage deviation, the average gap ((Best − best objective value of algorithm )/ Best × 100) in percentage for 100 trials; (CP U(s)) the average computational time for one experiment. (Best) used in calculating SR and RP D is the best result among those found by (Grasp + T abu)r , Mixed-VND or SGVNS algorithms in 100 trials. The parameters setting in SGVNS is (hmax = 5, α = 1/30). From the Table (1) we can see that each one of the algorithms MixedVND, (Grasp + T abu)r and SGVNS fails to reach (Best) respectively in 22, 23 and 5 instances. The SGVNS algorithm provides better or equal objective solution values in comparison with the (Grasp + T abu)r in all instances, and better values than Mixed-VND in 22 instances. The Mixed-VND algorithm provides better results than (Grasp + T abu)r in 19 instances and less good in 11 instances, and better values than SGVNS in 5 instances. Based on the (RPD) performance measure, SGVNS and (Grasp + T abu)r algorithms provide better results than Mixed-VND. The average computational time is 63.35 in CPU seconds for Mixed-VND, 107.05 for SGVNS and 288.39 for Mixed-VND. Finally, based on these results we can conclude that SGVNS algorithm outperforms the Mixed-VND and (Grasp + T abu)r algorithms.

S. Toumi et al. / Electronic Notes in Discrete Mathematics 47 (2015) 269–276

275

Table 1 Experimental results on large-sized instances. (Grasp + T abu)r SR RPD CPU(s) 1000 25 1 6172407 100 0 29.41 1000 25 2 229941 66 8.30E-03 32.98 1000 25 3 172418 100 0 21.66 1000 25 4 367426 100 0 26.11 1000 25 5 4885611 100 0 37.69 1000 25 6 15689 100 0 8.18 1000 25 7 4945810 100 0 36.51 1000 25 8 1710198 100 0 71.21 1000 25 9 496315 100 0 30.03 1000 25 10 1173792 100 0 58.93 1000 50 1 5663590 100 0 50.74 1000 50 2 180831 100 0 1.44 1000 50 3 11384283 100 0 31.86 1000 50 4 322226 100 0 22.06 1000 50 5 9984247 86 1.50E-04 40.83 1000 50 6 4106261 100 0 58.08 1000 50 7 10498370 84 1.30E-04 33.43 1000 50 8 4981146 20 1.20E-02 116.29 1000 50 9 1727861 100 0 52.77 1000 50 10 2340724 94 2.30E-04 95.28 1000 75 1 11570056 65 7.60E-05 64 1000 75 2 1901389 100 0 32.47 1000 75 3 2096485 100 0 39.86 1000 75 4 7305321 100 0 55.09 1000 75 5 13970842 4.31E-03 37.39 1000 75 6 12288738 100 0 33.44 1000 75 7 1095837 100 0 23.16 1000 75 8 5575813 100 0 68.47 1000 75 9 695774 100 0 22.68 1000 75 10 2507677 100 0 47.32 1000 100 1 6243494 100 0 72.01 1000 100 2 4854086 61 1.30E-03 84.84 1000 100 3 3172022 100 0 47.06 1000 100 4 754727 100 0 23.63 1000 100 5 18647736 5.98E-03 39.15 1000 100 6 16020232 1.20E-02 41.58 1000 100 7 12936205 100 0 44.5 1000 100 8 6927738 100 0 96.05 1000 100 9 3874959 100 0 52.28 1000 100 10 1334494 100 0 23.63 2000 25 1 5268188 100 0 516.57 2000 25 2 13294030 100 0 330.73 2000 25 3 5500433 56 4.45E-04 800.13 2000 25 4 14625118 100 0 346.89 2000 25 5 5975751 100 0 738.33 2000 25 6 4491691 100 0 474.6 2000 25 7 6388756 100 0 558.21 2000 25 8 11769873 100 0 446.95 2000 25 9 10960328 100 0 449.81 2000 25 10 139236 100 0 109.79 2000 50 1 7070736 39 2.70E-02 474.32 2000 50 2 12587545 100 0 534.87 2000 50 3 27268632 1.08E-03 308.88 2000 50 4 17755416 5.53E-03 782.66 2000 50 5 16806112 3.70E-03 1490.22 2000 50 6 23076348 1.30E-03 460.09 2000 50 7 28760992 4.29E-03 714.18 2000 50 8 1580242 100 0 165.18 2000 50 9 26524420 2.37E-03 342.12 2000 50 10 24747888 3.39E-03 408.39 2000 75 1 25123188 5.37E-03 807.05 2000 75 2 12664670 89 4.70E-04 510.05 2000 75 3 43942624 2.89E-03 276.39 2000 75 4 37497160 1.46E-03 354.13 2000 75 5 24835452 2.03E-03 684.33 2000 75 6 45138432 1.49E-03 306.47 2000 75 7 25502694 3.37E-04 490.14 2000 75 8 10067892 100 0 344.83 2000 75 9 14177079 3.60E-02 532.06 2000 75 10 7815755 78 8.70E-05 325.22 2000 100 1 37931456 4.08E-03 435.71 2000 100 2 33667504 6.00E-02 791.51 2000 100 3 29952660 2.14E-03 1489.29 2000 100 4 26950588 4.89E-03 710.79 2000 100 5 22042306 2.68E-03 752.02 2000 100 6 18869620 3.88E-03 548.19 2000 100 7 15850597 100 0 578.18 2000 100 8 13628967 100 0 374.07 2000 100 9 8394562 97 2.20E-04 304.31 2000 100 10 4923559 92 1.40E-04 200.05 (−) Indicate that the algorithm does not achieves (best) ndl

4

Best

SR 100 72 91 100 12 84 41 11 75 83 12 74 100 32 100 100 100 100 100 100 100 100 100 10 21 100 100 100 42 100 72 62 40 100 100 21 35 34 12 23 32 11 21 56 100 23 1 100 1 1 10 1 100 76 1 22 33 100 -

Mixed − V ND RPD 0 8.56E-03 3.94E-03 0 2.77E-03 2.10E-01 3.39E-04 6.13E-03 3.75E-03 5.11E-05 1.88E-03 4.90E-02 1.33E-04 0 6.44E-04 0 9.43E-04 0 0 2.30E-02 4.06E-05 0 0 0 0 0 0 1.70E-02 3.00E-02 0 0 0 3.08E-03 0 6.79E-03 3.98E-04 2.59E-04 5.80E-04 0 0 9.33E-04 2.97E-03 1.01E-03 1.69E-04 1.17E-03 3.29E-03 8.17E-04 2.74E-03 1.02E-03 2.00E-02 0 2.49E-03 7.35E-04 6.26E-03 4.21E-04 1.53E-03 9.14E-04 0 1.12E-03 1.62E-03 3.03E-03 8.79E-03 6.72E-03 1.87E-03 4.85E-04 1.79E-03 3.62E-04 3.80E-03 0 8.71E-04 4.99E-03 3.78E-03 7.99E-04 2.78E-03 3.12E-03 3.02E-03 1.51E-05 1.92E-03 0 0.025

CPU(s) 4.336 1.466 1.28 1.81 15.194 0.249 202.035 3.806 1.529 4.155 4.867 0.624 8.827 1.009 11.173 5.413 8.359 3.888 4.424 2.507 4.471 2.644 1.984 6.913 4.48 6.04 1.37 3.183 1.025 2.4 3.678 3.108 2.134 0.87 9.417 5.783 4.562 22.004 2.334 1.456 122.965 182.475 36.005 47.502 56.065 85.684 40.17 98.854 112.17 3.167 20.997 34.756 189.41 41.855 63.336 210.889 215.608 8.487 163.208 214.469 364.945 22.041 52.77 642.778 44.04 693.104 106.829 18.985 23.866 18.693 41.637 615.057 249.008 178.526 391.192 211.739 39.109 35.522 16.583 10.39

SR 100 100 100 100 11 100 35 65 100 100 100 100 100 100 23 100 23 100 100 100 100 100 100 100 100 100 100 100 100 100 100 100 100 100 1 100 100 100 100 100 34 12 76 34 87 56 88 12 56 100 100 77 1 1 1 1 100 1 100 1 1 1 1 100 100 53 1 1 1 1 1 100 100 100 11

SGV NS RPD 0 0 0 0 3.12E-03 0 8.03E-04 1.61E-03 0 0 0 0 0 0 6.97E-04 0 6.40E-04 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 4.62E-03 0 0 0 0 0 4.63E-04 1.96E-03 3.02E-04 1.91E-05 1.14E-04 5.69E-04 5.32E-05 1.55E-03 1.97E-04 0 0 2.78E-04 6.87E-04 3.78E-03 3.09E-04 1.24E-03 7.13E-04 0 1.65E-03 9.08E-04 1.89E-04 0 5.58E-03 1.55E-03 4.16E-04 2.56E-03 2.82E-04 0 0 1.63E-03 3.54E-03 2.23E-03 7.93E-04 8.86E-04 1.49E-03 1.08E-03 0 0 0 0.012

CPU(s) 4.337 1.56 0.858 1.467 74.738 0.265 4.633 11.203 1.482 2.2 4.04 0.952 4.165 0.963 38.938 4.82 32.573 4.389 2.396 5.335 4.15 1.968 1.997 4.258 4.305 6.521 1.341 3.198 0.951 2.481 3.026 4.961 2.636 0.889 504.049 3.557 6.818 13.57 2.356 1.404 70.996 478.725 64.163 176.138 29.312 53.976 30.077 161.601 138.52 5.772 23.369 27.409 122.131 707.997 241.458 294.809 258.701 9.345 190.497 128.439 330.511 29.141 77.74 648.552 96.689 757.156 122.058 21.326 23.323 45.692 876.012 409.067 238.385 263.875 285.793 273.125 25.941 22.776 21.138 6.298

Conclusion

In this paper Mixed-VND and SGVNS using two neighborhood structures and Seq-VND as local search, are proposed to solve the (0 − 1) QKP. The exper-

276

S. Toumi et al. / Electronic Notes in Discrete Mathematics 47 (2015) 269–276

imental results in large-sized instances shows that SGVNS algorithm ouperforms the Mixed-VND and the state-of-the-art heuristic (Grasp + T abu)r in terms of quality of results and the time spent to find them. For future work, the proposed SGVNS algorithm may be adapted for other variants of the quadratic knapsack problem.

References [1] Chaillou P., P. Hansen and Y. Mahieu, Best network flow bound for the quadratic knapsack problem, Combinatorial Optimization of Lecture Notes in Mathematics 1403 (1986), pp. 225–235. [2] Gallo, G., P. Hammer and B. Simeone, Quadratic knapsack problem, Mathematical Programming Study 12 (1980), pp. 132–149. [3] Hansen P. and N. Mladenovi´c, Variable neighborhood search: Principles and applications, European journal of Operational Research 130 (2001), pp. 449– 467. [4] Hammer P. L. and D. J. Rader, Efficient methods for solving quadratic 0-1 knapsack problems, Information Systems and Operational Research 35 (1997), pp. 170–182. [5] Ili´c A., D. Uroˇsevi´c, J. Brimberg and N. Mladenovi´c, A general variable neighborhood search for solving the uncapacitated single allocation p-hub median problem, European journal of Operational Research 206 (2010), pp. 289–300. [6] Jarboui B., H. Derbel, S. Hanafi and N. Mladenovi´c, Variable neighborhood search for location routing, Computers & Operations Research 40 (2013), pp. 47–57. [7] Michelon P. and L. Veuilleux, Lagrangean methods for the 0-1 quadratic knapsack problem, European Journal of Operational Research 92 (1996), pp. 326–341. [8] Xie X.-F. and J. Liu, A Mini-swarm for the quadratic knapsack problem, in Proc. of the IEEE Swarm Intelligence Symposium (2007), pp. 190–197. [9] Yang Z., G. Wang and F. Chu, An effective GRASP and tabu search for the 0-1 quadratic knapsack problem, Computers & Operations Research 40 (2013), pp. 1176–1185.