A hybrid genetic algorithm and particle swarm optimization for multimodal functions


Applied Soft Computing 8 (2008) 849–857 www.elsevier.com/locate/asoc

Yi-Tung Kao a, Erwie Zahara b,*

a Department of Computer Science and Engineering, Tatung University, Taipei City 104, Taiwan, ROC
b Department of Industrial Engineering and Management, St. John's University, Tamsui 251, Taiwan, ROC

Received 1 December 2006; received in revised form 25 June 2007; accepted 1 July 2007. Available online 7 July 2007.

Abstract

Heuristic optimization provides a robust and efficient approach to solving complex real-world problems. The focus of this research is a hybrid method combining two heuristic optimization techniques, genetic algorithms (GA) and particle swarm optimization (PSO), for the global optimization of multimodal functions. Denoted GA-PSO, this hybrid technique incorporates concepts from GA and PSO and creates individuals in a new generation not only by the crossover and mutation operations found in GA but also by the mechanisms of PSO. Experimental studies on a suite of 17 multimodal test functions taken from the literature demonstrate the superiority of the hybrid GA-PSO approach over four other search techniques in terms of both solution quality and convergence rate.
© 2007 Published by Elsevier B.V.

Keywords: Heuristic optimization; Multimodal functions; Genetic algorithms; Particle swarm optimization

1. Introduction

In the last two decades there has been a growing interest in evolutionary computing, which has inspired new ways of solving optimization problems. In contrast to traditional optimization methods, which emphasize accurate and exact computation but may fail to reach the global optimum, evolutionary computation provides a more robust and efficient approach to solving complex real-world problems [1,2]. Among existing evolutionary algorithms, the best-known branch is the genetic algorithm (GA). GA is a stochastic search procedure based on the mechanics of natural selection, genetics and evolution [3]. Since this type of algorithm simultaneously evaluates many points in the search space, it is more likely to find the global solution of a given problem. In addition, it uses only a simple scalar performance measure and requires no derivative information, so methods classified as GA are easy to use and implement. More recently, based upon the interaction of individual entities called "particles," Kennedy and Eberhart [4,5] proposed a new heuristic algorithm called "particle swarm optimization"

* Corresponding author. E-mail address: [email protected] (E. Zahara).
1568-4946/$ – see front matter © 2007 Published by Elsevier B.V. doi:10.1016/j.asoc.2007.07.002

(denoted PSO). The development of this algorithm follows from observations of the social behavior of animals, such as bird flocking and fish schooling. The theory of PSO describes a solution process in which each particle flies through the multidimensional search space while its velocity and position are constantly updated according to the best previous performance of the particle or of the particle's neighbors, as well as the best performance of the particles in the entire population. Compared with GA, PSO has some attractive characteristics. It has memory, so knowledge of good solutions is retained by all the particles, whereas in GA previous knowledge of the problem is discarded once the population changes. It has constructive cooperation between particles: particles in the swarm share information among themselves. To date, PSO has been successfully applied to optimizing various continuous nonlinear functions in practice [6]. Hybridization of evolutionary algorithms with local search has been investigated in many studies [7–9]; such a hybrid is often referred to as a memetic algorithm. In the case at hand, we combine two global optimization algorithms, GA and PSO: both work with an initial population of solutions, so combining the searching abilities of the two methods seems a reasonable approach. Originally, PSO functions according to knowledge of social interaction, and all individuals are taken into account in each generation. On the


contrary, GA simulates evolution, and some individuals are selected while others are eliminated from generation to generation. Taking advantage of the complementary properties of GA and PSO, we propose a new algorithm that combines the evolutionary natures of both (denoted GA-PSO). The robustness of GA-PSO is tested against a set of benchmark multimodal test functions collected from Siarry and Berthiau [10], and the results are compared extensively with those obtained by the continuous genetic algorithm, the continuous hybrid algorithm, the hybrid Nelder–Mead simplex search and particle swarm optimization, and the hybrid continuous tabu search and Nelder–Mead simplex algorithm.

2. Genetic algorithms and particle swarm optimization

2.1. Genetic algorithms (GA)

Genetic algorithms (GA) date back to Holland in the 1960s and were further described by Goldberg [3]. GA is a randomized global search technique that solves problems by imitating processes observed in natural evolution. Based on the survival and reproduction of the fittest, GA continually exploits new and better solutions without any pre-assumptions such as continuity or unimodality. GA has been successfully adopted in many complex optimization problems and shows its merits over traditional optimization methods, especially when the system under study has multiple local optimum solutions. GA evolves a population of candidate solutions. Each solution is usually coded as a binary string called a chromosome. The fitness of each chromosome is evaluated with a performance function after the chromosome has been decoded. Upon completion of the evaluation, a biased roulette wheel is used to randomly select pairs of better chromosomes to undergo genetic operations, such as crossover and mutation, that mimic nature. Should the newly produced chromosomes turn out to be stronger than the weaker ones from the previous generation, they replace these weaker chromosomes. This evolution process continues until the stopping criteria are met.

A real-coded GA uses a vector of floating-point numbers instead of 0's and 1's to implement chromosome encoding. The crossover operator of a real-coded GA is constructed by borrowing the concept of the linear combination of vectors from convex set theory. The random mutation operator proposed for real-coded GA operates on a gene by replacing it with a random number drawn from the variable's domain. With some modifications of the genetic operators, real-coded GA has shown better performance than binary-coded GA on continuous problems [11].
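The two real-coded operators just described can be sketched in a few lines. This is a minimal illustration under our own assumptions (the helper names and the choice of drawing one λ per parent pair are not from the paper):

```python
import random

def crossover(p1, p2):
    """Real-coded crossover as a convex (linear) combination of two parent
    vectors; each child lies on the segment between the parents."""
    lam = random.random()
    c1 = [lam * a + (1 - lam) * b for a, b in zip(p1, p2)]
    c2 = [(1 - lam) * a + lam * b for a, b in zip(p1, p2)]
    return c1, c2

def mutate(chrom, bounds, rate=0.2):
    """Random mutation: with probability `rate`, a gene is replaced by a
    uniform random number drawn from that variable's domain."""
    return [random.uniform(lo, hi) if random.random() < rate else gene
            for gene, (lo, hi) in zip(chrom, bounds)]
```

Because the children are convex combinations, they always remain inside the box spanned by the two parents, while mutation can reintroduce any point of the search domain.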

2.2. Particle swarm optimization (PSO)

Particle swarm optimization (PSO) is one of the latest evolutionary optimization techniques, developed by Eberhart and Kennedy [4]. The PSO concept is based on a metaphor of social interaction such as bird flocking and fish schooling. The particles, which are potential solutions in the PSO algorithm, fly around the multidimensional search space, and the positions of individual particles are adjusted according to each particle's previous best position and the neighborhood best or the global best. Since all particles in PSO are kept as members of the population throughout the search process, PSO is the only evolutionary algorithm that does not implement survival of the fittest. Simple and economical in both concept and computational cost, PSO has been shown to successfully optimize a wide range of continuous optimization problems [12,13].

3. Hybrid genetic algorithm and particle swarm optimization

This section discusses the infrastructure and rationale of the hybrid algorithm. Fig. 1 depicts a schematic representation of the proposed hybrid GA-PSO. As can be seen, GA and PSO both work with the same initial population. When solving an N-dimensional problem, the hybrid approach takes 4N randomly generated individuals. These individuals may be regarded as chromosomes in the case of GA, or as particles in the case of PSO. The 4N individuals are sorted by fitness, and the top 2N individuals are fed into the real-coded GA to create 2N new individuals by crossover and mutation operations, as shown in Fig. 1. The crossover operator of the real-coded GA is implemented as a linear combination of two vectors, which represent two individuals in our algorithm, with a 100% crossover probability. The random mutation operator proposed for the real-coded GA modifies an individual with a random number in the problem's domain with a 20% probability. The effect of the mutation rate is discussed in Section 4.1. The 2N new individuals created by the real-coded GA are used to adjust the remaining 2N particles by the PSO method. The procedure for adjusting the 2N particles involves selection of the global best particle, selection of the neighborhood best particles, and finally the velocity updates. The global best particle of the population is determined according to the sorted fitness values. The neighborhood best particles are selected by first evenly dividing
Fig. 1. Schematic representation of the GA-PSO hybrid. (→) Associated with the GA operations. (- - →) Associated with the PSO operations.
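The velocity-and-position update that drives the PSO side can be illustrated with a short sketch. This is not the paper's exact formulation (its update rules, Eqs. (5) and (6), appear in Fig. 2); the inertia weight w and the acceleration coefficients c1, c2 below are common illustrative choices, not values taken from the paper:

```python
import random

def pso_step(positions, velocities, pbest, pbest_val, gbest, f,
             w=0.7, c1=1.5, c2=1.5):
    """One synchronous PSO iteration: every particle's velocity is pulled
    toward its personal best and the global best, its position is moved,
    and the personal bests are refreshed."""
    for i, x in enumerate(positions):
        for d in range(len(x)):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - x[d])
                                + c2 * r2 * (gbest[d] - x[d]))
            x[d] += velocities[i][d]
        val = f(x)
        if val < pbest_val[i]:            # particle improved on its own best
            pbest[i], pbest_val[i] = list(x), val
    g = min(range(len(positions)), key=lambda i: pbest_val[i])
    return list(pbest[g]), pbest_val[g]   # new global best and its value
```

On a simple sphere function, repeated calls drive the swarm's best value toward zero, which is the "memory plus cooperation" behavior described in Section 2.2.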


Fig. 2. The GA-PSO hybrid algorithm.



Table 1
Effect of mutation rate on the solution quality and performance of GA-PSO

Test       Performance                Mutation rate (%)
function                              1           5           10          15          20          25

RC         Success rate (%)           100         100         100         100         100         100
           Average error              0.00061     0.00149     0.00053     0.00083     0.00009     0.00049
           Average time (s)           2.844       2.978       2.908       2.563       2.117       2.933
           Average func_evaluation    11429       12098       12734       10102       8254        11298

ES         Success rate (%)           100         100         100         100         100         100
           Average error              0.00006     0.00015     0.00008     0.00015     0.00003     0.00006
           Average time (s)           0.198       0.178       0.186       0.192       0.195       0.225
           Average func_evaluation    880         760         787         804         809         918

GP         Success rate (%)           100         100         100         100         100         100
           Average error              0.07092     0.00585     0.00097     0.00100     0.00012     0.00051
           Average time (s)           9.400       6.692       6.689       7.485       6.686       6.642
           Average func_evaluation    38127       26659       26074       29108       25706       25402

B2         Success rate (%)           100         100         100         100         100         100
           Average error              0.00013     0.00003     0.00005     0.00002     0.00001     0.00009
           Average time (s)           0.037       0.036       0.042       0.047       0.044       0.0468
           Average func_evaluation    147         140         163         179         174         162

SH         Success rate (%)           100         100         100         100         100         100
           Average error              0.00017     0.00009     0.00023     0.00021     0.00007     0.00018
           Average time (s)           54.658      36.892      18.271      22.616      25.984      27.915
           Average func_evaluation    215910      143154      70526       86509       96211       104588

R2         Success rate (%)           100         100         100         100         100         100
           Average error              0.42739     0.22121     0.00517     0.00238     0.00064     0.00056
           Average time (s)           5.447       14.164      33.744      32.172      36.822      25.219
           Average func_evaluation    21783       55802       132191      124650      140894      95950

Z2         Success rate (%)           100         100         100         100         100         100
           Average error              0.00013     0.00030     0.00006     0.00027     0.00005     0.00006
           Average time (s)           0.027       0.030       0.030       0.025       0.025       0.030
           Average func_evaluation    100         110         112         96          95          108

DJ         Success rate (%)           100         100         100         100         100         100
           Average error              0.00003     0.00002     0.00001     0.00002     0.00004     0.00003
           Average time (s)           0.049       0.042       0.047       0.053       0.056       0.055
           Average func_evaluation    169         156         170         185         206         182

H3,4       Success rate (%)           100         100         100         100         100         100
           Average error              0.00020     0.00016     0.00015     0.00015     0.00020     0.00024
           Average time (s)           0.497       0.506       0.444       0.555       0.577       0.519
           Average func_evaluation    1914        1920        1688        2032        2117        1908

S4,5       Success rate (%)           100         100         100         100         100         100
           Average error              0.00024     0.00025     0.00021     0.00018     0.00014     0.00029
           Average time (s)           161.322     125.283     150.151     126.225     154.556     152.067
           Average func_evaluation    571912      440400      524484      433624      529344      515792

S4,7       Success rate (%)           100         100         100         100         100         100
           Average error              0.00018     0.00030     0.00017     0.00027     0.00015     0.00025
           Average time (s)           13.488      15.372      12.358      14.924      18.119      18.388
           Average func_evaluation    41611       45451       33979       43611       56825       57314

S4,10      Success rate (%)           100         100         100         100         100         100
           Average error              0.00019     0.00022     0.00024     0.00014     0.00012     0.00019
           Average time (s)           17.973      13.231      16.389      17.245      17.472      23.416
           Average func_evaluation    46030       33597       41234       43158       43314       57989

R5         Success rate (%)           100         100         100         100         100         100
           Average error              0.33370     0.00261     0.00053     0.00024     0.00013     0.00009
           Average time (s)           1122.213    835.414     393.870     289.845     213.922     173.464
           Average func_evaluation    8240526     6086084     2840790     2070242     1527953     1209662

Z5         Success rate (%)           100         100         100         100         100         100
           Average error              0.00000     0.00001     0.00001     0.00002     0.00000     0.00001
           Average time (s)           0.078       0.094       0.094       0.103       0.105       0.094
           Average func_evaluation    326         358         358         392         398         372

H6,4       Success rate (%)           100         100         100         100         100         100
           Average error              0.00032     0.00028     0.00028     0.00031     0.00024     0.00024
           Average time (s)           3.139       3.633       3.622       3.583       3.326       3.189
           Average func_evaluation    11256       13976       11405       13627       12568       11882

R10        Success rate (%)           100         100         100         100         100         100
           Average error              1.08786     0.00603     0.00043     0.00010     0.00005     0.00005
           Average time (s)           2471.923    2471.923    2448.219    2027.30     1358.753    1079.558
           Average func_evaluation    10000040    10000040    9904148     8101236     5319160     4243428

Z10        Success rate (%)           100         100         100         100         100         100
           Average error              0.00000     0.00000     0.00000     0.00000     0.00000     0.00000
           Average time (s)           0.184       0.189       0.199       0.211       0.227       0.242
           Average func_evaluation    784         764         804         848         872         924

the 2N particles into N neighborhoods and designating the particle with the better fitness value in each neighborhood as the neighborhood best particle. Velocity and position updates for each of the 2N particles are then carried out by Eqs. (5) and (6). The result is sorted in preparation for repeating the entire run. The hybrid algorithm, described in Fig. 2, terminates when it satisfies a convergence criterion based on the standard deviation of the objective function values of the N + 1 best individuals of the population. It is defined as follows:

S_f = \left[ \sum_{i=1}^{N+1} \frac{(f(x_i) - \bar{f})^2}{N+1} \right]^{1/2}    (1)

where \bar{f} = \sum_{i=1}^{N+1} f(x_i)/(N+1) and the convergence tolerance is set to 1 × 10^-4. Research regarding the parameter settings in the hybrid GA-PSO algorithm bears further scrutiny.
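Taken together, one possible reading of a full GA-PSO generation is sketched below. Since the paper's own velocity form (Eqs. (5) and (6)) is not reproduced in this excerpt, the memoryless PSO step and its coefficients (both 1.5), as well as the pairwise neighborhood scheme, are illustrative assumptions rather than the authors' exact implementation:

```python
import random

def ga_pso_minimize(f, bounds, iters=300, mut_rate=0.2, seed=3):
    """Sketch of the GA-PSO loop of Section 3: 4N individuals are sorted by
    fitness; the top 2N breed 2N children by convex crossover (probability
    1.0) plus random mutation (probability mut_rate); the bottom 2N are
    moved by a PSO-style step toward their neighborhood best and the global
    best.  The best point ever seen is tracked and returned."""
    rng = random.Random(seed)
    N = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(4 * N)]
    best = min(pop, key=f)

    def clip(x):  # keep moved particles inside the search domain
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    for _ in range(iters):
        pop.sort(key=f)
        elite, rest = pop[:2 * N], pop[2 * N:]

        # GA half: 100% convex crossover on elite pairs, then random mutation.
        children = []
        for i in range(0, 2 * N, 2):
            lam = rng.random()
            for pa, pb in ((elite[i], elite[i + 1]), (elite[i + 1], elite[i])):
                c = [lam * a + (1 - lam) * b for a, b in zip(pa, pb)]
                children.append([rng.uniform(lo, hi)
                                 if rng.random() < mut_rate else g
                                 for g, (lo, hi) in zip(c, bounds)])

        # PSO half: pair the remaining 2N particles into N neighborhoods.
        gbest = min(children + elite, key=f)
        moved = []
        for k, x in enumerate(rest):
            nbest = min(rest[k], rest[k ^ 1], key=f)  # better of the pair
            step = [1.5 * rng.random() * (nb - xi)
                    + 1.5 * rng.random() * (gb - xi)
                    for xi, nb, gb in zip(x, nbest, gbest)]
            moved.append(clip([xi + si for xi, si in zip(x, step)]))

        pop = children + moved
        best = min(pop + [best], key=f)
    return best
```

Under these assumptions the loop reproduces the division of labor described above: the GA half recombines the fitter half of the population, while the PSO half steers the weaker half toward the current bests.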

4. Computational experiments

This section focuses on the efficiency of GA-PSO as tested against 17 benchmark functions with 2–10 variables, which are listed in Appendix A. To avoid attributing the optimization results to the choice of a particular initial population, and to conduct fair comparisons, we perform each test 100 times, starting from various randomly selected points in the hyper-rectangular search domain given in the literature. The algorithm is coded in Matlab 7.0 and the simulations are run on a Pentium IV 2.4 GHz with 512 MB of memory. To evaluate the algorithm's efficiency and effectiveness, we apply the following criteria to the 100 minimizations per test function: the rate of successful minimizations, the average number of function evaluations, the average computation time, and the average error, i.e., the average of the errors between the best successful points found and the known global optimum [7].
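These four criteria can be aggregated from raw run records with a small helper. This is a sketch under our own assumptions: the record layout (best value, evaluation count, seconds) and the success threshold `eps` are not specified in this excerpt:

```python
def summarize(runs, f_star, eps=1e-3):
    """Aggregate repeated minimizations of one test function.  Each run is a
    (best_value, n_evals, seconds) triple; a run counts as successful when
    its error |best_value - f_star| falls below eps.  The average error is
    taken over successful runs only, following the paper's description."""
    errors = [abs(v - f_star) for v, _, _ in runs]
    hits = [e for e in errors if e < eps]
    return {
        "success_rate_pct": 100.0 * len(hits) / len(runs),
        "average_error": sum(hits) / len(hits) if hits else float("nan"),
        "average_evals": sum(n for _, n, _ in runs) / len(runs),
        "average_time_s": sum(t for _, _, t in runs) / len(runs),
    }
```

For example, two runs of which only one lands within the threshold yield a 50% success rate, with the average error computed from the successful run alone.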

Table 2
Results of CGA, CHA and GA-PSO for 17 test functions

           Success rate (%)          Average func. evaluations        Average error
Function   CGA    CHA    GA-PSO      CGA      CHA      GA-PSO         CGA         CHA          GA-PSO
RC         100    100    100         620      295      8254           0.0001      0.0001       0.00009
ES         100    100    100         1504     952      809            0.0010      0.0010       0.00003
GP         100    100    100         410      259      25706          0.0010      0.0010       0.00012
B2         100    100    100         430      132      174            0.0003      0.0000002    0.00001
SH         100    100    100         575      345      96211          0.0050      0.0050       0.00007
R2         100    100    100         960      459      140894         0.0040      0.0040       0.00064
Z2         100    100    100         620      215      95             0.000003    0.000003     0.00005
DJ         100    100    100         750      371      206            0.0002      0.0002       0.00004
H3,4       100    100    100         582      492      2117           0.0050      0.0050       0.00020
S4,5       76     85     100         610      698      529344         0.1400      0.0090       0.00014
S4,7       83     85     100         680      620      56825          0.1200      0.0100       0.00015
S4,10      81     85     100         650      650      43314          0.1500      0.0150       0.00012
R5         100    100    100         3990     3290     1358064        0.1500      0.0180       0.00013
Z5         100    100    100         1350     950      398            0.0004      0.00006      0.00000
H6,4       100    100    100         970      930      12568          0.0400      0.0080       0.00024
R10        80     83     100         21563    14563    5319160        0.0200      0.0080       0.00005
Z10        100    100    100         6991     4291     872            0.000001    0.000001     0.00000


Table 3
List of the various methods used in the comparison

Method                                                                Reference
Hybrid genetic algorithm and particle swarm optimization (GA-PSO)     This paper
Continuous hybrid algorithm (CHA)                                     [7]
Enhanced continuous tabu search (ECTS)                                [10]
Continuous genetic algorithm (CGA)                                    [14]
Enhanced simulated annealing (ESA)                                    [15]
Continuous reactive tabu search, minimum (CRTS minimum)               [16]
Continuous reactive tabu search, average (CRTS average)               [16]
Tabu search (TS)                                                      [17]
INTEROPT                                                              [18]

4.1. Effects of the mutation rate

One of the questions concerning the proposed method is what mutation rate should be used in each iteration. Mutation rates vary from species to species; in human genes the rate is 0.01% or less. However, such a low mutation rate is likely to result in slow convergence, or even convergence to a local optimum, especially for problems with very complex functions. Since mutation serves to accelerate the process of evolution, it would seem that the higher the rate, the faster the process. We therefore examine the effects of different mutation rates on the solution quality as well as the computational effort required to acquire the optimal solution. Six levels of mutation rate (1%, 5%, 10%, 15%, 20%, 25%) are examined, and the experimental results are reported in Table 1. The results suggest that a random mutation rate of 10–25% is sufficient to enhance the performance of the GA-PSO search and the quality of its solutions. A general observation is that as the mutation rate increases, the GA-PSO search tends to find solutions closer to the global optimum, as indicated by smaller average errors. With a 20% mutation rate, nine of the 17 test problems (RC, ES, GP, B2, SH, Z2, S4,5, S4,7 and S4,10) yield relatively smaller average errors. Test problems R5 and R10 need a higher mutation rate of 25% to get near the global optimum in less time. A 20% mutation rate is adopted by GA-PSO in all subsequent comparisons with other methods.

4.2. Comparison with other methods

The continuous genetic algorithm (CGA) [14], the continuous hybrid algorithm (CHA) [7] and GA-PSO are applied to the suite of 17 test problems, and the outcome is tabulated in Table 2. GA-PSO achieves a 100% rate of successful minimization on all 17 classical test functions. Although GA-PSO requires more time and more function evaluations, its average error is smaller than those of CGA and CHA. It is

Table 4
Average number of function evaluations used by nine methods to optimize 10 functions of fewer than five variables

Function   GA-PSO          CHA           CGA           ECTS          CRTS minimum   CRTS average   TS     ESA            INTEROPT
RC         7936            295           620           245           41             38             492    –              4172
ES         845             952           1504          –             –              –              –      –              –
GP         27973           259           410           231           171            248            486    783            6375
R2         119244          459           960           480           –              –              –      796            –
Z2         100             215           620           195           –              –              –      15820          –
DJ         171             371           750           –             –              –              –      –              –
H3,4       1792            492           582           548           609            513            508    698            1113
S4,5       471784 (1.0)    698 (0.85)    610 (0.76)    825 (0.75)    664            812            –      1137 (0.54)    3700 (0.4)
S4,7       37822 (1.0)     620 (0.85)    680 (0.83)    910 (0.80)    871            960            –      1223 (0.54)    2426 (0.6)
S4,10      42358 (1.0)     635 (0.85)    650 (0.81)    898 (0.75)    693            921            –      1189 (0.50)    3463 (0.5)

Table 5
Results provided by NM-PSO, CTSS and GA-PSO for 15 test functions

           Success rate (%)            Average func. evaluations       Average error
Function   NM-PSO   CTSS   GA-PSO      NM-PSO   CTSS   GA-PSO          NM-PSO     CTSS         GA-PSO
RC         100      100    100         151      125    8254            0.00003    0.005        0.00009
ES         100      100    100         165      325    809             0.00004    0.000005     0.00003
GP         100      100    100         217      119    25706           0.00003    0.005        0.00012
B2         100      100    100         240      98     174             0.00003    0.001        0.00001
SH         100      100    100         400      283    96211           0.00002    0.001        0.00007
R2         100      100    100         339      369    140894          0.00003    0.004        0.00064
Z2         100      100    100         135      78     95              0.00003    0.0000003    0.00005
DJ         100      100    100         291      155    206             0.00005    0.0002       0.00004
H3,4       100      100    100         271      225    2117            0.00024    0.005        0.00020
S4,5       100      75     100         1177     538    529344          0.00020    0.007        0.00014
S4,7       100      77     100         1130     590    56825           0.00017    0.001        0.00015
S4,10      100      74     100         1179     555    43314           0.00015    0.001        0.00012
R5         100      –      100         3308     –      1358064         0.00560    –            0.00013
Z5         100      –      100         1394     –      398             0.00026    –            0.00000
H6,4       100      –      100         1541     –      12568           0.00250    –            0.00024

well worth the extra time needed; in other words, GA-PSO is more effective than the other two methods.

The performance of GA-PSO is then compared with nine other published methods, comprising versions of hybrid GA, continuous GA, simulated annealing and tabu search; these methods are listed in Table 3. The experimental results obtained for 10 test functions of fewer than five variables, using the nine different methods, are given in Table 4. We give the average number of function evaluations over 100 runs of each function; some results are not available for some methods. The numbers in parentheses are the ratios of runs for which the algorithm found the global minimum rather than being trapped at a local minimum. Basically, all nine methods have a 100% success rate except on test functions S4,5, S4,7 and S4,10. From Table 4, we notice that the numbers of function evaluations for ES, Z2 and DJ from GA-PSO are smaller than those of the other methods, meaning that GA-PSO is more effective on these problems. The numbers of function evaluations for test functions S4,5, S4,7 and S4,10 are greater than those of the other methods, but the success ratio is better. In view of the algorithm's effectiveness and efficiency as a whole, the hybrid GA-PSO approach remains quite competitive with the other methods.

The performance of GA-PSO has been further compared with two recently published hybrid algorithms: NM-PSO [19] (hybrid Nelder–Mead simplex search and particle swarm optimization) and CTSS [20] (hybrid continuous tabu search and Nelder–Mead simplex algorithm). Experimental data obtained from 15 test functions are given in Table 5. GA-PSO shows the highest accuracy on 10 of the test problems, with a 100% rate of successful performance on all examples. We conclude that the GA-PSO approach remains quite competitive compared with NM-PSO and CTSS.

5. Conclusions

In this paper, GA-PSO has been proposed as a new approach for optimizing multimodal functions. GA-PSO integrates the concept of evolving individuals, originally modeled by GA, with the concept of self-improvement of PSO, where individuals enhance themselves based on social interactions and their private cognition. GA-PSO thus synthesizes the merits of both GA and PSO, and it is a simple yet effective model for handling different kinds of continuous optimization problems. Simulated experiments on the optimization of nonlinear multimodal functions show that GA-PSO is superior to the other methods in its ability to find the global optimum.

Appendix A. List of test functions [10]

Branin RCOC (RC) (two variables):
RC(x_1, x_2) = \left(x_2 - \frac{5.1}{4\pi^2}x_1^2 + \frac{5}{\pi}x_1 - 6\right)^2 + 10\left(1 - \frac{1}{8\pi}\right)\cos(x_1) + 10;
search domain: −5 < x1 < 10, 0 < x2 < 15; no local minimum; three global minima: (x1, x2)* = (−π, 12.275), (π, 2.275), (9.42478, 2.475); RC((x1, x2)*) = 0.397887.

B2 (two variables):
B2(x_1, x_2) = x_1^2 + 2x_2^2 - 0.3\cos(3\pi x_1) - 0.4\cos(4\pi x_2) + 0.7;
search domain: −100 < xj < 100, j = 1, 2; several local minima (exact number unspecified in the usual literature); one global minimum: (x1, x2)* = (0, 0); B2((x1, x2)*) = 0.

Easom (ES) (two variables):
ES(x_1, x_2) = -\cos(x_1)\cos(x_2)\exp\left(-\left((x_1 - \pi)^2 + (x_2 - \pi)^2\right)\right);
search domain: −100 < xj < 100, j = 1, 2; several local minima (exact number unspecified in the usual literature); one global minimum: (x1, x2)* = (π, π); ES((x1, x2)*) = −1.

Goldstein and Price (GP) (two variables):
GP(x_1, x_2) = \left[1 + (x_1 + x_2 + 1)^2(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2)\right] \times \left[30 + (2x_1 - 3x_2)^2(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2)\right];
search domain: −2 < xj < 2, j = 1, 2; four local minima; one global minimum: (x1, x2)* = (0, −1); GP((x1, x2)*) = 3.

Shubert (SH) (two variables):
SH(x_1, x_2) = \left\{\sum_{j=1}^{5} j\cos[(j+1)x_1 + j]\right\}\left\{\sum_{j=1}^{5} j\cos[(j+1)x_2 + j]\right\};
search domain: −10 < xj < 10, j = 1, 2; 760 local minima; 18 global minima: SH((x1, x2)*) = −186.7309.

De Joung (DJ) (three variables):
DJ(x_1, x_2, x_3) = x_1^2 + x_2^2 + x_3^2;
search domain: −5.12 < xj < 5.12, j = 1, 2, 3; one single minimum (local and global): (x1, x2, x3)* = (0, 0, 0); DJ((x1, x2, x3)*) = 0.
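As a quick sanity check on the Branin RCOC definition given in the appendix above, a direct implementation in its standard form reproduces the stated minimum value 0.397887 at each of the three listed global minimizers:

```python
import math

def branin_rc(x1, x2):
    """Branin RCOC function in its standard form, matching the appendix."""
    return ((x2 - 5.1 / (4 * math.pi ** 2) * x1 ** 2
             + 5.0 / math.pi * x1 - 6.0) ** 2
            + 10.0 * (1.0 - 1.0 / (8.0 * math.pi)) * math.cos(x1) + 10.0)

# Every listed global minimizer attains the same minimum value.
for pt in [(-math.pi, 12.275), (math.pi, 2.275), (9.42478, 2.475)]:
    assert abs(branin_rc(*pt) - 0.397887) < 1e-4
```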

Hartmann (H3,4) (three variables):
H_{3,4}(x) = -\sum_{i=1}^{4} c_i \exp\left[-\sum_{j=1}^{3} a_{ij}(x_j - p_{ij})^2\right];
search domain: 0 < xj < 1, j = 1, 2, 3; four local minima: pi = (pi1, pi2, pi3) = ith local minimum approximation, f(pi) ≈ −ci; one global minimum: x* = (0.11, 0.555, 0.855); H3,4(x*) = −3.86278.

i    aij                     ci     pij
1    3.0    10.0    30.0     1.0    0.3689    0.1170    0.2673
2    0.1    10.0    35.0     1.2    0.4699    0.4387    0.7470
3    3.0    10.0    30.0     3.0    0.1091    0.8732    0.5547
4    0.1    10.0    35.0     3.2    0.0381    0.5743    0.8827
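The H3,4 coefficient table can likewise be checked directly: evaluating the function at the appendix's approximate minimizer (0.11, 0.555, 0.855) recovers the stated global minimum −3.86278 to within the rounding of the tabulated point:

```python
import math

A3 = [[3.0, 10.0, 30.0],
      [0.1, 10.0, 35.0],
      [3.0, 10.0, 30.0],
      [0.1, 10.0, 35.0]]
C3 = [1.0, 1.2, 3.0, 3.2]
P3 = [[0.3689, 0.1170, 0.2673],
      [0.4699, 0.4387, 0.7470],
      [0.1091, 0.8732, 0.5547],
      [0.0381, 0.5743, 0.8827]]

def hartmann_3_4(x):
    """H3,4 assembled from the coefficient tables above."""
    return -sum(c * math.exp(-sum(a * (xj - p) ** 2
                                  for a, xj, p in zip(arow, x, prow)))
                for arow, c, prow in zip(A3, C3, P3))
```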

Shekel (S4,n) (four variables):
S_{4,n}(x) = -\sum_{i=1}^{n} \left[(x - a_i)^T (x - a_i) + c_i\right]^{-1};  x = (x_1, x_2, x_3, x_4)^T, a_i = (a_i^1, a_i^2, a_i^3, a_i^4)^T;
three functions S4,n were considered: S4,5, S4,7 and S4,10; search domain: 0 < xj < 10, j = 1, ..., 4; n local minima (n = 5, 7 or 10): a_i^T = ith local minimum approximation, S4,n(a_i^T) ≈ −1/ci;
S4,5 (n = 5): 5 minima with one global minimum, S4,5(x*) = −10.1532;
S4,7 (n = 7): 7 minima with one global minimum, S4,7(x*) = −10.40294;
S4,10 (n = 10): 10 minima with one global minimum, S4,10(x*) = −10.53641.

i     a_i^T                      ci
1     4.0    4.0    4.0    4.0   0.1
2     1.0    1.0    1.0    1.0   0.2
3     8.0    8.0    8.0    8.0   0.2
4     6.0    6.0    6.0    6.0   0.4
5     3.0    7.0    3.0    7.0   0.4
6     2.0    9.0    2.0    9.0   0.6
7     5.0    5.0    3.0    3.0   0.3
8     8.0    1.0    8.0    1.0   0.7
9     6.0    2.0    6.0    2.0   0.5
10    7.0    3.6    7.0    3.6   0.5

Hartmann (H6,4) (six variables):
H_{6,4}(x) = -\sum_{i=1}^{4} c_i \exp\left[-\sum_{j=1}^{6} a_{ij}(x_j - p_{ij})^2\right];
search domain: 0 < xj < 1, j = 1, ..., 6; four local minima: pi = (pi1, ..., pi6) = ith local minimum approximation, f(pi) ≈ −ci; one global minimum: x* = (0.20169, 0.150011, 0.47687, 0.275332, 0.311652, 0.6573); H6,4(x*) = −3.32237.

i    aij                                              ci
1    10.0    3.00    17.0    3.50    1.70    8.00     1.0
2    0.05    10.0    17.0    0.10    8.00    14.0     1.2
3    3.00    3.50    1.70    10.0    17.0    8.00     3.0
4    17.0    8.00    0.05    10.0    0.10    14.0     3.2

i    pij
1    0.1312    0.1696    0.5569    0.0124    0.8283    0.5886
2    0.2329    0.4135    0.8307    0.3736    0.1004    0.9991
3    0.2348    0.1451    0.3522    0.2883    0.3047    0.6650
4    0.4047    0.8828    0.8732    0.5743    0.1091    0.0381

Rosenbrock (Rn) (n variables):
R_n(x) = \sum_{j=1}^{n-1} \left[100(x_j^2 - x_{j+1})^2 + (x_j - 1)^2\right];
two functions were considered: R2 and R5; search domain: −5 < xj < 10, j = 1, ..., n; several local minima; one global minimum: x* = (1, ..., 1); Rn(x*) = 0.

Zakharov (Zn) (n variables):
Z_n(x) = \sum_{j=1}^{n} x_j^2 + \left(\sum_{j=1}^{n} 0.5\, j\, x_j\right)^2 + \left(\sum_{j=1}^{n} 0.5\, j\, x_j\right)^4;
two functions were considered: Z2 and Z5; search domain: −5 < xj < 10, j = 1, ..., n; several local minima; one global minimum: x* = (0, ..., 0); Zn(x*) = 0.

Griewank (GRn) (n variables):
GR_n(x) = \sum_{i=1}^{n} \frac{x_i^2}{4000} - \prod_{i=1}^{n} \cos\left(\frac{x_i}{\sqrt{i}}\right) + 1;
two functions were considered: GR8 and GR10; search domain: −300 < xj < 600, j = 1, ..., n; several local minima; one global minimum: x* = (0, ..., 0); GRn(x*) = 0.

References

[1] D.B. Fogel, Evolutionary Computation: Toward a New Philosophy of Machine Intelligence, IEEE Press, Piscataway, NJ, 1995.
[2] X. Yao (Ed.), Evolutionary Computation: Theory and Applications, World Scientific, Singapore, 1999.
[3] D.E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, 1998.
[4] R.C. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 1995, pp. 39–43.
[5] J. Kennedy, R.C. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, Piscataway, NJ, USA, 1995, pp. 1942–1948.
[6] J. Kennedy, R.C. Eberhart, Y. Shi, Swarm Intelligence, Morgan Kaufmann, San Francisco, 2001.
[7] R. Chelouah, P. Siarry, Genetic and Nelder–Mead algorithms hybridized for a more accurate global optimization of continuous multiminima functions, Eur. J. Operat. Res. 148 (2003) 335–348.
[8] S.F. Fan, E. Zahara, Hybrid simplex search and particle swarm optimization for unconstrained optimization problems, Eur. J. Operat. Res. 181 (2007) 527–548.
[9] J. Yen, J.C. Liao, B. Lee, D. Randolph, A hybrid approach to modeling metabolic systems using a genetic algorithm and simplex method, IEEE Trans. Syst., Man, Cybernet.-Part B: Cybernet. 28 (1998) 173–191.
[10] R. Chelouah, P. Siarry, Tabu search applied to global optimization, Eur. J. Operat. Res. 123 (2000) 30–44, Special issue on combinatorial optimization.
[11] C.Z. Janikow, Z. Michalewicz, An experimental comparison of binary and floating point representation in genetic algorithms, in: Proceedings of the International Conference on Genetic Algorithms, San Diego, CA, USA, 1991, pp. 31–36.
[12] B. Brandstatter, U. Baumgartner, Particle swarm optimisation - mass-spring system analogon, IEEE Trans. Magn. 38 (2002) 997–1000.
[13] H. Yoshida, K. Kawata, Y. Fukuyama, S. Takayama, Y. Nakanishi, A particle swarm optimization for reactive power and voltage control considering voltage security assessment, IEEE Trans. Power Syst. 15 (2000) 1232–1239.
[14] R. Chelouah, P. Siarry, A continuous genetic algorithm designed for the global optimization of multimodal functions, J. Heurist. 6 (2000) 191–213.
[15] P. Siarry, G. Berthiau, F. Durbin, J. Haussy, Enhanced simulated annealing for globally minimizing functions of many continuous variables, ACM Trans. Math. Software 23 (1997) 209–228.
[16] R. Battiti, G. Tecchiolli, The continuous reactive tabu search: blending combinatorial optimization and stochastic search for global optimization, Annal. Operat. Res. 63 (1996) 153–188.
[17] D. Cvijovic, J. Klinowski, Taboo search: an approach to the multiple minima problem, Science 267 (1995) 664–666.
[18] G.L. Bilbro, W.E. Snyder, Optimization of functions with many minima, IEEE Trans. Syst., Man Cybernet. 21 (1991) 840–849.
[19] S.K. Fan, Y.C. Liang, E. Zahara, Hybrid simplex search and particle swarm optimization for the global optimization of multimodal functions, Eng. Optim. 36 (2004) 401–418.
[20] R. Chelouah, P. Siarry, A hybrid method combining continuous tabu search and Nelder–Mead simplex algorithms for the global optimization of multiminima functions, Eur. J. Operat. Res. 161 (2005) 636–654.