Applied Soft Computing 13 (2013) 2759–2766
Convergence of nomadic genetic algorithm on benchmark mathematical functions

S. Siva Sathya ∗, M.V. Radhika

Department of Computer Science, School of Engineering and Technology, Pondicherry University, Puducherry 605 014, India
Article history: Received 29 June 2011; Received in revised form 23 August 2012; Accepted 24 November 2012; Available online 17 December 2012.

Keywords: Genetic algorithm; Nomadic genetic algorithm; Rosenbrock function; Rastrigin function; Ackley function; Sphere function; Griewank function
Abstract

The nomadic genetic algorithm is a type of multi-population, migration-based genetic algorithm that gives equal importance to low-fit individuals and adaptively chooses its migration parameters. It has been applied to several real-life applications and found to perform well compared to other genetic algorithms. This paper examines the working of the nomadic genetic algorithm (NGA) on benchmark mathematical functions and compares it with the standard genetic algorithm (SGA). The prominent mathematical functions used in optimization are employed for the comparison, and the results show that NGA outperforms SGA in terms of convergence speed and better optimized values. © 2012 Elsevier B.V. All rights reserved.
1. Introduction

Genetic algorithms [1,2] are a part of evolutionary computing, a rapidly growing area of artificial intelligence. They are adaptive methods used to solve search and optimization problems, and are based on the genetic processes of biological organisms. By mimicking the principles of natural evolution, i.e. "survival of the fittest", GAs are able to evolve solutions to real world problems.

This paper describes both the standard genetic algorithm and a variant of it called the nomadic genetic algorithm, a specialized form of GA that works on the principle that "birds of a feather flock together". In the standard GA, different kinds of selection mechanisms, such as roulette wheel selection, rank-based selection and tournament selection, are employed depending on the type of application. All of these selection mechanisms aim to select high-fit individuals, in different proportions, for the purpose of mating. Low-fit individuals are given very little chance of mating, or are discarded altogether in some selection schemes, thus reducing the diversity of the population. But the worst individuals, if given a chance, may also produce better offspring in the next generation. This phenomenon is given
∗ Corresponding author. E-mail addresses: [email protected] (S. Siva Sathya), [email protected] (M.V. Radhika).
http://dx.doi.org/10.1016/j.asoc.2012.11.011
importance in this variant of the standard GA. Here, the individuals in the population are grouped into different communities, or groups, based on their fitness values. Individuals in a community mate with each other; again, different kinds of selection mechanisms could be used within the community. If an offspring comes up with a better fitness, it leaves its community and joins a different one, i.e. the group of similar fitness values. The algorithm employs most of the principles of the standard GA, except that it allows migration of individuals between the communities into which the population is grouped. The selection procedure followed in the nomadic genetic algorithm insists on mating within the same community, thus providing equal chances of mating even to the weakest section of the population. This allows the nomadic genetic algorithm to maintain the diversity of individuals in the population while also ensuring faster convergence. To demonstrate the performance of NGA, the benchmark mathematical functions have been taken up for implementation and testing. The same set of functions is implemented for the SGA, and the results are compared in terms of the number of generations required to converge and the optimized values obtained.

2. Related work

Multi-population GAs [3,4] are good at faster convergence, since each sub-population evolves independently of the others. The performance of an MGA is heavily affected by a judicious choice of parameters, such as connection topology, migration method,
migration rate, and population number [5,6]. MGAs shorten the number of generations needed to find optimal or near-optimal solutions, and are also more resistant to premature convergence. A number of multi-population genetic algorithms have been reported in the literature, and some of them are presented in this section. In [26], migration occurs after each generation and a copy of the best individual in each sub-population is sent to all its neighbors. Grosso [7] used five sub-populations that exchanged individuals at a fixed migration rate, and showed that migration greatly influenced the performance of the GA. A multi-population GA based on a chaotic migration strategy [22] employs asynchronous migration of individuals during parallel evolution and proved its immunity against premature convergence. A number of mechanisms for restricting the mating of individuals have also been proposed earlier [8]. Generally, mating is restricted among similar individuals, on the notion that similar parents produce similar offspring, which does not add diversity to the population. Booker [9] and Goldberg [23] have explored approaches in which a mating tag is added to each individual; the tags must match before a cross is permitted. Another type of mating restriction, introduced by Spears [10], adds a one-dimensional ring topology and restricts mating to neighbors with identical tags. To maintain the diversity of individuals in a population, migration has also been attempted earlier [27,28], but with parallel GAs, as in Genitor II by Whitley [30], wherein individuals migrate from one processor to another. According to Tanese [11,12], genetic algorithms that incorporate migration are reported to produce more population diversity. Even though there are several types of multi-population GAs, they have not addressed the issue of selection bias present in genetic algorithms.
Different kinds of selection mechanisms and genetic operators have been employed to guide the random adaptive procedure of a GA to explore all possible solutions. The simple GA's selection mechanism replicates higher-fitness solutions and discards lower-fitness solutions, leading to convergence of the population. For instance, Brindle [31] demonstrated the inferior performance of roulette wheel selection on several test functions, and Baker [13] analyzed various fitness-proportionate selection methods. The ultimate aim of these selection methods is achieving diversity [25], as it is considered an important goal for reaching a global optimum. Some GAs rely primarily on mutation or mutation-like mechanisms for diversification [14]. There is always a trade-off between convergence and diversity in a GA. To balance both these aspects, the nomadic genetic algorithm has been proposed, which allows beneficial search as well as controlled convergence.
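The two selection mechanisms most often contrasted above can be sketched as follows. This is an illustrative Python sketch, not the authors' code; function names and the maximisation convention are our own assumptions:

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Fitness-proportionate selection: an individual is chosen with
    probability proportional to its (positive) fitness value."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point round-off

def tournament_select(population, fitnesses, k=2):
    """Tournament selection: sample k individuals, keep the fittest.
    Low-fitness individuals rarely win, which illustrates the
    selection bias that NGA is designed to counter."""
    contestants = random.sample(range(len(population)), k)
    winner = max(contestants, key=lambda i: fitnesses[i])
    return population[winner]
```

Both operators concentrate mating opportunities on high-fitness individuals; the grouping scheme of NGA described later removes this bias.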
3. Nomadic genetic algorithm

In the previous section, a survey of migration-based GAs was presented, and the different kinds of selection mechanisms were discussed. Generally, most selection mechanisms are adept at choosing the best individuals for mating, which subsequently make it to the next generation. Low-fit individuals are given very little chance of survival, or none at all. This strategy does allow the GA to converge faster, but exploration of the entire search space is not guaranteed. The importance of low-fit individuals in a GA has been felt by some researchers, who have tried to introduce some kind of mutation, or a new individual, when the population is found to have saturated with only high-fit individuals and there is no scope for further exploration of the search space. They have resorted to several methodologies to introduce diversity into the population, which is not possible if only high-fit individuals are always selected. Apart from the problems associated with selection bias, the problems encountered with parallel GAs are also worth mentioning at this juncture. The success of any parallel GA relies heavily on the selection of the correct set of migration parameters, which include the migration rate, migration interval and connection topology. A proper selection of these parameters is crucial to the success of the GA, and many researchers have attempted several combinations of them. This paper is one such attempt, presenting and analyzing the design and implementation of the nomadic genetic algorithm, which falls under the class of multi-population GAs.
3.1. Nomadic genetic algorithm

The nomadic genetic algorithm [15,16] is a form of multi-population GA with migration capabilities, wherein the shortcomings of selection mechanisms are removed and the virtues of parallel GAs with migration are incorporated. The concept emerged from the observation that "birds of a feather flock together": in the real world, living beings live and mingle within their own community, and only seldom do they intermingle with other communities. This is an inherent feature of any species, and it is exploited in the nomadic genetic algorithm.
3.1.1. Methodology

Genetic algorithms begin with an initial population of individuals, or chromosomes. In the nomadic GA, the entire population is organized into several communities, or groups, based on fitness values; the number of groups is defined initially by the user, relative to the population size. It has been empirically observed that NGA performs well when the population is divided into 3 or 4 groups. For instance, if the population size is 100 and the number of groups is 4, the entire population is sorted and the first 25 individuals fall in one group, the second 25 in another, and so on. This can be compared with the societal status of an individual in the real world. Individuals within a community or group mate among themselves to produce offspring, and the general GA procedure of selection, crossover and mutation is carried out within the groups. Once the offspring are produced, they may have different fitness values, based on which they migrate to a different community or group suiting their fitness level or status. The number of individuals in a group, however, remains constant throughout the GA execution. In this way, individuals keep migrating from one group to another like nomads or wanderers, depending on their fitness level; hence the term nomadic genetic algorithm. The specialty of this algorithm is that the migration parameters, such as migration rate, migration frequency, topology and migration policy, need not be explicitly specified by the user. They are adaptive in nature: the number of individuals to migrate (the rate) depends on how many individuals have improved their fitness in the subsequent generation, and the question of where to migrate (the topology) is again taken care of by the algorithm itself. Migration occurs in almost every generation, so the migration frequency also need not be specified.
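The grouping step described above (e.g. 100 individuals sorted into 4 groups of 25) can be sketched in Python. This is a minimal illustration; the helper name and interface are our own, not from the paper:

```python
def group_population(population, fitness, num_groups=4):
    """Sort individuals by fitness (best first) and split them into
    equal-sized groups, e.g. 100 individuals -> 4 groups of 25."""
    assert len(population) % num_groups == 0, "groups must be equal-sized"
    ranked = sorted(population, key=fitness, reverse=True)
    size = len(ranked) // num_groups
    return [ranked[i * size:(i + 1) * size] for i in range(num_groups)]
```

Group 0 then holds the fittest quarter of the population and group 3 the least fit; re-running this after each generation is what makes improved offspring "migrate" upward.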
Similarly, the migration policy (how individuals replace one another) need not be specified, because individuals are not replaced here; they simply move to a different group. Fig. 1 [29] illustrates the working principle of NGA. The algorithm is given in the following section.
Fig. 1. Working principle of NGA (Groups 1–4, with selection, crossover and mutation producing offspring).
3.1.2. Algorithm

Initialization
1. Generate the initial population randomly.

Fitness evaluation
2. Evaluate the fitness of each individual.

Grouping and breeding
3. Sort the individuals in non-increasing order of their fitness values.
4. Arrange the population into groups based on the user-defined number of groups and their fitness ranges:
   a. Select individuals from each group.
   b. Apply crossover/mutation operators.
   c. Evaluate the fitness of the offspring.
   d. Add the offspring to the same group.

Migration
5. Combine all the groups into a single list.
6. Sort the list in non-increasing order of fitness and trim it to the size of the groups (the number of groups and the group size being fixed).
7. Repeat from Step 4 for the required number of generations.
8. Select the best (highest-fitness) individual.

The problems associated with other migration-based GAs, such as determining the communication topology, migration rate and migration frequency, are ruled out in NGA. Because individuals migrate to any group whenever their fitness level improves, no communication topology or migration frequency needs to be explicitly specified. Similarly, the number of individuals that migrate (the rate of migration) depends on the variation in fitness levels and need not be specified. The migration parameters of NGA are explained below.
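Steps 1–8 above can be sketched end-to-end in Python. This is a minimal, illustrative reading of the algorithm for function minimisation; the real-valued representation, operator details and all names are our own simplifications, not the authors' implementation:

```python
import random

def nga(fitness, dim, pop_size=100, num_groups=4, generations=50,
        lo=-5.12, hi=5.12, mut_rate=0.05):
    """Nomadic-GA-style loop: breed within fitness-sorted groups, then
    merge, re-sort and re-partition so that improved offspring
    'migrate' to the group matching their new fitness level."""
    group_size = pop_size // num_groups
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    pop.sort(key=fitness)                        # Step 3: best (lowest) first
    for _ in range(generations):
        groups = [pop[g * group_size:(g + 1) * group_size]
                  for g in range(num_groups)]
        offspring = []
        for grp in groups:                       # Step 4: breed within each group
            for _ in range(group_size):
                a, b = random.sample(grp, 2)     # 4a: parents from the same group
                cut = random.randrange(1, dim) if dim > 1 else 0
                child = a[:cut] + b[cut:]        # 4b: single-point crossover
                child = [g + random.gauss(0, 0.1) if random.random() < mut_rate
                         else g for g in child]  # 4b: per-gene mutation
                offspring.append(child)          # 4c/4d: evaluated lazily below
        # Steps 5-6: combine, sort, trim back to the fixed group sizes
        pop = sorted(pop + offspring, key=fitness)[:pop_size]
    return pop[0]                                # Step 8: best individual
```

Note how no migration rate, interval or topology appears anywhere: the merge-sort-repartition step realises the migration implicitly, as the text describes.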
3.1.3. Migration parameters of NGA

Migration parameters are part and parcel of any MGA, and the differences between various MGAs basically stem from differences in these migration parameters. The migration parameters of NGA take the following forms.

• Connection topology: individuals adaptively migrate to any group whose fitness values are similar to their own. Being adaptive in nature, no explicit connection topology needs to be specified.
• Migration method: whether migration takes place with adjacent sub-populations or far-off ones need not be specified; it is adaptive and asynchronous in nature.
• Migration interval: the migration interval is not specified explicitly; migration occurs whenever there is a change in the fitness values of individuals in the sub-populations, which happens in almost every generation with almost every group.
• Migration rate: this depends on the number of individuals that have improved their fitness, and it is uniform between the groups. Since the group sizes are fixed, if one high-fit individual from a low-fit group enters a sub-population, one individual from the current group has to leave for some other group according to its fitness level.
• Population number: the number of sub-populations and the number of individuals in each can be user-specified, and do not change during the course of the run.
• Migration policy: this is not explicitly stated as in other MGAs. It is neither the worst-fit nor the best-fit individual that is targeted for replacement; in NGA, the individual that stands out as the odd man out in a sub-population migrates on its own to some other group. As long as an individual's fitness remains within the fitness range of the group in which it resides, there is no need for migration or replacement.

3.1.4. Advantages of NGA

From the algorithm, it can be observed that NGA solves many problems.
Firstly, the overhead of selection is reduced, because there is no bias in the selection procedure. All individuals in the population are given a fair chance to mate, survive and improve, albeit within their own community. It is not necessarily the case that two worst-fit individuals will only produce a worst-fit offspring; they may also produce better offspring capable of migrating to a better group. Similarly, some best-fit individuals may migrate to low-fit groups, but this condition is observed to occur very rarely and does not affect the performance of NGA [29]. Secondly, NGA resembles a parallel or multi-population GA in some respects, because the entire population is divided into groups or communities. This generally helps the GA converge faster, since the communities evolve independently. Also, since all the migration parameters are implicit, the user is totally relieved of the burden of selecting them. Thirdly, NGA lends itself easily to parallelism, in the sense that it can be distributed across multiple processors: if the population size is very large or the number of groups is high, the communities can evolve independently on different processors, with migration effected between them if required.

4. Benchmark mathematical functions

In computational science, optimization refers to finding the best solution to a given problem or application. The objective chosen in this paper is a set of mathematical functions generally termed "benchmark problems". These benchmark problems are generally used for evaluating the performance of
Table 1 Comparison of the optimized values for Rosenbrock function.

Sl. No.  NGA optimized values  SGA optimized values
1        0.0002134             0.009123
2        0.001245              0.049379
3        0.018764              0.09055
4        0.0032189             0.06192
5        0.003874              0.06925
6        0.037534              0.066817
7        0.017543              0.114019
8        0.0065423             0.00342
9        0.018754              0.013527
10       0.005432              0.007492
11       0.014568              0.186675
12       0.0175432             0.152111
13       0.021567              0.11555
14       0.054874              0.123693
15       0.027638              0.00484
16       0.0075489             0.066956
17       0.0017862             0.146483
18       0.000876              7.48E−04
19       0.042134              0.089837
20       0.0043215             0.220804
21       0.005685              0.004342
22       0.0027654             0.098838
23       0.0051276             0.087896
24       0.054375              0.037237
25       0.004732              0.164049

Table 2 Comparison of the optimized values for Rastrigin function.

Sl. No.  NGA optimized values  SGA optimized values
1        0.01298               0.16364
2        0.002341              0.03052
3        0.004231              0.45936
4        0.04398               0.080267
5        0.007456              0.07014
6        0.021349              0.058291
7        0.029874              0.351215
8        0.001987              0.055085
9        0.007432              0.351099
10       0.007312              0.523064
11       0.018743              0.255688
12       0.074321              0.08287
13       0.054386              0.328982
14       0.074326              0.42012
15       0.06549               0.201422
16       0.031875              0.17563
17       0.056387              0.07284
18       0.015324              0.21958
19       0.062854              0.048972
20       0.038752              0.08647
21       0.056438              0.08825
22       0.048765              0.042713
23       0.067895              0.25969
24       0.085423              0.017982
25       0.075439              0.054327
Graph 1. Comparison of NGA and SGA for Rosenbrock function.
Graph 2. Comparison of NGA and SGA for Rastrigin function.
any evolutionary algorithm. Generally, for function optimization, a large test set is considered. When evaluating an algorithm, the most important prerequisite is to identify the problems on which its performance should be judged; this helps formulate the test set against which the algorithm is evaluated. The test set designed by Eiben and Bäck is the most adequate; it contains several well-characterized functions that allow us to obtain and generalize, as far as possible, results regarding the kind of function involved. A function is multimodal if it has two or more local optima [17]. A function of p variables is separable if it can be rewritten as a sum of p functions of just one variable; separability is closely related to the concept of epistasis, or the interrelation among the variables of the function [17]. Non-separable functions are more difficult to optimize, as the accurate search direction depends on two or more genes; separable functions, on the other hand, can be optimized for each variable in turn. The optimization problem is more difficult if the function is also multimodal, since the search process must be able to avoid the regions around local minima in order to approximate the global optimum as closely as possible. The most complex case appears when the local optima are randomly distributed in the search space. This is most apparent for Rastrigin's function, where the local optima are scattered throughout the search space and it is not uncommon to mistake a local optimum for the global optimum.
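Separability can be made concrete with a small sketch: a separable function such as Sphere can be driven to its optimum one coordinate at a time, which is exactly the strategy that fails on non-separable functions such as Rosenbrock. This is illustrative code with our own names, not part of the paper's experiments:

```python
import numpy as np

def sphere(x):
    """Separable: a sum of one-variable functions x_i^2."""
    return np.sum(np.asarray(x, dtype=float) ** 2)

def coordinatewise_min(f, dim, lo, hi, steps=2001):
    """Crude coordinate-wise line search: optimize each variable in
    turn while holding the others fixed. Succeeds on separable
    functions; on non-separable ones the search direction for one
    variable depends on the others, so a single sweep can stall."""
    best = np.full(dim, float(hi))          # deliberately bad start
    grid = np.linspace(lo, hi, steps)
    for i in range(dim):
        vals = []
        for g in grid:
            trial = best.copy()
            trial[i] = g
            vals.append(f(trial))
        best[i] = grid[int(np.argmin(vals))]
    return best
```

One sweep over the coordinates already reaches the global minimum of Sphere, because each x_i can be minimised independently.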
The dimensionality of the search space is another important factor in the complexity of the problem. A study of the dimensionality problem and its features was carried out by Friedman (Reference). The following is the function set taken up for experimentation.

• Rosenbrock function
The Rosenbrock function, or De Jong's function F2, is a two-dimensional function with a deep parabola-shaped valley that leads to the global minimum. It is defined by:

f(x, y) = (1 − x)^2 + 100(y − x^2)^2

The function has several local optima, and the global minimum is at (1, 1). Due to the non-linearity of the valley, many algorithms converge slowly because they change the direction of the search repeatedly [18]. The generalized function definition and associated criteria are as follows:

f_Ros(x) = sum_{i=1}^{P−1} [100(x_{i+1} − x_i^2)^2 + (x_i − 1)^2]
x_i ∈ [−2.048, 2.048]
x* = (1, 1, . . ., 1)
f_Ros(x*) = 0
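The five test functions used in this paper all have standard closed forms. A minimal NumPy sketch, using the textbook definitions that match the ranges quoted in this section (our own code, not the authors'), is:

```python
import numpy as np

def rosenbrock(x):   # min 0 at (1, ..., 1); x_i in [-2.048, 2.048]
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def rastrigin(x):    # min 0 at the origin; x_i in [-5.12, 5.12]
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2 * np.pi * x))

def ackley(x):       # min 0 at the origin; x_i in [-30, 30]
    x = np.asarray(x, dtype=float)
    n = x.size
    return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20.0 + np.e)

def griewank(x):     # min 0 at the origin; x_i in [-600, 600]
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return 1.0 + np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))

def sphere(x):       # min 0 at the origin; x_i in [-5.12, 5.12]
    x = np.asarray(x, dtype=float)
    return np.sum(x ** 2)
```

Each function evaluates to 0 at its stated global optimum, which is the convergence target reported in the tables of this paper.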
Table 3 Comparison of the optimized values for Ackley function.

Sl. No.  NGA       SGA
1        0.015645  0.179913
2        0.011237  0.04752
3        0.007898  0.150971
4        0.043279  0.068327
5        0.001894  0.383972
6        0.004113  0.019086
7        0.036674  0.098257
8        0.012983  0.247994
9        0.024219  0.078915
10       0.003156  0.327587
11       0.005923  0.175041
12       0.002295  0.032309
13       0.004567  0.181213
14       0.009543  0.058138
15       0.012235  0.194601
16       0.003189  0.126796
17       0.013459  0.151465
18       0.014256  0.035048
19       0.005692  0.273321
20       0.007682  0.041724
21       0.017523  0.058182
22       0.009842  0.36952
23       0.004275  0.131497
24       0.005678  0.231534
25       0.006323  0.309947

Table 4 Comparison of the optimized values for sphere function.

Sl. No.  NGA       SGA
1        0.018765  0.082735
2        0.007563  0.067021
3        0.003215  0.086711
4        0.056893  0.075335
5        0.007256  0.055219
6        0.012853  0.17331
7        0.023212  0.17258
8        0.056532  0.111225
9        0.003278  0.077437
10       0.034856  0.075423
11       0.016543  0.096698
12       0.039843  0.079321
13       0.058732  0.062159
14       0.017843  0.088516
15       0.062145  0.14532
16       0.042165  0.08169
17       0.037589  0.051837
18       0.00658   0.094121
19       0.043221  0.059279
20       0.006265  0.081686
21       0.048765  0.085696
22       0.007526  0.069885
23       0.016543  0.083148
24       0.021345  0.047981
25       0.047654  0.094795
Graph 3. Comparison of NGA and SGA for Ackley function.
Graph 4. Comparison of NGA and SGA for Sphere function.
It is not multimodal or separable.

• Rastrigin function
The Rastrigin function was constructed from the Sphere function by adding a modulator term α·cos(2πx_i). Its contour is made up of a large number of local minima whose values increase with the distance to the global minimum [19]. The function definition is as follows:

f_Ras(x) = 10p + sum_{i=1}^{p} (x_i^2 − 10 cos(2πx_i))
x_i ∈ [−5.12, 5.12]
x* = (0, 0, . . ., 0)
f_Ras(x*) = 0

It is both multimodal and separable.

• Ackley function
The Ackley function, originally proposed by Ackley and generalized by Bäck, has an exponential term that covers its surface with numerous local minima. The complexity of this function is moderate. An algorithm that only uses gradient steepest descent will be trapped in local optima, but any search strategy that analyzes a wider region will be able to cross the valleys among the optima and achieve better results [20]. The function definition is as follows:

f_Ack(x) = −20 exp(−0.2 sqrt((1/n) sum_{i=1}^{n} x_i^2)) − exp((1/n) sum_{i=1}^{n} cos(2πx_i)) + 20 + e
x_i ∈ [−30, 30]
x* = (0, 0, . . ., 0)
f_Ack(x*) = 0
The function is multimodal but not separable.

• Griewank function
The Griewank function has a product term that introduces interdependence among the variables; the aim is to cause the failure of techniques that optimize each variable independently. As in
Table 5 Comparison of the optimized values for Griewank function.

Sl. No.  NGA       SGA
1        1.17E−04  0.00101
2        2.23E−04  5.46E−04
3        1.09E−05  0.0011718
4        4.42E−05  1.68E−04
5        5.76E−04  3.57E−04
6        2.15E−05  6.33E−04
7        4.70E−04  2.39E−03
8        2.13E−05  4.01E−04
9        3.67E−05  7.50E−04
10       4.92E−04  9.39E−04
11       5.76E−04  9.81E−04
12       9.99E−04  5.77E−03
13       9.18E−04  8.37E−04
14       3.22E−04  2.04E−03
15       2.10E−05  1.67E−03
16       1.99E−04  3.47E−04
17       5.18E−05  6.35E−04
18       4.78E−05  5.61E−03
19       6.79E−04  9.56E−06
20       2.98E−05  4.86E−04
21       8.78E−04  6.14E−04
22       1.04E−04  3.15E−03
23       2.97E−04  5.44E−04
24       3.84E−05  8.85E−03
25       7.84E−04  1.16E−03

Table 7 Comparison of NGA and SGA for the Rastrigin function.

Sl. No.  SGA generation  SGA fitness  NGA generation  NGA fitness
1        63              0.067251     43              0.034721
2        51              0.95558      36              0.78881
3        55              0.06085      40              0.050199
4        62              0.2231       47              0.046122
5        51              0.11675      36              0.11312
6        51              0.029576     36              0.01575
7        67              0.014334     37              0.00146
8        51              0.2231       36              0.01244
9        51              0.17461      36              0.15761
10       76              0.026086     61              0.022196
11       78              0.08339      63              0.03389
12       51              0.26908      36              0.006908
13       53              0.13605      38              0.01061
14       52              2.02         37              0.10243
15       56              1.0476       41              0.05986
16       51              0.15257      36              0.17187
17       53              0.18555      38              0.12743
18       51              0.2645       36              0.13712
19       51              1.0027       36              0.01346
20       69              0.054716     54              0.00123
21       60              0.19677      45              0.18687
22       54              0.80545      39              0.05154
23       51              1.1232       36              0.01241
24       51              0.39594      36              0.03987
25       58              0.3219       43              0.13217
Graph 5. Comparison of NGA and SGA for Griewank function.
Ackley function, the optima of the Griewank function are regularly distributed [24]. The function definition is as follows:

f_Gri(x) = 1 + sum_{i=1}^{p} x_i^2/4000 − prod_{i=1}^{p} cos(x_i/sqrt(i))
x_i ∈ [−600, 600]
x* = (0, 0, . . ., 0)
f_Gri(x*) = 0

The function is multimodal but not separable.

• Sphere function
Table 6 Comparison of NGA and SGA for the Rosenbrock function.

Sl. No.  SGA generation  SGA fitness  NGA generation  NGA fitness
1        61              0.012154     40              0.013454
2        51              0.0626       31              0.00536
3        57              0.017503     37              0.019603
4        52              0.015139     42              0.017539
5        51              0.021632     32              0.021632
6        51              0.049762     31              0.039762
7        51              0.019929     31              0.001893
8        51              0.000824     31              0.000724
9        51              0.080972     31              0.091972
10       61              0.001117     31              0.002217
11       51              0.16512      42              0.017651
12       51              0.000819     31              0.000772
13       51              0.078724     31              0.008872
14       51              0.048264     31              0.005826
15       51              0.026715     31              0.036715
16       51              0.001849     31              0.000205
17       54              3.08E−05     31              4.18E−05
18       51              8.17E−05     35              7.17E−05
19       52              0.000263     31              0.000364
20       52              0.000419     33              0.000429
21       52              0.000532     33              0.000632
22       51              4.67E−05     33              3.77E−05
23       51              0.000256     31              0.000376
24       51              0.001107     31              0.000109
25       51              0.00053      31              0.00093

Table 8 Comparison of NGA and SGA for the Ackley function.

Sl. No.  SGA generation  SGA fitness  NGA generation  NGA fitness
1        63              0.012586     48              0.022896
2        57              0.08262      42              0.09372
3        52              0.073956     37              0.084999
4        51              0.051445     36              0.04451
5        51              0.031541     36              0.033451
6        51              0.064012     36              0.075112
7        51              0.045087     36              0.0555197
8        51              0.025622     36              0.016632
9        51              0.093632     36              0.084733
10       51              0.12207      36              0.11407
11       56              0.023611     41              0.033755
12       51              0.006259     36              0.007389
13       54              0.021099     39              0.0011379
14       60              0.009219     45              0.008326
15       52              0.007707     37              0.001125
16       51              0.007965     36              0.007713
17       51              0.011134     36              0.011134
18       51              0.020981     36              0.011551
19       51              0.017188     36              0.022233
20       51              0.007464     36              0.003465
21       51              0.008192     36              0.001192
22       51              0.011691     36              0.022731
23       65              0.006084     50              0.005132
24       51              0.016775     36              0.024675
25       57              0.007995     42              0.008757
Table 9 Comparison of NGA and SGA for the Sphere function.

Sl. No.  SGA generation  SGA fitness  NGA generation  NGA fitness
1        138             0.02845      98              0.029551
2        192             0.19262      101             0.28372
3        99              0.84323      99              0.44323
4        144             0.30603      95              0.00603
5        113             0.70097      95              0.10017
6        170             0.12463      103             0.11363
7        138             0.045245     99              0.11245
8        154             0.28913      110             0.11923
9        189             0.26467      120             0.44117
10       144             0.28845      112             0.19914
11       103             0.51775      93              0.11223
12       163             0.26931      113             0.12312
13       135             0.2931       102             0.54211
14       132             0.35836      103             0.11412
15       143             0.40954      123             0.11124
16       198             0.22764      78              0.22112
17       91              0.30771      111             0.00771
18       169             0.22824      110             0.011524
19       172             0.31156      103             0.11156
20       138             0.24818      118             0.01818
21       180             0.16889      120             0.04289
22       220             0.73056      121             0.03155
23       140             0.16256      92              0.26236
24       107             0.63189      99              0.11819
25       114             0.48877      –               0.08877

The Sphere function has been used in the development of the theory of evolutionary strategies and in the evaluation of genetic algorithms as part of the test set proposed by De Jong. Sphere, or De Jong's function [21], is a simple and strongly convex function. The function definition is as follows:

f_Sph(x) = sum_{i=1}^{p} x_i^2
x_i ∈ [−5.12, 5.12]
x* = (0, 0, . . ., 0)
f_Sph(x*) = 0

The function is not multimodal but it is separable [21].
Table 10 Comparison of NGA and SGA for the Griewank function.

Sl. No.  SGA generation  SGA fitness  NGA generation  NGA fitness
1        51              1.060E−05    39              1.70E−05
2        51              2.22E−05     35              2.32E−05
3        51              1.31E−06     36              1.21E−06
4        51              2.09E−05     38              2.10E−05
5        53              1.90E−05     43              1.80E−05
6        52              1.56E−05     40              1.66E−05
7        55              3.91E−05     37              3.81E−05
8        54              1.47E−05     34              1.49E−05
9        53              1.99E−05     38              1.99E−05
10       58              9.58E−05     37              9.78E−05
11       51              1.46E−05     36              1.48E−05
12       57              2.98E−06     35              2.99E−06
13       55              3.50E−05     34              3.55E−05
14       58              3.40E−06     39              3.50E−06
15       53              1.48E−05     35              1.49E−05
16       55              6.45E−05     38              6.46E−05
17       56              4.14E−05     37              4.16E−05
18       51              2.64E−03     37              2.65E−03
19       53              1.25E−05     35              1.26E−03
20       51              1.93E−05     36              1.95E−05
21       53              1.18E−04     38              1.19E−04
22       51              1.25E−06     37              1.31E−06
23       55              4.46E−05     36              4.24E−05
24       51              1.20E−05     35              1.99E−08
25       51              1.61E−08     37              1.21E−05
5. Experimental setup

The chromosome representation used is a double vector. The GA parameter settings are as follows:

No. of chromosomes: 100
Elitism (%): 2
Mutation (%): 5
Crossover: Single point crossover
Selection operator: Tournament
No. of groups in NGA: 4 (4 × 25)
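The operator settings listed above can be read as the following illustrative Python sketch. The helper names are our own, and the operators are generic textbook versions with the paper's rates plugged in, not the authors' MATLAB/Java code:

```python
import random

ELITISM = 0.02    # 2% of the population carried over unchanged
MUT_RATE = 0.05   # 5% per-gene mutation probability

def tournament(pop, fitness, k=2):
    """Tournament selection for minimisation: sample k chromosomes,
    keep the one with the lowest objective value."""
    return min(random.sample(pop, k), key=fitness)

def single_point_crossover(a, b):
    """Single-point crossover on two equal-length chromosomes."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(chrom, lo, hi):
    """Per-gene reset mutation on a real-valued ('double vector')
    chromosome: each gene is re-drawn with probability MUT_RATE."""
    return [random.uniform(lo, hi) if random.random() < MUT_RATE else g
            for g in chrom]
```

With a population of 100 and 4 groups, the NGA run applies these operators inside each group of 25 chromosomes per generation.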
The platforms used for the simulations are MATLAB and Java: MATLAB for the standard genetic algorithm and Java for the nomadic genetic algorithm. Each mathematical function was subjected to 25 consecutive executions in both SGA and NGA, and the minimum value and the number of generations taken for convergence were noted. The experiment was conducted for both a constant and a variable number of generations, subject to the stopping criteria.

6. Comparison of NGA and SGA for benchmark problems

The experiments were carried out over 25 consecutive executions. For each table, the corresponding graphs are shown in the accompanying figures, which indicate the performance of NGA over SGA in terms of fitness values and the number of generations to converge (Tables 1–10). The same set of functions was implemented for a variable number of generations; the optimized values and the number of generations taken in 25 consecutive iterations were recorded, and the corresponding plots for SGA and NGA were obtained. The plots give information on the number of generations required for convergence and the optimized value obtained in each iteration. On implementing the above mathematical functions, it is quite clear that NGA converged to a better minimized value than SGA with a smaller number of generations. The different functions, owing to their independent natures, behaved differently in terms of the number of generations and the function values. The test set was chosen in such a way that the efficiency of the algorithm is well depicted by the experiment. The plot of the Rosenbrock function shows that NGA values are more clustered near the global optimum than the SGA values; moreover, there are high peaks and low valleys in the plot for SGA which NGA did not exhibit. The Rastrigin function, which belongs to the same genre, exhibited the same characteristics; NGA was able to converge to its optimum much more efficiently for this particular function.
The Ackley function showed an almost smooth, regular plot for NGA, the best-behaved of the functions tested. The Sphere function gave a much better optimized value in NGA than in SGA; in SGA it failed to converge towards the global optimum beyond a certain value. On a relative basis, it can be stated clearly that NGA outperformed SGA. The Griewank function converged efficiently for SGA but exhibited random variation in a few iterations, whereas NGA converged uniformly to a better optimized value with an almost constant number of generations in each run; except for one peak, its plot showed efficient convergence to the global optimum.

7. Conclusion

The results clearly show that the migration-based nomadic genetic algorithm performs very well compared to the standard GA in terms of both the minimized objective function value and the number of generations to converge. The minimization of other benchmark mathematical functions is currently
under study. NGA offers a well-synchronized method for solving theoretical and practical problems, with the virtue of giving importance to low-fit individuals.

References

[1] D.E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, ninth ed., Pearson Education, 2005.
[2] J.H. Holland, Adaptation in Natural and Artificial Systems, MIT Press, Cambridge, MA, 1992.
[3] K. Kojima, M. Ishigame, G. Chakraborthy, H. Hatsuo, S. Makino, Asynchronous parallel distributed genetic algorithm with elite migration, International Journal of Computational Intelligence 4 (2) (2008) 105–111.
[4] M. Munetomo, Y. Takai, Y. Sato, An efficient migration scheme for subpopulation-based asynchronously parallel genetic algorithms, in: Proceedings of the Fifth International Conference on Genetic Algorithms, 1993, p. 649.
[5] W.-Y. Lin, T.-P. Hong, On adapting migration parameters for multi-population genetic algorithms, in: IEEE International Conference on Systems, Man and Cybernetics, 2004, pp. 5731–5735.
[6] E. Cantu-Paz, Migration policies and takeover times in parallel genetic algorithms, in: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO'99), 1999, pp. 775–785.
[7] P.B. Grosso, Computer simulations of genetic adaptation: parallel subcomponent interaction in a multilocus model, Doctoral Dissertation, The University of Michigan, 1985 (University Microfilms No. 8520908).
[8] M. Gorges-Schleuter, Comparison of local mating strategies in massively parallel genetic algorithms, in: R. Manner, B. Manderick (Eds.), Parallel Problem Solving from Nature, vol. 2, Elsevier, Amsterdam, 1992, pp. 553–562.
[9] L.B. Booker, Intelligent behavior as an adaptation to the task environment, Doctoral Dissertation, University of Michigan, 1982.
[10] W.M. Spears, Simple subpopulation schemes, in: Proceedings of the Third Annual Conference on Evolutionary Programming, 1994, pp. 296–307.
[11] R. Tanese, Distributed genetic algorithms, in: Proceedings of the Third International Conference on Genetic Algorithms, 1989, pp. 434–439.
[12] R. Tanese, Parallel genetic algorithm for a hypercube, in: Proceedings of the Second International Conference on Genetic Algorithms and Their Applications, 1987, pp. 177–183.
[13] J.E. Baker, Reducing bias and inefficiency in the selection algorithm, in: Proceedings of the Second International Conference on Genetic Algorithms and Their Applications, 1987, pp. 14–21.
[14] S.W. Mahfoud, Niching methods for genetic algorithms, Ph.D. Thesis, University of Illinois at Urbana-Champaign, 1995.
[15] S. Siva Sathya, S. Kuppuswami, K. Rajashekar, Nomadic genetic algorithm for course timetabling problem, in: Proceedings of the International Conference on Science, Technology and Management (CISTM 07), Hyderabad, India, 2007.
[16] S. Siva Sathya, S. Kuppuswami, K. Syam Babu, Nomadic genetic algorithm for multiple sequence alignment, International Journal of Adaptive and Innovative Systems 1 (1) (2009) 44–59.
[17] T. Back, Evolutionary Algorithms in Theory and Practice, Oxford University Press, 1996.
[18] H.H. Rosenbrock, An automatic method for finding the greatest or least value of a function, The Computer Journal 3 (1960) 175–184.
[19] L.A. Rastrigin, Extremal Control Systems, Theoretical Foundations of Engineering Cybernetics Series, Nauka, Moscow, 1974 (in Russian).
[20] D. Ackley, An empirical study of bit vector function optimization, in: Genetic Algorithms and Simulated Annealing, 1987, pp. 170–215.
[21] K.A. De Jong, An analysis of the behavior of a class of genetic adaptive systems, Ph.D. Thesis, Department of Computer and Communication Sciences, University of Michigan, Ann Arbor, 1975.
[22] X. Chen, W. Gui, L. Chen, A multipopulation GA based on chaotic migration strategy and its application to inventory programming, in: Proceedings of the 5th World Congress on Intelligent Control and Automation, 2004, pp. 2159–2162.
[23] D.E. Goldberg, K. Deb, A comparative analysis of selection schemes used in genetic algorithms, in: G.J.E. Rawlins (Ed.), Foundations of Genetic Algorithms, Morgan Kaufmann, Los Altos, 1991, pp. 69–93.
[24] A.O. Griewank, Generalized descent for global optimization, Journal of Optimization Theory and Applications 34 (1981) 11–39.
[25] C.K. Oei, D.E. Goldberg, S.J. Chang, Tournament Selection, Niching and the Preservation of Diversity, IlliGAL Report, Illinois Genetic Algorithms Laboratory, University of Illinois, 1991.
[26] C.B. Pettey, M.R. Leuze, J.J. Grefenstette, A parallel genetic algorithm, in: Proceedings of the Second International Conference on Genetic Algorithms, 1987, pp. 155–161.
[27] D. Power, C. Ryan, M.A. Azad, Promoting diversity using migration strategies in distributed genetic algorithms, in: Proceedings of the IEEE Congress on Evolutionary Computation, vol. 1, 2005, pp. 1831–1838.
[28] M. Rebaudengo, M.S. Reorda, An experimental analysis of the effects of migration in parallel genetic algorithms, in: Proceedings of the Euromicro Workshop on Parallel and Distributed Processing, 1993, pp. 232–238.
[29] S. Siva Sathya, S. Kuppuswami, Analyzing the migration effects in nomadic genetic algorithm, International Journal of Adaptive and Innovative Systems 1 (2) (2010) 158–170.
[30] D. Whitley, J. Kauth, GENITOR: a different genetic algorithm, in: Proceedings of the Rocky Mountain Conference on Artificial Intelligence, 1988, pp. 118–130.
[31] A. Brindle, Genetic Algorithms for Function Optimization, Doctoral Dissertation and Technical Report TR81-2, University of Alberta, Edmonton, Canada, 1981.