Available online at www.sciencedirect.com
ScienceDirect
Procedia Computer Science 126 (2018) 146–155
www.elsevier.com/locate/procedia

22nd International Conference on Knowledge-Based and Intelligent Information & Engineering Systems

Evolutionary Optimization Based on Biological Evolution in Plants

Neeraj Gupta a,∗, Mahdi Khosravy b, Nilesh Patel a, Ishwar Sethi a

a Oakland University, MI, USA
b Federal University of Juiz de Fora, MG, Brazil
Abstract

This paper presents a binary coded evolutionary computational method inspired from the evolution in plant genetics. It introduces the concept of artificial DNA, which is an abstract idea inspired from the inheritance of characteristics in plant genetics through transmitting dominant and recessive genes and Epimutation. It involves a rehabilitation process which, similar to plant biology, provides a further evolving mechanism against environmental mutation, enabling continued improvement. The effectiveness, consistency, and efficiency of the proposed optimizer have been demonstrated through a variety of complex benchmark test functions. Simulation results and associated analysis of the proposed optimizer in comparison to Self-learning Particle Swarm Optimization (SLPSO), Shuffled Frog Leaping Algorithm (SFLA), Multi-Species hybrid Genetic Algorithm (MSGA), Gravitational Search Algorithm (GSA), Group Search Optimization (GSO), Cuckoo Search (CS), Probabilistic Bee Algorithm (PBA), and Hybrid Differential PSO (HDSO) approve its applicability in solving complex problems. In this paper, we have shown effective results on thirty-variable benchmark test problems of different classes.

© 2018 The Authors. Published by Elsevier Ltd.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/)
Selection and peer-review under responsibility of KES International.

Keywords: Evolutionary theory; plant genetics; binary coded optimizer; multi species; artificial DNA.
1. Introduction

Optimization plays an important role in achieving accuracy and increasing the efficiency of systems. Researchers from a variety of fields have proposed many meta-heuristic single and multiple population-based Evolutionary Algorithms (EAs) [1] under the classification of real and binary coded systems, e.g., Genetic Algorithm (GA) [2], Simulated Annealing (SA) [3], Memetic Optimization Algorithm (MOA) [4], Tabu Search (TS) [5], Ant Colony Optimization (ACO) [6], PSO [7], etc. In population-based EAs, biologically inspired operations have been successfully and effectively applied, and owing to their success, meta-heuristic search methods are widely accepted [8].
∗ Corresponding author. Tel.: +1-248-759-0414.
E-mail address: [email protected]
1877-0509 © 2018 The Authors. Published by Elsevier Ltd. 10.1016/j.procs.2018.07.218
However, all meta-heuristic EAs do not work well on all classes of functions and show limitations on some of them [9], e.g., functions with strong multi-modality. The first use of population-based optimizers, i.e., GA, appeared in the work of John Holland [10], who later suggested the population-based binary search concept. He also presented the phenomena of the adaptation mechanism and natural selection, which have been improved upon and followed by many others. During the nineties, many innovative methods and their variants were suggested, including Particle Swarm Optimization (PSO) variants [7, 11] and the variants of Differential Evolution (DE) [12]. The computational power of evolutionary algorithms drew the attention of researchers in the last two decades and led to the invention of the Cultural Algorithm (CA) [13], Harmony Search (HS) [14], Artificial Bee Colony (ABC) [15], Fish Swarm Optimization (FSO) [16], Shuffled Frog Leaping Algorithm (SFLA) [17], Biogeography-Based Optimization [18], Cuckoo Search (CS) [19], Bat Algorithm (BA) [20], Group Search Optimization (GSO) [21], etc. Details of these algorithms can be found in survey papers and recent books [1, 2, 8, 9]. Further enhancement of the computational power of EAs [8, 11] has remained a challenging target of recent nature-inspired evolutionary advances. The survey papers and recent books also comprehensively collect information about their binary variants [1, 2, 8, 11].

After studying many books and papers on the development of evolutionary algorithms, a question arises: why is yet another algorithm design required? After applying optimization algorithms to different types of functions, researchers came across the "No Free Lunch" theorem [9], according to which, if one algorithm performs best on one class of problems, it pays the cost on others. As a result, finding a more intelligent algorithm has been a topic of interest for many years. Many books and papers have been published criticizing and supporting these types of algorithms. According to these works, binary coded algorithms remain necessary because they can be quickly realized in hardware due to the friendly nature of binary states, and many electronic and control systems can be designed by handling discrete and binary variables. With this motivation, we focus here on designing a new binary coded evolutionary optimizer based on Mendel's theory of evolution, inspired by plant biological breeding, where nature optimizes the heredity characteristics through transmitting recessive characteristics as dominating ones. Following Mendelian theory, the proposed optimizer is a binary coded multi-species optimization algorithm which deploys five operations for the evolution: flipping, pollination, breeding, discrimination, and epimutation [24, 25, 26, 27].

The proposed algorithm is evaluated by testing on different classes of benchmark test functions as presented in CEC [29, 30, 31], in comparison to recent optimizers, i.e., Self-learning PSO (SLPSO) [11], SFLA, Multi-Species Hybrid Genetic Algorithm (MSGA), Gravitational Search Algorithm (GSA), GSO, CS, Probabilistic Bee Algorithm (PBA), and Hybrid Differential Swarm Optimization (HDSO) [32]. These algorithms are population-based techniques and use the benefits of swarm intelligence, memetic information exchange, evolution in genetics, speciation, and the laws of physics. MSGA uses hybridization of three different selection schemes (roulette wheel, tournament, and uniform) on a random basis in each evolution epoch.
In a similar fashion, it hybridizes three crossover techniques, i.e., single-point, two-point, and uniform. HDSO utilizes a hybrid combination of bijective differential PSO and surjective differential PSO; the details can be found in [32]. The other optimizers have been used in their original form as described in [11, 17, 22, 21, 19].

The proposed optimizer has the following distinctive features: i) As a binary coded optimizer, it investigates the transformed genome search space instead of the real one. ii) The proposed algorithm exchanges information inherent in the genetic characteristics between different species. iii) Its operation via double-strand artificial DNA provides an appropriate genetic representation of the biological organism. iv) It deploys the natural breeding process of fusing opposite strand chromosomes of DNA, where, for the production of F1 generation offspring, the antisense (AS) strand of one parent DNA from a species fertilizes with the sense (S) strand of the mating parent DNA from the other species, or vice-versa. v) The proposed optimizer utilizes two sequential breedings, cross-pollination and self-pollination, in an evolution phase instead of a single crossover, as is usual in plant genetics. vi) Through the use of memory in the proposed optimizer, the best genes are preserved by the biological microorganism, which is used to evolve the species characteristics as recessive gene transfer to descendants. vii) It internally possesses a self-organizing process via Epimutation, which is the combination of mutation and rehabilitation; it resists inappropriate changes in plants and provides a better evolution.

The structure of the paper is organized as follows. Section II presents the modeling of the biologically inspired operators and the proposed evolutionary algorithm. Section III demonstrates the experimentation and simulation results. Section IV ranks the optimizers based on their performance. Section V offers the future research scope. Finally, the conclusion is derived based on the comparative results with the other optimizers.
Fig. 1: (a) Anti-parallel DNA structure and formation of offspring DNA. (b) The epimutation process.
2. Description of the proposed biologically inspired operators for the evolutionary algorithm

The proposed evolutionary optimizer (PEO) works on chromosomes represented by a sequence of binary bits. The decoded value of a chromosome refers to the genetic contribution to the phenotype, and a certain pattern of binary bits in the chromosome string encodes specific information. PEO forms multiple species and evolves them through operations mimicked from plant biology: Flipping, Pollination, Breeding, Discrimination, and Epimutation [24, 25, 26, 27].
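The paper does not spell out the genotype-to-phenotype mapping, so the following minimal sketch assumes a standard fixed-point binary decoding; the function name decode and the example domain are illustrative assumptions only.

```python
import numpy as np

def decode(chromosome, lower, upper):
    """Map a binary chromosome (array of 0/1 bits) to a real phenotype value in
    [lower, upper] using standard fixed-point binary decoding (an assumption)."""
    bits = np.asarray(chromosome, dtype=int)
    integer = bits.dot(2 ** np.arange(bits.size)[::-1])  # read the bit string as an unsigned integer
    return lower + (upper - lower) * integer / (2 ** bits.size - 1)

# Example: a 10-bit gene decoded into the domain [-5.12, 5.12] of the Rastrigin function
gene = np.random.randint(0, 2, 10)
print(decode(gene, -5.12, 5.12))
```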
2.1. Flipper operator

The flipper operation is based upon the double helix structure of DNA as shown in Fig. 1a, where the sense strand (3′-5′) and the anti-sense strand (5′-3′) of DNA are arranged in opposite directions [24]. Each nth individual plant in the ith species is represented by a double-strand structured artificial DNA, $\mathrm{DNA}_{i,n}$, composed of the sense chromosome $r_{i,n}$ and the anti-sense chromosome $v_{i,n}$, defined as $\mathrm{DNA}_{i,n} = [r_{i,n}, v_{i,n}]^T$. $\mathrm{DNA}_{i,n}$ is represented by a double-row binary matrix of chromosome bits. Thus, there are two subpopulations for each species: one made of the sense chromosomes $r_{i,n}$ and the other of the associated anti-sense chromosomes $v_{i,n}$. It is worth noting that the evolutionary operations are carried out on the former subpopulation, while the latter is obtained by the flipper operation. Thus, the subpopulation containing $r_{i,n}$ is treated as the real population, whereas the latter is virtual. The flipper operation represents the natural process of forming the (5′-3′) strand from the (3′-5′) strand by reversing the order of bits of the $r_{i,n}$ chromosome in the real subpopulation, and is defined as

$\mathcal{F}(r_{i,n}) \Longrightarrow v_{i,n}[l] = r_{i,n}[l_{end} - l + 1]$,

where $l$ is the location of a bit in the chromosome and $l_{end}$ represents the last bit location.

2.2. Pollination operator

The pollination operation simulates the pollination behavior of plants, wherein plants from different species are selected to fertilize. It is the way in which, to produce an offspring DNA, the sense strand and the antisense strand are selected from different parent DNAs. In the proposed evolutionary algorithm, two types of pollination are used in sequence, named here random and sequential. Random pollination randomly picks a pair of species from the available even number of species as $[S_j, S_k]$ with $j \leftarrow rand(1, n)$, $k \leftarrow rand(1, n)$, and $j \neq k$. As a result of the sequential pollination, the individual DNAs from the selected species $[\mathrm{DNA}_{j,n}, \mathrm{DNA}_{k,n}]$ are taken from top to bottom, one by one in sequence, for cross-fertilization. The breeding process to produce offspring follows the natural fertilization between opposite strand chromosomes of different DNAs.
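A minimal sketch of the flipper and random pollination operations described above; the function names and the two-row matrix layout are illustrative assumptions.

```python
import numpy as np

def flipper(r):
    """Flipper operation F: form the anti-sense strand v by reversing the bit
    order of the sense strand r, i.e. v[l] = r[l_end - l + 1]."""
    return r[::-1].copy()

def random_pollination(num_species, rng=np.random):
    """Random pollination: pick a pair of distinct species indices (j, k), j != k."""
    j, k = rng.choice(num_species, size=2, replace=False)
    return j, k

# Example: a 12-bit sense chromosome, its anti-sense counterpart, and a species pair
r = np.random.randint(0, 2, 12)
dna = np.vstack([r, flipper(r)])   # double-strand artificial DNA as a 2-row binary matrix
print(dna)
print(random_pollination(4))
```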
2.3. Breeding operator

Mendel indicates in his evolutionary theory that, in two successive generations, offspring F1 and F2 are produced by cross-breeding of different species and by self-breeding, where a plant breeds with itself, respectively. The breeding operation captures the phenomenon of transferring the dominant and the recessive genes in the first (F1) and the second (F2) generations, respectively, as observed in plant breeding. To produce the offspring of the F1 generation, the parent $r_{j,n} \in \mathrm{DNA}_{j,n}$ and the parent $v_{k,n} \in \mathrm{DNA}_{k,n}$ are considered as breeding partners from species $S_j$ and $S_k$. After breeding, the resulting sense chromosome $r^{F_1}_{j,n}$ belongs to the F1 generation offspring of the jth species, while the required anti-sense strand $v^{F_1}_{j,n}$ of the same offspring is obtained by applying the flipper operation as described above. According to Mendel's theory, F1 generation offspring are composed of dominant genes from the parent of either one species or the other. A stochastic approach is followed to select the dominant genes from the two parents as

$r^{F_1}_{j,n}[l] = \begin{cases} r_{j,n}[l] & \gamma \geq 0.5 \\ v_{k,n}[l] & \text{otherwise.} \end{cases}$
Here $\gamma$ is a uniformly distributed random number between 0 and 1. Thus, the resulting F1 generation offspring DNA is $\mathrm{DNA}^{F_1}_{j,n} = [r^{F_1}_{j,n}, \mathcal{F}(r^{F_1}_{j,n})]^T$. Similarly, the self-breeding operation fertilizes the F1 generation offspring with itself to produce the sense strand $r^{F_2}_{j,n}$ of the F2 generation offspring. In this process, the gene in $r^{F_1}_{j,n}$ at locus $l$ is replaced by its recessive gene preserved from the ancestor chromosome $a_{j,n} \in IP$, where $IP$ is the individual pseudo-best solution. Recessive genes from ancestors dominate based on the Mendelian probability $\rho$:

$r^{F_2}_{j,n}[l] = r^{F_1}_{j,n}[l] \leftarrow a_{j,n}[l]$ (replacement with probability $\rho$).

Thus, the resulting F2 generation offspring DNA is $\mathrm{DNA}^{F_2}_{j,n} = [r^{F_2}_{j,n}, \mathcal{F}(r^{F_2}_{j,n})]^T$.
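A minimal sketch of the two breeding steps above; the function names are assumptions, and rho is the Mendelian probability left as a free parameter.

```python
import numpy as np

def breed_f1(r_j, v_k, rng):
    """F1 cross-breeding: at each locus take the gene of the sense strand of parent j
    when gamma >= 0.5, otherwise the gene of the anti-sense strand of parent k."""
    gamma = rng.random(r_j.size)
    return np.where(gamma >= 0.5, r_j, v_k)

def breed_f2(r_f1, ancestor, rho, rng):
    """F2 self-breeding: replace genes of the F1 offspring by the recessive genes
    preserved in the ancestor chromosome with Mendelian probability rho."""
    mask = rng.random(r_f1.size) < rho
    return np.where(mask, ancestor, r_f1)

# Example on 8-bit strands; rho = 0.3 is an arbitrary illustrative value
rng = np.random.default_rng(0)
r_j, v_k, ancestor = (rng.integers(0, 2, 8) for _ in range(3))
f1 = breed_f1(r_j, v_k, rng)
f2 = breed_f2(f1, ancestor, rho=0.3, rng=rng)
print(f1, f2)
```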
2.3.1. Discrimination operator

Two consecutive generations of offspring complete one evolution, after which the new species are expected to contain individuals with better fitness than before. From the Mendelian offspring produced in the F1 and F2 generations, all offspring are compared based on their fitness values; the topnotch candidates are then retained as parents for the next evolution, while the rest are excluded. The retained parents are shuffled to preserve diversity. The discriminator operator D is described as follows: D(S_i) = Shuffle(Select(Sort(S_i))), where S_i is a species set of DNAs, the function "Select(Sort(·))" picks the first best n individuals, and the "Shuffle" function shuffles them to remove biasing towards the best candidate and to preserve diversity.

2.4. Epimutation operator

In biology, epimutation is known as the evolution of the organism followed by cycles of the self-organizing sequential behavior of nature in mutation and rehabilitation [27]. It plays an important role in unfavorable environmental situations, enabling an organism to adapt itself again and again to the conditions. Several attempts of this mechanism over the life cycle of an organism lead to an evolved chromosome before going to the next evolution process. According to Joseph Heitman, epimutation is a reversible process and gives the organism more flexibility [28]. This concept leads to rehabilitation, where an organism self-improves, maintains, and restores its physical strength, cognition, and mobility. This process is shown in Fig. 1b, where the plant self-organizes itself against multiple environmental effects defined by the epimutation number ξ. Due to this procedure, the recessive characteristics of the plant, which are represented by the Current Pseudo Best Solution (CPBS) up to the present evolution, undergo the mutation process multiple times over the plant life cycle. If the plant finds a better mutation by evolving its inherent chromosome properties, it adopts it; otherwise it returns to the previous state as rehabilitation. The associated epimutation probability should be very low, which is very important for "fine tuning" of the pseudo-best solution.

2.5. Proposed evolutionary optimization algorithm

Pseudo-code for the proposed evolutionary algorithm is given in Algorithm 1, which concludes the solution once one of the following termination criteria is met: i) the maximum number of iterations is reached; ii) the average of the species does not change over iter iterations; iii) the same answer is obtained in each successive generation m times; iv) the error falls below 10^-5 or a desired value.
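Before the full pseudo-code, here is a minimal sketch of the discrimination and epimutation operators described above, under stated assumptions (minimization, bit-flip mutation); the function names are illustrative.

```python
import numpy as np

def discriminate(population, fitness, n, rng):
    """D(S) = Shuffle(Select(Sort(S))): sort a species by fitness (minimization assumed),
    keep the best n individuals as parents, and shuffle them to remove ordering bias."""
    survivors = population[np.argsort(fitness)[:n]]
    rng.shuffle(survivors)                            # in-place shuffle of the retained parents
    return survivors

def epimutate(best, cost, xi, prob, rng):
    """Epimutation: try xi low-probability bit-flip mutations of the current pseudo-best
    chromosome; adopt a mutant only if it improves the cost, otherwise rehabilitate
    (revert) to the previous state."""
    current, current_cost = best.copy(), cost(best)
    for _ in range(xi):
        mutant = current.copy()
        mutant[rng.random(mutant.size) < prob] ^= 1   # flip a few randomly chosen bits
        mutant_cost = cost(mutant)
        if mutant_cost < current_cost:                # keep only improving mutations
            current, current_cost = mutant, mutant_cost
    return current
```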
Algorithm 1 Pseudo-code for the proposed evolutionary algorithm
Require: M is the number of species, Nv, Nb, N, iter, CostFunction, ξ.
for j ← 1 to M do
    for n ← 1 to N do
        DNA(j).r(n): initialize subpopulation of sense strands randomly
        DNA(j).v(n) ← F(DNA(j).r(n)): apply flipper operation to initialize subpopulation of anti-sense strands
        R(j).rR(n) ← DNA(j).r(n): initialize recessive genes associated with the sense strand as best characteristics
        R(j).vR(n) ← DNA(j).v(n): initialize recessive genes associated with the anti-sense strand as best characteristics
        IP(n) ← best of [R(j).r(n), R(j).v(n)]: IP belongs to the individual pseudo-best solution
    end for
    SP(j) ← best of [IP]: SP belongs to the species pseudo-best solution
end for
GP(1) ← best of [SP]: GP belongs to the global pseudo-best solution (GPBS)
i ← 2
while termination criteria not met do
    for j ← 1 to M do
        k ← select second species randomly for cross-pollination
        for n ← 1 to N do
            F1(j): produce F1 offspring (multiple offspring could be generated)
            F2(j): produce F2 offspring (multiple offspring could be generated)
            S ← [F1(j), F2(j)]: select best offspring to go into the next evolution
            [DNA(j).r(n), DNA(j).v(n)] ← D(S): replace old values based on individual fitness
            if R(j).FrR(n) < DNA(j).Fr(n) then
                R(j).rR(n) ← DNA(j).r(n): replace previous recessive chromosomes belonging to the sense strand
            end if
            if R(j).FvR(n) < DNA(j).Fv(n) then
                R(j).vR(n) ← DNA(j).v(n): replace previous recessive chromosomes belonging to the anti-sense strand
            end if
            R(j) ← Epimutation on R(j) based on ξ and ρ
            IP(n) ← best of [R(j).r(n), R(j).v(n)]
        end for
        SP(j) ← best of [IP]
    end for
    GP(i) ← best of [SP]
    if Fitness(GP(i)) < Fitness(GP(i-1)) then
        GP(i) ← GP(i-1)
    end if
    i ← i + 1
end while
return GP
3. Experimental Evaluation: Benchmarking the proposed evolutionary optimization

Experimental evaluation is carried out for the purpose of benchmarking the proposed evolutionary computational algorithm by utilizing a number of complex continuous/discrete, analytical/non-analytical, separable/non-separable benchmark test functions from CEC'2008 [29, 30, 31]. Averaged simulation results of Monte Carlo runs on these functions helped us to set the parameters of the proposed optimizer for better performance; their details and corresponding domains are listed in the literature [29, 30, 31]. A list of the selected benchmark test functions for the experimentation is given in Table 1. According to the literature, many local-minima functions are difficult to solve in more than ten dimensions, where the available algorithms show limitations; this is one of the motives for proposing a new evolutionary algorithm. Here, the evaluation uses benchmark test problems of thirty variables, and the proposed method has been compared to eight prominent evolutionary algorithms of different classes, i.e., SLPSO, SFLA, MSGA, GSA, GSO, CS, PBA, and HDSO. Each optimizer has a total of 100 individuals/particles/chromosomes/DNAs, which are split equally among the species, and is carried out for 500 evolution epochs.
Table 1: Benchmark functions to evaluate the efficiency of the proposed optimizer.
Class/type: Benchmark functions
Multi-modal: Wavy function (WF), Schwefel function No. 226 (SF26), Inverted cosine-wave function (ICW), Rastrigin function (RF), Dropwave function (DF), Langermann function (LF)
Steep ridge/drop: Deceptive function (DCF), Michalewicz function (MWF)
Unimodal: Schwefel function No. 222 (SWF222)
Shifted and rotated (SR): SR Rastrigin (SRRF), SR Keane's bump (SRKB)
Composite function (CF): CF-1 (scaled composition of ten shifted and rotated Sphere functions), CF-2 (scaled composition of ten shifted and rotated Griewank functions), CF-3 (scaled composition of ten shifted and rotated Rastrigin functions), HCF-1 (scaled composition of two Ackley, two Rastrigin, two Weierstrass, two Griewank, and two Sphere functions), HCF-2 (scaled composition of two Rastrigin, two Weierstrass, two Griewank, two Ackley, and two Sphere functions), HCF-3 (same configuration as HCF-2, but with different σ)
Noisy: Noisy Sphere function (NSF), Noisy CF-1 (NCF-1), Noisy HCF-1 (NHCF-1), Noisy HCF-2 (NHCF-2)
Other: Keane's bump function (KBF), Step-2 function (STF2)
Table 2: Parameters of optimizers
Optimizers: Parameters
PEO: Nb = 35, ρ ∈ [0, 1], ξ = 2, M = 2, mutation probability = 0.05, number of F1 and F2 generation offspring = 2
SLPSO [11]: c3 = n/(m · 0.01), m set to the population size
SFLA [17]: number of memeplexes = 2, number of parents = max(round(0.3 · nPopMemeplex), 2), nPopMemeplex = 50, number of offspring = 3, maximum number of iterations = 5, step size = 2
MSGA: M = 2, crossover probability = 0.8, mutation probability = 0.05, hybridization of roulette wheel, tournament, and uniform selection; hybridization of single-point, two-point, and uniform crossover
GSA [22]: ElitistCheck = 1, Rpower = 1, Findex = 1, Rnorm = 2
GSO [21]: number of producers = 1, proportion of scroungers = 0.8, a = round(sqrt(n + 1)), θmax = π/a², αmax = θmax/2, error = 0, stall = 50, verbose = 1
CS [19]: discovery rate of alien eggs/solutions = 0.25, β = 3/2
PBA [20]: number of scout bees equal to nPop, recruited bees scale = round(0.3 · nScoutBee), neighborhood radius = 0.1 · (max variable limit − min variable limit), neighborhood radius damp rate = 0.95
HDSO [32]: size of one clan = Nv, hybridization of bijective DSA and surjective DSA
On each benchmark function, all algorithms have been run 100 times, and statistical comparative performances are presented to show the effectiveness of the proposed optimizer. Parameters of the algorithms are given in Table 2. Four attributes are used to evaluate the efficiency of all the above optimizers. First, the average best value µ is calculated from the solutions achieved in all 100 individual runs. Second, the standard deviation σ of the achieved solutions over the 100 runs shows the spread of the obtained results. Third, B is the best solution achieved by an optimizer in the 100 individual runs. Fourth, the last attribute C reflects the robustness and consistency of the algorithm in percentage (%) and represents the number of times the optimizer gives a result below a certain threshold, where here the mean value has been used for a better comparison. While the proposed optimizer outperforms the other optimizers on most of the functions, as can be observed in Table 3, on some functions, for example HCF-2, its performance is comparable rather than superior. The statistical comparative results for problems of thirty variables are shown in Table 3. It can be observed from the results that the proposed optimizer solves the undertaken functions efficiently with high C, which indicates the robustness of PEO for problems with a higher number of variables. Here, we can observe that for the SWF222 and STF2 functions PEO achieves an acceptable solution comparable to SLPSO, SFLA, and GSA, where the other optimizers fail. Also, we can observe that the other algorithms suffer from convergence problems on WF, SF26, ICW, RF, DF, LF, DCF, MWF,
and KBF, where PEO achieves the best solution in a robust way in each subsequent generation. In this table, more complicated functions such as CF-1, CF-2, CF-3, HCF-1, and HCF-3 are efficiently solved by PEO, while for HCF-2 the best performance is given by GSA and HDSO. Shifted and rotated (SR) functions such as SRRF and SRKB are solved with better performance by PEO, where the other optimizers show limitations in all performance measures. We have observed that when noise is mixed into the functions, for example NSF, NHCF-1, and NHCF-2, the performance of PEO is noticeable. Overall, from the presented simulation results on all functions, we notice that PEO performs well on multi-modal functions, which are among the most complex functions in the literature so far. On top of the above analytical points, we can observe from Table 3 that PEO possesses the least standard deviation with the highest consistency on most of the functions.
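The four attributes of Table 3 could be computed from the 100 independent runs as in the following sketch; here C is taken as the percentage of runs below the mean, following the description above, and the sample data are hypothetical.

```python
import numpy as np

def run_statistics(best_values):
    """Compute the four attributes reported in Table 3 from the best values of the
    individual runs: mean, standard deviation, best solution, and consistency C
    (here taken as the percentage of runs below the mean, for minimization)."""
    v = np.asarray(best_values, dtype=float)
    mu, sigma, best = v.mean(), v.std(), v.min()
    consistency = 100.0 * np.mean(v < mu)
    return mu, sigma, best, consistency

# Example with 100 hypothetical run results
print(run_statistics(np.random.lognormal(size=100)))
```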
Table 3: Results on 30 variables benchmark test functions

Function  Metric  PEO        SLPSO      SFLA       MSGA       GSA        GSO        CS         PBA        HDSO
WF        µ       2.06E-03   6.40E-01   3.45E-01   1.33E-01   1.12E-01   5.29E-01   4.46E-01   4.23E-01   1.73E-01
          σ       3.66E-03   3.32E-02   4.58E-02   3.08E-02   2.41E-02   5.44E-02   2.32E-02   2.47E-02   3.57E-02
          B       1.11E-16   5.54E-01   2.49E-01   6.55E-02   6.51E-02   3.88E-01   3.57E-01   3.60E-01   8.69E-02
          C       74%        0%         0%         0%         0%         0%         0%         0%         0%
SF26      µ       -1.26E+04  -1.11E+04  -7.59E+03  -1.16E+04  -3.00E+03  -4.55E+03  -8.41E+03  -2.58E+03  -9.88E+03
          σ       9.34E+00   2.91E+02   1.03E+03   3.09E+02   4.62E+02   6.38E+02   1.74E+02   4.74E+02   5.26E+02
          B       -1.26E+04  -1.20E+04  -9.43E+03  -1.21E+04  -4.61E+03  -6.57E+03  -8.89E+03  -4.14E+03  -1.10E+04
          C       92%        0%         0%         0%         0%         0%         0%         0%         0%
ICW       µ       -2.41E+01  -1.66E+01  -1.54E+01  -2.15E+01  -2.15E+01  -1.14E+01  -1.22E+01  -3.49E+00  -5.99E+00
          σ       1.62E+00   1.56E+00   1.76E+00   1.04E+00   9.45E-01   1.74E+00   6.47E-01   6.83E-01   4.89E-01
          B       -2.77E+01  -2.01E+01  -1.90E+01  -2.41E+01  -2.35E+01  -1.54E+01  -1.42E+01  -6.33E+00  -7.40E+00
          C       48%        0%         0%         0%         0%         0%         0%         0%         0%
RF        µ       3.27E+00   1.54E+01   7.67E+01   3.43E+01   8.73E+00   1.60E+02   1.13E+02   1.44E+02   3.61E+01
          σ       1.97E+00   5.40E+00   2.05E+01   9.02E+00   2.81E+00   2.47E+01   1.11E+01   1.84E+01   7.17E+00
          B       3.59E-11   6.97E+00   3.08E+01   1.68E+01   1.99E+00   9.27E+01   8.59E+01   7.86E+01   2.06E+01
          C       64%        0%         0%         0%         1%         0%         0%         0%         0%
DF        µ       -2.69E+01  -2.34E+01  -2.00E+01  -2.51E+01  -2.65E+01  -1.57E+01  -1.49E+01  -6.17E+00  -2.33E+01
          σ       3.73E-01   2.17E+00   1.76E+00   7.39E-01   4.44E-01   1.54E+00   6.98E-01   6.75E-01   6.29E-01
          B       -2.78E+01  -2.70E+01  -2.41E+01  -2.65E+01  -2.73E+01  -1.95E+01  -1.67E+01  -8.34E+00  -2.50E+01
          C       63%        4%         0%         0%         12%        0%         0%         0%         0%
LF        µ       -2.29E+02  -1.96E+02  -1.42E+02  -2.10E+02  -4.36E+01  -6.31E+01  -1.08E+02  -6.18E+01  -1.70E+02
          σ       1.20E+01   1.48E+01   2.37E+01   1.12E+01   7.08E+00   9.02E+00   6.19E+00   6.04E+00   1.17E+01
          B       -2.49E+02  -2.20E+02  -1.82E+02  -2.33E+02  -6.72E+01  -8.41E+01  -1.24E+02  -7.87E+01  -1.96E+02
          C       54%        0%         0%         4%         0%         0%         0%         0%         0%
DCF       µ       -9.60E-01  -6.90E-01  -4.96E-01  -8.27E-01  -5.74E-01  -3.45E-01  -6.32E-01  -2.62E-01  -7.31E-01
          σ       1.94E-02   2.27E-02   1.89E-01   2.71E-02   2.71E-02   2.87E-02   1.38E-02   2.18E-02   2.28E-02
          B       -9.96E-01  -7.74E-01  -8.14E-01  -8.86E-01  -6.37E-01  -4.09E-01  -6.72E-01  -3.27E-01  -7.86E-01
          C       49%        0%         0%         0%         0%         0%         0%         0%         0%
MWF       µ       -2.88E+01  -1.49E+01  -2.49E+01  -2.66E+01  -2.79E+01  -1.51E+01  -1.66E+01  -6.38E+00  -9.65E+00
          σ       2.28E-01   9.77E-01   1.11E+00   6.05E-01   1.02E+00   1.57E+00   5.77E-01   6.95E-01   5.44E-01
          B       -2.93E+01  -1.72E+01  -2.67E+01  -2.80E+01  -2.91E+01  -1.82E+01  -1.78E+01  -8.32E+00  -1.11E+01
          C       59%        0%         0%         0%         24%        0%         0%         0%         0%
SWF222    µ       6.69E-08   6.53E-22   2.24E-24   1.59E-02   5.46E-18   8.81E+01   4.93E+01   3.06E+02   4.26E+01
          σ       1.73E-07   5.27E-22   4.32E-24   8.55E-03   1.46E-18   4.09E+01   1.77E+01   9.03E+01   4.40E+01
          B       8.90E-14   4.83E-23   2.80E-26   2.44E-03   2.19E-18   3.21E+01   1.81E+01   1.17E+02   3.97E+00
          C       79%        100%       100%       0%         100%       0%         0%         0%         0%
CF-1      µ       -2.76E+00  1.09E+01   1.20E+02   2.05E+01   -2.45E-04  1.28E+02   1.14E+02   3.15E+01   9.30E+01
          σ       9.66E+00   8.36E+00   6.49E+01   8.91E+00   1.23E-03   5.17E+01   5.81E+01   2.90E+01   6.02E+01
          B       -3.64E+01  2.70E-09   -3.84E+01  3.70E+00   -6.13E-03  5.71E+00   -1.04E+02  -1.02E+00  -7.97E+01
          C       48%        0%         4%         0%         0%         0%         4%         0%         4%
CF-2      µ       3.14E+01   2.32E+02   2.56E+02   1.86E+02   3.47E+02   7.16E+02   4.46E+01   5.66E+02   5.07E+01
          σ       3.95E+01   1.11E+02   1.19E+02   1.27E+02   1.13E+02   6.51E+01   1.32E+01   7.65E+01   3.50E+01
          B       6.35E+00   1.00E+02   5.82E-01   1.69E+01   1.00E+02   5.34E+02   3.06E+01   4.28E+02   1.14E+01
          C       80%        0%         4%         24%        0%         0%         4%         0%         40%
CF-3      µ       2.47E+02   4.23E+02   3.14E+02   3.26E+02   2.55E+02   1.01E+03   3.81E+02   8.78E+02   4.44E+02
          σ       6.82E+01   1.35E+02   6.83E+01   9.25E+01   4.07E+01   1.03E+02   4.48E+01   7.85E+01   6.08E+01
          B       1.07E+02   8.82E+01   1.80E+02   1.63E+02   1.55E+02   7.59E+02   3.11E+02   7.65E+02   2.91E+02
          C       60%        16%        12%        20%        32%        0%         0%         0%         0%
Table 3 (continued): Results on 30 variables benchmark test functions

Function  Metric  PEO        SLPSO      SFLA       MSGA       GSA        GSO        CS         PBA        HDSO
HCF-1     µ       3.76E+02   4.89E+02   4.64E+02   4.71E+02   3.92E+02   9.11E+02   5.48E+02   8.59E+02   5.39E+02
          σ       8.84E+01   5.52E+01   8.34E+01   7.63E+01   5.20E+01   9.26E+01   3.46E+01   9.65E+01   4.49E+01
          B       2.46E+02   4.21E+02   3.26E+02   3.34E+02   2.93E+02   7.38E+02   4.61E+02   5.38E+02   4.53E+02
          C       57%        0%         17%        13%        17%        0%         0%         0%         0%
HCF-2     µ       2.21E+01   2.39E+01   2.45E+01   2.91E+01   4.41E+00   4.97E+02   2.79E+01   4.33E+02   1.69E+01
          σ       6.82E+00   7.84E+01   7.83E+01   5.67E+00   2.47E+00   1.63E+02   3.40E+00   7.42E+01   2.49E+00
          B       6.52E+00   8.56E-01   5.38E+00   1.81E+01   1.38E+00   3.05E+02   2.18E+01   2.80E+02   1.14E+01
          C       52%        96%        96%        16%        100%       0%         4%         0%         100%
HCF-3     µ       -1.01E+01  1.33E+02   2.30E+02   1.52E+02   1.15E+02   1.80E+02   1.64E+02   4.49E+01   2.03E+02
          σ       5.87E+01   4.83E+01   7.83E+01   5.61E+01   5.22E+01   9.70E+01   1.05E+02   2.85E+01   1.03E+02
          B       -1.81E+02  3.15E+01   2.23E+01   3.80E+01   -5.15E+01  1.21E+00   -7.49E+01  4.78E+00   -7.85E+01
          C       52%        0%         0%         0%         4%         0%         12%        0%         4%
SRRF      µ       8.01E+00   9.41E+00   1.40E+02   3.11E+01   2.72E+01   1.71E+02   1.91E+02   1.35E+02   1.80E+02
          σ       2.41E+00   3.36E+00   3.35E+01   7.08E+00   4.60E+00   3.61E+01   2.41E+01   1.83E+01   2.79E+01
          B       3.95E+00   4.80E+00   4.95E+01   1.99E+01   1.90E+01   1.08E+02   1.26E+02   8.55E+01   1.29E+02
          C       50%        38%        0%         0%         0%         0%         0%         0%         0%
SRKB      µ       -5.79E+02  -5.42E+02  -3.47E+02  -4.67E+02  -3.30E+02  -1.63E+02  -2.70E+02  -7.55E+01  -4.08E+02
          σ       5.80E+00   1.54E+01   3.51E+01   2.42E+01   1.80E+01   2.18E+01   1.25E+01   1.22E+01   2.41E+01
          B       -5.87E+02  -5.64E+02  -4.04E+02  -5.12E+02  -3.64E+02  -1.92E+02  -3.03E+02  -1.02E+02  -4.56E+02
          C       52%        0%         0%         0%         0%         0%         0%         0%         0%
NSF       µ       -1.43E+02  2.42E-11   2.09E+03   2.45E+02   1.00E+00   5.13E+03   7.19E+03   3.26E+03   6.89E+03
          σ       6.34E+02   6.57E-11   2.25E+03   2.52E+02   4.42E+00   2.74E+03   2.53E+03   1.90E+03   4.35E+03
          B       -4.14E+03  6.76E-13   7.00E+01   -7.42E+01  -2.24E-05  1.49E+03   -3.03E+03  8.77E+01   -3.61E+03
          C       12%        0%         0%         0%         0%         0%         2%         0%         2%
NHCF-1    µ       2.90E+00   9.55E+01   2.08E+02   1.20E+02   7.10E+01   1.68E+02   1.34E+02   3.57E+01   1.99E+02
          σ       4.54E+01   3.25E+01   8.68E+01   3.73E+01   2.71E+01   6.99E+01   1.17E+02   2.25E+01   7.09E+01
          B       -1.31E+02  2.27E+01   2.24E+01   3.28E+01   1.20E+01   4.94E+01   -1.60E+02  3.78E+00   4.45E+01
          C       40%        0%         0%         0%         0%         0%         8%         0%         0%
NHCF-2    µ       -3.71E+00  2.80E+00   1.64E+01   9.46E+00   -6.67E-01  1.63E+02   5.05E+01   3.27E+01   4.69E+01
          σ       8.55E+00   9.37E-01   1.33E+01   6.32E+00   8.72E+00   1.02E+02   2.62E+01   2.61E+01   1.83E+01
          B       -3.86E+01  5.69E-01   -8.65E-01  -2.68E+00  -4.23E+01  -4.84E+01  8.85E+00   -7.35E+00  2.11E+01
          C       28%        0%         0%         0%         4%         8%         0%         8%         0%
KBF       µ       -8.09E-01  -4.78E-01  -4.86E-01  -7.45E-01  -3.88E-01  -2.25E-01  -4.46E-01  -1.40E-01  -2.64E-01
          σ       7.87E-03   9.71E-02   1.21E-01   2.47E-02   2.10E-02   2.76E-02   4.11E-03   1.17E-02   1.90E-02
          B       -8.20E-01  -7.98E-01  -7.55E-01  -7.97E-01  -4.32E-01  -3.40E-01  -4.67E-01  -1.68E-01  -3.21E-01
          C       0.63%      0%         0%         0%         0%         0%         0%         0%         0%
STF2      µ       0.00E+00   0.00E+00   1.83E+00   1.97E+00   0.00E+00   5.26E+03   5.35E+01   2.26E+04   3.35E+00
          σ       0.00E+00   0.00E+00   1.16E+01   1.38E+00   0.00E+00   1.25E+03   1.50E+01   3.99E+03   1.33E+00
          B       0.00E+00   0.00E+00   0.00E+00   0.00E+00   0.00E+00   2.61E+03   2.00E+01   9.14E+03   0.00E+00
          C       100%       100%       73%        16%        100%       0%         0%         0%         1%
4. Statistical ranking

We have experimented with the proposed optimizer on a variety of functions as given in Table 1. It is worth discussing the features and limitations of the proposed optimizer based on the experimental results. For this, we rank the optimizers statistically on their average performance. For each function, we assign an efficiency score to each optimizer according to its rank, in the range from one to nine, assigned to performances from worst to best, respectively, since there are nine optimizers in this comparative analysis. After grading all optimizers for each function, the average grade S for each optimizer is calculated. The obtained scores and the corresponding rankings are presented in Table 4. For the test functions shown in Table 3, PEO scores 8.77 points and secures the first position, with the others far behind: MSGA, GSA, and SFLA are at the second, third, and fourth positions with 6.36, 6.32, and 6.14, respectively. Moreover, to incorporate the standard deviation along with the mean value in the ranking process, an A-TOPSIS analysis [33] is utilized, where ζ scores the optimizers; it weights the effect of µ and σ as 60% and 40%, respectively, for a precise ranking of the optimizers.
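The simple rank-averaging score S described above could be computed as in the following sketch (the A-TOPSIS score ζ follows the procedure in [33] and is not reproduced here); the dictionary layout and the example numbers are illustrative assumptions.

```python
import numpy as np

def average_rank_scores(mean_table):
    """Average-rank score S: on each function, rank the optimizers by mean best value
    (the best of the nine gets 9 points, the worst gets 1) and average the points over
    all functions. mean_table maps a function name to the per-optimizer means in a
    fixed optimizer order."""
    scores = None
    for values in mean_table.values():
        values = np.asarray(values, dtype=float)
        ranks = values.argsort().argsort()        # 0 = smallest (best) mean, for minimization
        points = values.size - ranks              # best gets the maximum number of points
        scores = points if scores is None else scores + points
    return scores / len(mean_table)

# Tiny example with three optimizers and two functions (hypothetical numbers)
print(average_rank_scores({"WF": [0.002, 0.64, 0.35], "RF": [3.3, 15.4, 76.7]}))  # [3.  1.5 1.5]
```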
Table 4: Statistical ranking of the optimizers

Rank  Scoring: Optimizer (S)  A-TOPSIS: Optimizer (ζ)
1     PEO (8.77)              PEO (0.63)
2     MSGA (6.36)             MSGA (0.61)
3     GSA (6.32)              SLPSO (0.56)
4     SFLA (6.14)             GSA (0.43)
5     SLPSO (4.73)            SFLA (0.42)
6     HDSO (4.23)             HDSO (0.34)
7     CS (3.64)               CS (0.26)
8     PBA (2.68)              PBA (0.16)
9     GSO (2.14)              GSO (0.15)
It is based on the hypothesis that µ should be dominating. The ζ values obtained using the A-TOPSIS algorithm are given in Table 4 for the functions presented in Table 3. According to the results obtained by this method, PEO again appears as the best performer.

5. Future research scope

PEO has a considerable scope for further improvement, as discussed in this section. The first stage of improvement concerns the pollination scheme, where two species and individuals of two species are selected to fertilize. To introduce more diversity, random pollination, tournament pollination, roulette wheel pollination, etc. can be tested. The second improvement is at the stage of producing F1 generation offspring, where different offspring production techniques, such as single-point, multi-point, uniform, and mid-point techniques as used in GA, can be tested. Third, optimizing the parameters of PEO, especially the Mendelian probability and the mutation probability, can be an impactful factor for increasing the efficiency of the optimizer. Moreover, the performance of PEO can be increased by the methods given in [34, 35, 36], whereby utilizing binary tree memory and spline interpolation [37] the number of function evaluations could be minimized. Also, a real coded version of the proposed evolutionary computation method may have a vital impact in solving complex functions more efficiently. Thus, at this juncture, we can observe that PEO has a wide scope for further improvement and may become a dominant optimization algorithm for most large-scale complex problems from sociology, engineering, topology, graph theory, biology, etc.

6. Conclusion

This paper presents a new bio-inspired metaheuristic binary coded evolutionary algorithm for optimization purposes. This algorithm imitates the evolutionary theory in plant genetics, where the recessive genes are transmitted to the next generation with some probability. Five operators, named Flipper, Pollination, Breeding, Discrimination, and Epimutation, are introduced and sequentially implemented to design the evolutionary computational algorithm. PEO works on a two-chromosome DNA structure instead of a single chromosome, where the two strands are opposite to each other in nature. The concept of recessive gene transfer helps the optimizer not to deviate from the best solution already achieved in previous evolutions. Additionally, Epimutation exploits the neighboring points based on the epimutation probability. To benchmark the proposed optimizer, we have compared it with eight optimizers of different nature on a diverse set of complex benchmark test problems. Simulation results on thirty-variable test problems indicate the higher efficiency of PEO compared to the other optimizers. For multimodal, complex, non-convex, non-differentiable, noisy, and deceptive functions, PEO shows good performance with higher consistency. In benchmarking PEO on rotated-shifted, composite, and hybrid problems, which are known to be difficult, we found that PEO outperforms the other optimizers. In conclusion, PEO, as a nature-inspired genetic evolutionary algorithm which utilizes the standard structure of genotype and phenotype for a double-strand DNA, can be used on a wide range of problems without any special guidelines. Future works are expected on parametric control, complexity analysis, convergence analysis, testing the algorithm for higher numbers of variables, and its deployment for solving real-life problems.

References

[1] I. Boussaïd, J. Lepagnot, & P.
Siarry, “A survey on optimization metaheuristics,” Information Sciences, vol. 237, pp. 82–117, 2013.
[2] N. Gupta, N. Patel, B. N. Tiwari, & M. Khosravy, "Genetic Algorithm based on Enhanced Selection and Log-scaled Mutation Technique," Future Technologies Conference (FTC), 13–14 November 2018.
[3] S. Kirkpatrick, C. D. Gelatt, & M. P. Vecchi, "Optimization by simulated annealing," Science, vol. 220, no. 4598, pp. 671–680, 1983.
[4] N. Krasnogor, & J. Smith, "A tutorial for competent memetic algorithms: model, taxonomy, and design issues," IEEE Transactions on Evolutionary Computation, vol. 9, no. 5, pp. 474–488, 2005.
[5] M. S. Hussin, & T. Stützle, "Tabu search vs. simulated annealing as a function of the size of quadratic assignment problem instances," Computers & Operations Research, vol. 43, pp. 286–291, 2014.
[6] M. Dorigo, & C. Blum, "Ant colony optimization theory: A survey," Theoretical Computer Science, vol. 344, no. 2–3, pp. 243–278, 2005.
[7] M. R. AlRashidi, & M. E. El-Hawary, "A survey of particle swarm optimization applications in electric power systems," IEEE Transactions on Evolutionary Computation, vol. 13, no. 4, pp. 913–918, 2009.
[8] E. K. Burke, M. Gendreau, M. Hyde, G. Kendall, G. Ochoa, E. Özcan, & R. Qu, "Hyper-heuristics: A survey of the state of the art," Journal of the Operational Research Society, vol. 64, no. 12, pp. 1695–1724, 2013.
[9] D. H. Wolpert, & W. G. Macready, "No free lunch theorems for optimization," IEEE Transactions on Evolutionary Computation, vol. 1, no. 1, pp. 67–82, 1997.
[10] J. H. Holland, "Adaptation in Natural and Artificial Systems," University of Michigan Press, 1975; MIT Press, Cambridge, 1992.
[11] C. Li, S. Yang, & T. T. Nguyen, "A self-learning particle swarm optimizer for global optimization problems," IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 42, no. 3, pp. 627–646, 2012.
[12] S. M. Islam, S. Das, S. Ghosh, S. Roy, & P. N. Suganthan, "An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 42, no. 2, pp. 482–500, 2012.
[13] M. Daneshyari, & G. G. Yen, "Cultural-based multiobjective particle swarm optimization," IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 41, no. 2, pp. 553–567, 2011.
[14] D. Manjarres, I. Landa-Torres, S. Gil-Lopez, J. Del Ser, M. N. Bilbao, S. Salcedo-Sanz, & Z. W. Geem, "A survey on applications of the harmony search algorithm," Engineering Applications of Artificial Intelligence, vol. 26, no. 8, pp. 1818–1831, 2013.
[15] D. Karaboga, B. Gorkemli, C. Ozturk, & N. Karaboga, "A comprehensive survey: artificial bee colony (ABC) algorithm and applications," Artificial Intelligence Review, vol. 42, no. 1, pp. 21–57, 2014.
[16] M. Neshat, G. Sepidnam, M. Sargolzaei, & A. N. Toosi, "Artificial fish swarm algorithm: a survey of the state-of-the-art, hybridization, combinatorial and indicative applications," Artificial Intelligence Review, pp. 1–33, 2014.
[17] M. Eusuff, K. Lansey, & F. Pasha, "Shuffled frog-leaping algorithm: a memetic meta-heuristic for discrete optimization," Engineering Optimization, vol. 38, no. 2, pp. 129–154, 2006.
[18] D. Simon, "Biogeography-based optimization," IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
[19] R. Rajabioun, "Cuckoo optimization algorithm," Applied Soft Computing, vol. 11, no. 8, pp. 5508–5518, 2011.
[20] X.-S. Yang, "Bat algorithm for multi-objective optimisation," International Journal of Bio-Inspired Computation, vol. 3, no. 5, pp. 267–274, 2011.
[21] S. He, Q. H. Wu, & J. R. Saunders, "Group search optimizer: an optimization algorithm inspired by animal searching behavior," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 973–990, 2009.
[22] E. Rashedi, H. Nezamabadi-Pour, & S. Saryazdi, "GSA: a gravitational search algorithm," Information Sciences, vol. 179, no. 13, pp. 2232–2248, 2009.
[23] M. F. Campanile, N. G. Lederman, & K. Kampourakis, "Mendelian genetics as a platform for teaching about Nature of Science and Scientific Inquiry: The value of textbooks," Science & Education, vol. 24, no. 1–2, pp. 205–225, 2015.
[24] R. R. Sinden, "DNA Structure and Function," Elsevier, 2012.
[25] R. Frankel, & E. Galun, "Pollination Mechanisms, Reproduction and Plant Breeding" (vol. 2), Springer Science & Business Media, 2012.
[26] H. Oey, & E. Whitelaw, "On the meaning of the word epimutation," Trends in Genetics, vol. 30, no. 12, pp. 519–520, 2014.
[27] E. Higgs, "Nature by Design: People, Natural Process, and Ecological Restoration," MIT Press, 2003.
[28] S. Calo, C. Shertz-Wall, S. C. Lee, R. J. Bastidas, F. E. Nicolás, J. A. Granek, P. Mieczkowski, S. Torres-Martínez, R. M. Ruiz-Vázquez, M. E. Cardenas, & J. Heitman, "Antifungal drug resistance evoked via RNAi-dependent epimutations," Nature, vol. 513, no. 7519, p. 555, 2014.
[29] P. N. Suganthan, N. Hansen, J. J. Liang, K. Deb, Y. P. Chen, A. Auger, & S. Tiwari, "Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization," Technical Report, Nanyang Technological University, Singapore, http://www.ntu.edu.sg/home/EPNSugan, 2005.
[30] http://al-roomi.org/benchmarks/unconstrained/n-dimensions
[31] K. Tang, X. Yao, P. N. Suganthan, C. MacNish, Y. P. Chen, C. M. Chen, & Z. Yang, "Benchmark Functions for the CEC 2008 Special Session and Competition on Large Scale Global Optimization," http://nical.ustc.edu.cn/cec08ss.php, 2008.
[32] P. Civicioglu, "Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm," Computers & Geosciences, vol. 46, pp. 229–247, 2012.
[33] A. Kelemenis, & D. Askounis, "A new TOPSIS-based multi-criteria approach to personnel selection," Expert Systems with Applications, vol. 37, no. 7, pp. 4999–5008, 2010.
[34] S. H. Cha, & C. C. Tappert, "A genetic algorithm for constructing compact binary decision trees," Journal of Pattern Recognition Research, vol. 4, no. 1, pp. 1–13, 2009.
[35] S. Yang, "Genetic algorithms with memory- and elitism-based immigrants in dynamic environments," Evolutionary Computation, vol. 16, no. 3, pp. 385–416, 2008.
[36] C. Thangamani, & M. Chidambaram, "A Novel Hybrid Genetic Algorithm with Weighted Crossover and Modified Particle Swarm Optimization," Artificial Intelligent Systems and Machine Learning, vol. 9, no. 2, pp. 25–30, 2017.
[37] V. B. Gantovnik, Z. Gürdal, & L. T. Watson, "A genetic algorithm with memory for optimal design of laminated sandwich composite panels," Composite Structures, vol. 58, no. 4, pp. 513–520, 2002.