Hierarchical Learning Water Cycle Algorithm

Hierarchical Learning Water Cycle Algorithm

Journal Pre-proof Hierarchical learning water cycle algorithm Caihua Chen, Peng Wang, Huachao Dong, Xinjing Wang PII: DOI: Reference: S1568-4946(19)...

2MB Sizes 0 Downloads 91 Views

Journal Pre-proof Hierarchical learning water cycle algorithm Caihua Chen, Peng Wang, Huachao Dong, Xinjing Wang

PII: DOI: Reference:

S1568-4946(19)30716-1 https://doi.org/10.1016/j.asoc.2019.105935 ASOC 105935

To appear in:

Applied Soft Computing Journal

Received date : 26 May 2019 Revised date : 23 September 2019 Accepted date : 11 November 2019 Please cite this article as: C. Chen, P. Wang, H. Dong et al., Hierarchical learning water cycle algorithm, Applied Soft Computing Journal (2019), doi: https://doi.org/10.1016/j.asoc.2019.105935. This is a PDF file of an article that has undergone enhancements after acceptance, such as the addition of a cover page and metadata, and formatting for readability, but it is not yet the definitive version of record. This version will undergo additional copyediting, typesetting and review before it is published in its final form, but we are providing this version to give early visibility of the article. Please note that, during the production process, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

© 2019 Published by Elsevier B.V.

*Highlights (for review)

Journal Pre-proof

Highlight



The updating mechanisms are diversified based on the hierarchical difference of

solutions. For the purpose of better exploring the whole design space, the high level

pro of



solutions utilize a more exploration-inclined updating mechanism. 

On the other hand, the low level solutions adaptively adjust their searching

abilities between exploration and exploitation according to the characteristics of their learning targets. 

The performance of the proposed algorithm is test on CEC 2017 benchmark suite

re-

and four well-known engineering and mechanical design problems. The results show

Jo

urn a

lP

that our proposed algorithm is very competitive.

*Manuscript Click here to view linked References

Journal Pre-proof

Hierarchical Learning Water Cycle Algorithm Caihua Chen, Peng Wang*, Huachao Dong, Xinjing Wang Northwestern Polytechnical University School of Marine Science and Technology Xi’an, China

pro of

Emails: [email protected]; [email protected]*; [email protected]; [email protected]

re-

Abstract In order to improve the global searching ability of Water Cycle Algorithm (WCA), the hierarchical learning concept is introduced and the Hierarchical Learning WCA (HLWCA) is proposed in this paper. The underlying idea of HLWCA is to divide the solutions into collections and give these collections with hierarchy differences. One of the collections has a higher hierarchy than others and utilizes an exploration-inclined updating mechanism. The solutions in this high hierarchy collection are the exemplars of other collections. The other collections are sorted according to the exemplars’ function value and the solutions in these collections actively choose whether to follow their own exemplar or not. Through different updating mechanisms of collections, the global searching ability is improved while the fast convergence and strong local search ability of WCA are retained. The proposed HLWCA is firstly experimented on IEEE CEC 2017 benchmark suite to testify its performance on complex numerical optimization tasks. Then, it is tested on four practical design benchmark problems to verify its ability of solving real-world problems. The experimental results illustrate the efficiency of the proposed algorithm. Keywords Metaheuristic, Water Cycle Algorithm, Hierarchical Learning, Active Target Choosing

lP

1.Introduction

Jo

urn a

Optimization plays important roles in many fields of science and engineering because optimization tasks exist in numerous real-world problems. Most real-world optimization problems are characterized by nonlinearity, non-convexity, discontinuity and noise, so that it is usually hard or expensive to find the exact optimum. Therefore, finding the near-optimal solution in a reasonable amount of computational resources is an effective way to solve these problems. Since metaheuristic algorithms can obtain near-optimal solutions in limited resources [1][2][3][4], they are very suitable and have been widely used to solve such complex problems over the past few decades. These metaheuristics are usually inspired by nature phenomena, animal behaviors or physical criteria [5][6], such as Genetic Algorithm (GA) [7][8], Differential Evolution (DE) [9], Artificial Bee Colony (ABC) [10][11], Particle Swarm Optimization (PSO) [12], Simulated Annealing (SA) [13], Harmony Search (HS) [14], and so on. Water Cycle Algorithm (WCA), proposed by Eskandar in 2012, is a population-based optimization technique inspired by the water cycle in real world [15]. In natural world, the streams flow to the rivers and the rivers flow to the sea. Based on this observation of the water cycle process, WCA categorizes its solutions into three kinds: , and . Besides, WCA mimics the stream-river-sea flow pattern as its information transfer framework: learns from , and learns from . Due to its inherent fast convergence tendency and simple implementation [16][17][18][19], WCA and its variants have been successfully applied in many fields such as constrained multi-objective optimization problems [20][21], the chaos suppression [22], the optimal reactive power dispatch problem [23], clustering analysis [24], long-term multi-reservoir optimization[25][26][27], and etc. Although WCA has been proved effective in handling many optimization tasks, it still has the problem of premature convergence when dealing with complex multimodal problems. Similar to other metaheuristics, the key to successfully process optimization depends heavily on a delicate balance between intensification and diversification of solutions [28][29][30]. In general, premature convergence occurs due to the poor exploration

Journal Pre-proof

re-

pro of

capability which leads to a poor diversification of the solutions. WCA has exhibited excellent convergence speeds and exploitation capability, hence it is necessary to strengthen its exploration capability to avoid premature convergence and then improve its overall performance. Having the fact of balancing the exploration and exploitation, the premature problem could be resolved under the hypothesis that: part of the solutions preserve the local searching ability while the remaining solutions adopt more effective global searching strategies. Based on this hypothesis, a hierarchical learning strategy is introduced into WCA and Hierarchical Learning Water Cycle Algorithm (HLWCA) is proposed in this paper. The main contributions of this paper can be summarized as follows: 1) For the solutions categorized as river ( s), they adopt new updating concept to enhance their exploration trend; 2) For the solutions categorized as stream ( s), according to the characteristics of their guider , they use an active target choosing strategy to determine whether to learn from their corresponding , and this helps to increase the global searching abilities while reduces some unnecessary searching. In order to validate the performance of HLWCA, the CEC 2017 benchmark suite is used to verify the numerical optimization performance. An extensive comparison to other algorithms is also made. Then four well-known engineering and mechanical benchmark problems are tested to prove its ability for solving real-world tasks. The rest of this paper is organized as follows: In Section 2, there is a brief review of WCA. In Section 3, the proposed HLWCA is discussed in detail. The experimental study and the result discussions are presented in Section 4. Finally, the conclusions are presented in Section 5.

2.Water Cycle Algorithm

WCA starts with an initial population which has

randomly generated solutions in the searching space.

urn a

lP

The best solution of the population is chosen as the sea, and is represented as ; according to a user defined parameter ( ), a number of good solutions whose function values are close to that of are considered as rivers ( s); then, the rest solutions are classified as streams ( s), and a or the will be the exemplar of a . Therefore, the WCA has three different types of solutions: , which is the best solution in each iteration; s, which are suboptimal solutions and can be seen as local optima; and s, which learn from the or in each iteration. A schematic view of the solutions in WCA is shown in Fig. 1.

Jo

Fig. 1. The solution matrix structure of WCA

Depending on the initial function values of each and . The number of assigned

and , different number of s is calculated as follows:

1 Cn  f  n   f  X stream 



NS  n   round Cn



N sr i 1

s are assigned to

(1)



Ci   N pop  N sr 

(2)

where , is the function value. As can be seen in nature, streams often firstly flow to rivers and then rivers flow to the sea. But there still are

Journal Pre-proof some streams that may flow directly to the sea. Therefore, the new position for as follows:

s and

X river  t  1  X river  t   rand  c   X sea t   X river t  

s can be given

(3)

X stream  t  1  X stream  t   rand  c   X river t   X stream t  

(4)

X stream  t  1  X stream  t   rand  c   X sea t   X stream t  

(5)

pro of

where rand is a uniformly distributed random number between 0 and 1, c is a positive constant between 1 and 2 (usually set as 2). If a has better function value than its exemplar , the exchanges its position and function value with the . Similar exchanges can also happen between and , and between and . Fig. 2 illustrates a schematic view of the updating process of WCA, where diamond, stars and circles represent , s and s, respectively. The solid shapes represent the old positions while the hollow ones represent the new positions.

re-

new position for river

lP

sea

new position for stream

Fig. 2. A schematic view of the updating process in the WCA [23]

urn a

To enhance the searching ability of WCA, a regeneration process called evaporation is defined. If the distance between and is smaller than a predefined value ( ), the position of and its guided s will be randomly regenerated using the following equation:

X new  t  1  LB  rand  UB  LB 

Evaporation process also happens on s that directly learn from . The new position of will be defined as follows:

if the distance is also smaller than

(7)

Jo

new X stream  t  1  X sea  t     randn 1, D 

(6)

where is a coefficient defining the searching region near the . A suitable value is set to 0.1. In fact, the regeneration of s that directly learn from improves the exploitation capability of WCA, and helps to obtain more accurate results. The regeneration of and its guided s improves the exploration capability and helps to escape local optima. controls the search accuracy around and a gradually reduced value can encourage WCA to get more precise results. The value of decreases as follows:

d max  t  1  d max  t  

d max  t  Max.iteration

(8)

Journal Pre-proof where is the maximum number of the iterations. In summary, a notable feature of WCA is the category of solutions and it allows different types of solutions to undertake different searching tasks Algorithm 1 WCA

12: 13: 14: 15: 16: 17: 18: 19:

and

generate solutions randomly and calculate their function values sort the solutions in ascending order and categorize them into

(

); ,

calculate using (2) assign to corresponding or ; /*main loop*/ while termination condition is not satisfied do /*updating process*/ update s guided by s using (4) and check whether improve; update s guided by using (5) and check whether improves; update s using (3) and check whether improves; /*evaporation process*/ check whether s and its guided s need regeneration check whether s guided by need regeneration /*coefficient update*/ update using (8) end while output

pro of

11:

) and the sum of

and

s

re-

5: 6: 7: 8: 9: 10:

/*initialization*/ set population number (

lP

1: 2: 3: 4:

urn a

3.Hierarchical Learning Water Cycle Algorithm

Same as the original WCA, the HLWCA starts with randomly generated solutions in the searching space. Then, their function values are calculated and sorted in an ascending sequence. The top solutions are selected and put in the collection . According to Eq.(2), the is calculated. Afterwards, the rest solutions are divided into collections and each collection (without special mention, the i in the following content always represents ) contains solutions. As shown in Fig. 3, all the solutions are divided into collections. Obviously, the in original WCA corresponds to and the s correspond to in , respectively. Similarly, the s in WCA correspond to the solutions in each . And, the solution is

Jo

the behavior exemplar of solutions in . In other words, is the collection used to store behavior exemplars. At this point, the initialization is completed, and the proposed algorithm begins the main loop.

pro of

Journal Pre-proof

Fig. 3. The solution matrix structure of HLWCA

At each iteration, the solutions in will be sorted in an ascend sequence according to the function values. Based on the new order, the collections will also reorganize their sequence orders. It should be noticed that the collections only change their index orders, but the solutions in each collection remain unchanged. Then, the solutions in update their positions. For a solution in collection , it updates its position using the following equations:

re-

xi,dj  t  1  xi,dj  t   xi,dj  t  1

 

lP

r  c  x0, di   t   xi,dj  t   xi,dj  t  1   d  d  r  c  x0,k  t   xi , j  t   is the jth solution in collection of the kth solution in level

 

,

, k is a random integer between

pi , j  Pi

(9)

(10)

else represents the dth dimensional element

,

is a random number between

urn a

and is used to determine whether the solution learns from its own exemplar, is a random number between , c is a positive constant between . In the Eq.(15), is an adaptive parameter and this parameter is also the quality coefficient of the exemplar. The quality coefficient of each solution in is calculated as follows.

Di  X 0,i  X 0,1

Jo

di 

Di max  D 

costi  f  X 0,i   f  X 0,1 

ci 

costi max  cost 

 1  Pi   di  e  ci   2

(11)

(12)

(13)

(14)

i 1 (15)

else

Journal Pre-proof where is the norm distance, is the function value. This quality coefficient is used as a selection probability threshold for the solutions in to actively judge whether learns from the behavior exemplar or other better exemplar. As can be seen from (13), the quality

pro of

coefficient consists of two components. The first component calculates the relative distance to the best solution while the second component calculates the relative function value discrepancy to the best solution. This coefficient increases as the distance increases and decreases as the function value discrepancy increases. Thus, for the solution in which has the better value and is also far away from the best solution, it will have a larger quality coefficient value. The larger this quality coefficient is, the more worthwhile the neighborhood of this solution is. Through this adaptive parameter and the active target choosing strategy, the solution in can increase the global searching ability and avoid some unnecessary search around the less desirable solution. After the solutions in update their positions, there is a renew process for the exemplars. The best solution in collection will compare with the collection exemplar . If that solution has better function value than ,

re-

they will exchange their position and function value information. When solutions in all the collections except have updated the positions, the solutions in will be updated. Since the solutions in are the behavior exemplars of the rest solutions, so they largely determine the overall performance of the algorithm. If they have an exploitation tendency, the algorithm will behave more exploitatively and vice versa. In order to balance the exploration and exploitation tendency, a different updating mechanism is introduced. For the solutions in , they no longer learn from a fixed global best as the original WCA does ( always learn from ). These solutions will choose any better solutions in . Each solution in updates its position using the following equations:

x0, di   t  1  x0, di   t   x0, di   t  1

lP

(16)

x0, di   t  1  r1  x0, di   t   r2  li  t   r3  mi  t 

(17)

li  t   x0, dk  t   x0, di   t 

(18)

d 

where

,

,

urn a

mi  t   x 0

 t   x0, di  t 

are random number between 0 and 1,

value of . Besides, unlike the original WCA that

(19) ,

is the mean behavior

s always accept updates, the solution

in

only accepts

effecting update which means that it only accepts the better result:

where

Jo

 X 0,i  t  1 X 0,i  t  1    X 0,i  t 

function value. After all the solutions in best will exchange with .

f  X 0,i  t  1   f  X 0,i  t   else

represents the ith solution in collection

,

(20)

is the corresponding

have updated, a reselection of best within the collection is carried out. The new

The main process of HLWCA is illustrated as Fig. 4 and the pseudocode is presented in Algorithm 2. As is shown in the figure and pseudocode, the regeneration process used in WCA is also retained in our algorithm. It can

Journal Pre-proof

Jo

urn a

lP

re-

pro of

be seen that there is an obvious hierarchy difference within the collections: has higher level than others. This difference comes from the categorization of the solutions and it decides that the solutions in are the exemplars. These exemplar solutions learn within the collection and never learn from solutions in low level collections. Besides, for the rest collections, there also are slight hierarchy differences between them. At each iteration, collections reorganize according to the function values of their exemplars. As i increases, the function value of the exemplar of each collection becomes bigger. Furthermore, since the adaptive controls the sum number that the solutions in learn from their corresponding exemplar, the solutions in will have more possibilities to learn from other exemplars as i increases. At the same time, as i increases, the available exemplars for each collection will also increase. In short, as i increases, the solutions in can learn from more exemplars and the overall exploring ability of is also enhanced.

Fig. 4. Flowchart of the proposed HLWCA

Algorithm 2 HLWCA 1: 2: 3: 4: 5: 6:

/*initialization*/ set solution number (

) and the size of

(

);

generate solutions randomly and calculate their function values sort all solutions and place the top into calculate using (2) divide the rest solutions into collections;

Journal Pre-proof

/* regeneration process */ check whether the solutions in check whether the solutions in /* coefficient updating */ update using (8) end while output

pro of

/*main loop*/ while termination condition is not satisfied do /* sort and reorganization*/ sort the solutions in according to function values reorganize the other collections /*solution updating*/ calculate the quality coefficient of solutions in using (13) update the solutions in each collection using (14) renew all exemplars update the solutions in using (16) reselect the best solution in as need regeneration need regeneration

re-

7: 8: 9: 10: 11: 12: 13: 14: 15: 16: 17: 18: 19: 20: 21: 22: 23: 24:

In order to verify the efficiency of the ability to obtain global optimum, we implement the following experiment. In this experiment, the exemplar solutions in always accept the updates in each iteration and we nominate this modified version as counterpart algorithm. The test function used here is the Function 10 of the CEC benchmark suite. The dimensionality is set to 2 for easy visualization. The parameters are set to , ,

lP

, , the maximum function evaluation is set to 20000. From the contour of the function, we can figure out that this function has plenty of local optima and is hard to find the true global optimum. In the figures, the current best solution is represented by the red hexagon; the exemplar solutions are represented by blue circles; the rest solutions are represented by black dots. As can be seen from Fig. 5, the solutions quickly converge to after finding a local optimum. Although there are attempts of escaping from the local optima, it

urn a

does not work well. The diversification of the whole solutions is severely undermined. On the contrary, when the exemplar solutions only accept better results, the diversification of the solutions was maintained during different optimization stages. As can be seen from Fig. 6, HLWCA missed the global optimum at the early stage. However, due to the preservation of the solutions’ diversification, the algorithm adaptively adjusted the searching process and finally found the true optimum. At the same time, it can be seen in Fig. 6 that during the iterations the amount of solutions around each exemplar is different. Because the best solution was changed, the P values, which were

Jo

calculated by Eqs. 15, were also changed and these P values adaptively controlled the solution whether to learn from its corresponding exemplar or other better exemplars. Table 1 is the statistical results of 51 runs and the results prove that the better result acceptance strategy used for the exemplar solution is effective.

lP

re-

pro of

Journal Pre-proof

Fig. 5. Position of solutions in counterpart algorithm at different stages (red hexagon is the best solution; blue circles are the exemplar

Jo

urn a

solutions; black dots are the rest solutions)

lP

re-

pro of

Journal Pre-proof

Fig. 6. Position of solutions in HLWCA at different stages (red hexagon is the best solution; blue circles are the exemplar solutions; black dots are the rest solutions)

Table 1

urn a

Experimental results on 2D Function 10 Counterpart

HLWCA

Algorithm

Mean

7.402E-02

2.247E+00

Std.

1.828E-01

5.375E+00

Min

0.000E+00

0.000E+00

Max

6.243E-01

1.707E+01

Median

0.000E+00

3.122E-01

Jo

4. Experiments and discussion 4.1 Experimental setup

The numerical optimization performance of the proposed HLWCA is evaluated by using IEEE CEC 2017 benchmark problems suite. Then, in order to examine the performance of HLWCA in solving the real-world engineering problems, four widely-used engineering problems are involved. All experiments are implemented using MATLAB R2018a on a PC with an Intel CPU (2.8GHz) and 8GB RAM. In this section, the original WCA [15] and its most frequently used variant Evaporation based Water Cycle Algorithm (ERWCA) [16] are chosen to compare with the proposed HLWCA. Besides, two up-to-date

Journal Pre-proof metaheuristic algorithms, Weighted Differential Evolution (WDE) [33] and Hybrid Firefly and Particle Swarm Optimization (HFPSO) [34], are added for comparison. These four algorithms will be compared comprehensively with HLWCA on both the CEC suite and engineering problems. Besides, another five algorithms are added to further discuss the numerical optimization ability. Since HLWCA does not introduce additional predefined control parameters, the parameter settings will be the same as WCA and ERWCA. The selection of parameters for each algorithm is based on the recommended values in the original paper and the settings of the parameters for all algorithms are summarized in Table 2. Table 2

Algorithm

Literature

pro of

The parameter settings of the algorithms Parameter settings

HLWCA

,

,

In Section 4.2 WCA

[15]

,

,

ERWCA

[16]

,

,

WDE

[33]

,

HFPSO

[34]

,

,

[39]

SSO

[40]

,

GWO

[41]

,

,

WOA

[42]

,

,

SHO

[43]

,

,

,

,

,

,

,

,

,

,

,

,

,

lP

,

,

4.2 Tests on CEC Benchmark suite

,

re-

CMA-ES

,

,

,

In Section 4.7

,

,

,

Jo

urn a

The CEC suite with 10 and 30 variables are involved in this section, and the details of the suite are given in the appendix. Each problem is conducted for 51 independent times by using different initial solutions in each trial. The termination criterion is met when the maximum number of function evaluations reaches . The discrepancy values ( ) between the true optimum and the obtained result in 51 runs are recorded. Then the mean value and standard variance (Std.) for each function are presented. The best result of each problem is highlighted in bold. Results obtained by HLWCA and comparison algorithms are pairwisely compared using two-tailed Wilcoxon signed-rank test with a significance level . If HLWCA is statistically significantly better than another algorithm, a ‘+’ is labeled at the end of the result obtained by the corresponding algorithm. On the contrary, if an algorithm outperforms the proposed HLWCA, a ‘−’ is labeled. If there is no significant difference between HLWCA and the compared algorithm, a ‘=’ is labeled. A summary of statistical comparison is given at the bottom of each table. Table 3 Experimental results of WCA, ERWCA, WDE, HFPSO and HLWCA on 10D suite

F1 F3 F4

WCA

ERWCA

WDE

HFPSO

Mean

3.43E+03

3.40E+03

1.99E+07

2.75E+03

Std.

3.39E+03

3.36E+03

9.65E+06

3.82E+03

Mean

0.00E+00

0.00E+00

3.84E+03

0.00E+00

Std.

0.00E+00

0.00E+00

1.34E+03

0.00E+00

Mean

1.32E-01

6.31E-02

9.69E+00

9.60E-01

Std.

2.77E-01

+ = −

2.43E-02

+ = −

2.63E+00

+ + +

3.37E-01

HLWCA =

9.02E+02 1.01E+03

=

0.00E+00 0.00E+00



1.53E+00 2.44E-01

Journal Pre-proof

F8 F9 F10 F11 F12 F13 F14 F15 F16 F17 F18 F19 F20

1.06E+01

3.84E+00

5.07E+00

Mean

1.61E+00

1.19E+00

7.58E+00

1.36E-01

Std.

2.83E+00

1.99E+00

1.84E+00

3.84E-01

Mean

4.05E+01

4.35E+01

4.09E+01

1.77E+01

Std.

1.38E+01

1.42E+01

6.34E+00

3.88E+00

Mean

2.75E+01

2.53E+01

2.18E+01

9.56E+00

Std.

9.69E+00

Mean

2.72E+00

Std.

1.26E+01

2.94E+01

3.69E+01

Mean

8.60E+02

7.65E+02

5.91E+02

Std.

2.60E+02

2.89E+02

1.27E+02

1.81E+02

Mean

2.33E+01

1.87E+01

2.46E+01

6.89E+00

Std.

2.03E+01

1.37E+01

5.96E+00

3.74E+00

Mean

1.42E+04

1.67E+04

2.48E+05

1.30E+04

Std.

1.40E+04

1.43E+04

1.40E+05

8.98E+03

Mean

3.11E+03

2.38E+03

2.44E+02

4.72E+03

Std.

6.76E+03

Mean

8.28E+01

Std.

3.08E+01

3.40E+01

5.45E+00

2.20E+01

Mean

7.63E+01

8.97E+01

3.89E+01

5.23E+01

Std.

6.37E+01

6.22E+01

Mean

1.16E+02

1.09E+02

Std.

1.21E+02

Mean

5.78E+01

Std.

3.51E+01

2.91E+01

Mean

4.73E+03

7.18E+03

Std.

1.03E+04

Mean

5.78E+01

Std.

4.87E+01

Mean

4.36E+01

Std.

2.24E+01

Mean

1.90E+02

Std.

5.85E+01

+ + + +

2.55E+01

F22 F23 F24 F25

F27 F28 F29 F30

+/=/−

+ + +

+ + + = =

6.07E+00

+ +

8.29E+01

9.82E+01 + −

5.24E+01

1.20E+04

+ + +

4.95E+01

+ + + +

3.88E+00 + + + + +

4.65E+03 +

2.09E+01

8.42E+01

+ +

+

0.00E+00

= +

+ + + −

3.46E+01

4.57E+02

+ +

5.72E+01

9.96E+00

4.07E+01

2.25E+01

1.97E+02

+

3.39E+01

+

4.36E+01

4.23E+00

2.48E+01

8.90E+02

9.69E+03

=

4.89E+02

1.12E+04

1.67E+01

2.05E+02

+

5.48E+00

5.14E+02

3.74E+01

5.43E+01

+

2.58E+01

6.45E+00

5.90E+01

1.73E+02

1.16E+02

1.77E+02

+

1.51E+01

2.34E+01

1.39E+01

1.20E+01

Mean

3.25E+02

3.24E+02

3.14E+02

3.21E+02

Std.

8.75E+00

8.97E+00

3.86E+01

9.18E+00

Mean

3.21E+02

3.30E+02

1.78E+02

3.12E+02

Std.

9.62E+01

Mean

4.27E+02

Std.

2.39E+01

2.26E+01

4.09E+01

3.08E+01

Mean

3.50E+02

3.66E+02

3.60E+02

2.89E+02

Std.

7.60E+01

+

+ + +

7.71E+01

= +

4.32E+02

7.57E+01

− + −

+

3.98E+02

1.02E+02

− +

4.23E+02

5.90E+01

7.38E+01

7.01E+01

3.95E+02

3.99E+02

3.98E+02

Mean

3.94E+02

Std.

3.49E+00

8.25E+00

2.64E+00

1.27E+01

Mean

3.86E+02

4.45E+02

3.61E+02

4.62E+02

Std.

1.11E+02

1.31E+02

5.11E+01

1.44E+02

Mean

3.12E+02

3.26E+02

2.79E+02

2.92E+02

Std.

6.24E+01

Mean

3.34E+05

Std.

4.55E+05

4.55E+05

2.22E+04

4.22E+05

22/5/2

24/4/1

22/3/4

22/4/3

+ = +

+ = +

5.74E+01 +

3.40E+05

+ = +

1.06E+01 +

1.37E+01 1.11E+00

+

3.76E+00 1.45E+00

=

0.00E+00 0.00E+00

2.35E+04

+ +

2.30E+05

1.53E+00 1.37E+00

+

1.63E+04 3.08E+04

+

8.95E+02 1.24E+03

+

2.61E+01 8.67E+00

+

1.97E+01 1.29E+01

+

2.11E+00 2.56E+00

+

1.69E+01 9.37E+00

+

1.05E+03 8.91E+02

+

2.08E+01 3.15E+01

+

1.85E+00 4.69E+00

+

1.50E+02 5.25E+01

+

9.62E+01 1.60E+01

+

3.00E+02 4.29E+01

+

3.24E+02 4.58E+01



4.26E+02 2.34E+01



3.17E+02 3.04E+01

+

3.90E+02 1.17E+00

=

4.30E+02 1.42E+02

+

4.18E+01 +

3.17E+02 1.10E+02

8.83E+01

1.99E+01 =

+

5.46E+01

9.89E+01

+

9.41E+01

=

1.61E+01

Std.

+

0.00E+00 0.00E+00

1.25E+02

4.43E+01

6.09E+01

+

5.28E+03

4.53E+01

+

4.22E+00 1.80E+00

0.00E+00

1.05E+01

+

+

4.08E+00

1.04E+02 +

1.24E+01

Mean

Jo

F26

+

1.10E+01

urn a

F21

9.78E+00

pro of

F7

2.27E+01

Std.

re-

F6

Mean

lP

F5

2.47E+02 8.82E+00

+

7.09E+04 2.21E+05

In Table 3, the mean values of the results obtained by five algorithms on 10 dimensional CEC problems are

Journal Pre-proof

Jo

urn a

lP

re-

pro of

shown. According to table, the proposed HLWCA can find the best average values in 15 out of 29 10D problems. Besides, HLWCA can get the comparative results as other algorithms for F3 and F9. WDE performs the best in 9 out of 29 (F13, F18, F19, F21, F22, F24, F25, F28, and F30). HFPSO behaves better than other algorithms in F12 and F26. Compared to the original WCA and its variant ERWCA, our proposed algorithm greatly improves the average performance (the original WCA gets best result for only F4). Further analysis of the experimental results shows that HLWCA is suitable for solving the 10D unimodal and simple multimodal functions. For the 10D hybrid functions, HLWCA obtains 6/10 best results and is still a competitive algorithm to solve these problems. On the contrary, for the 10D composition functions, WDE has better performance than HLWCA. The two-tailed Wilcoxon signed-rank test results are listed in the end of each column of the compared algorithm. It can be seen that, although some algorithms get better average values, Wilcoxon tests show that our proposed HLWCA actually has equivalent or better statistical results. As Table 3 states, HLWCA performs better than WCA, ERWCA, WDE, HFPSO on 23, 24, 21, and 22 problems respectively. In Fig. 7, the statistical results obtained by the five algorithms are shown in boxplots. Each box contains the median, minimum, maximum and upper and lower quartiles of the 51 independent run results. It should be noticed that the boxplot of WDE on F1 is not given, since the result on F1 is much larger than that of other algorithms. Boxplots indicate that HLWCA has lower standard deviation values on most of the 10D problems, which means it is very robust. From the boxplots, it can be seen that HLWCA achieves much better results than those of the compared algorithms in F3, F16 and F20. For F17 and F29, although the boxplots show that HLWCA is not the most robust algorithm, it can still find the smallest medians of these problems. In the 10D problems, HLWCA can get satisfactory results except F21 and F28. In Table 4, the average values and Std. on 30D CEC suite are given. According to the results, the proposed HLWCA can find the lowest average output values in 24 out of 29 30D problems. Comparing the experimental results in Table 2 and Table 3, it can be seen that the performance of HLWCA for solving 30D problems is obviously better than that of 10D problems. When dealing with 10D problems, HLWCA can solve the unimodal, simple multimodal and hybrid functions effectively, but cannot obtain the best results on the composition functions. When dealing with 30D problems, HLWCA has a good overall performance especially for solving the composition functions. According to the Wilcoxon test results, it can be seen that HLWCA only performs a little worse in F14 and F18. The results of HLWCA and compared algorithms were given in (+/=/−) format: WCA (26/0/3), ERWCA (24/2/3), WDE (26/1/2), HFPSO (24/3/2). In Fig. 8, the boxplots of 30D results are shown. Boxplots indicate that HLWCA get both the lower median and Std. values on most of the 30D problems, except for F4, F10, F14, and F18. In summary, according to the experimental results and boxplots in both 10D and 30D problems, it can be concluded that the performance of HLWCA becomes better as the dimension increases. Fig. 9 records the convergence process of the five algorithms on Function 10 and 30 of the CEC suite respectively. From the convergence curves of Function 10, it can be seen that our algorithm can get the smaller results on 10D while HFPSO outperforms a little on 30D. Unlike WCA and ERWCA, whose convergence curves drop sharply and then remain steady, the curves of HLWCA continue to decline during the whole process. For the Function 30, the WDE performs the best out of five algorithms on 10D. But when dimension increases to 30, the performance of WDE deteriorates dramatically. HLWCA performs quite well on both dimensions.

Jo

urn a

lP

re-

pro of

Journal Pre-proof

Fig. 7a. Boxplots of 10D CEC 2017 problems

lP

re-

pro of

Journal Pre-proof

urn a

Fig. 7b. Boxplots of 10D CEC 2017 problems

Table 4 Experimental results of WCA, ERWCA, WDE, HFPSO and HLWCA on 30D suite

F1 F3

ERWCA

WDE

HFPSO

6.28E+03

5.65E+03

1.41E+08

3.50E+03

Std.

6.06E+03

6.19E+03

5.71E+07

4.13E+03

Mean

0.00E+00

3.92E-07

5.10E+04

0.00E+00

Std.

0.00E+00

Mean

5.57E+01

Std.

2.64E+01

2.98E+01

1.98E+01

1.56E+01

Mean

1.41E+02

1.48E+02

1.33E+02

8.37E+01

Std.

4.01E+01

4.21E+01

1.48E+01

2.62E+01

Mean

3.05E+01

2.80E+01

2.09E+01

8.61E-01

Std.

1.31E+01

9.10E+00

2.86E+00

1.14E+00

Mean

2.70E+02

2.62E+02

1.72E+02

9.88E+01

Std.

6.41E+01

6.44E+01

1.90E+01

1.83E+01

Mean

1.50E+02

1.52E+02

1.23E+02

8.17E+01

Std.

4.23E+01

3.74E+01

1.47E+01

2.29E+01

Mean

3.79E+03

4.10E+03

2.34E+03

6.64E+01

Std.

1.36E+03

Jo

F4

WCA

Mean

F5 F6 F7 F8 F9

+ =

= =

2.80E-06

= + + + + +

4.98E+01

2.08E+03

+ +

6.82E+03 = + + + + +

2.26E+02

4.32E+02

HLWCA =

2.47E+03 =

+ + + + +

7.47E+01

3.48E+02

0.00E+00 0.00E+00

0.00E+00 +

2.41E+03

+

4.63E+01 3.14E+01

+

3.56E+01 7.30E+00

+

6.75E-03 1.37E-02

+

6.65E+01 9.19E+00

+

3.16E+01 7.53E+00

=

2.36E+01 3.95E+01

Journal Pre-proof

F13 F14 F15 F16 F17 F18 F19 F20 F21 F22 F23 F24 F25

6.32E+02

2.87E+02

4.94E+02

Mean

2.46E+02

2.29E+02

2.46E+02

1.05E+02

Std.

8.40E+01

6.17E+01

3.00E+01

4.53E+01

Mean

1.20E+05

1.96E+05

1.15E+07

1.35E+05

Std.

1.64E+05

7.43E+05

3.29E+06

1.58E+05

Mean

2.69E+04

2.62E+04

2.28E+05

2.22E+04

Std.

2.34E+04

Mean

3.04E+03

Std.

3.73E+03

4.19E+03

7.02E+02

Mean

1.49E+04

1.66E+04

6.10E+03

Std.

1.37E+04

1.59E+04

2.53E+03

1.42E+04

Mean

1.00E+03

1.09E+03

8.39E+02

7.09E+02

Std.

2.99E+02

3.14E+02

1.28E+02

2.20E+02

Mean

5.33E+02

5.59E+02

2.01E+02

2.44E+02

Std.

2.03E+02

2.24E+02

4.24E+01

1.28E+02

Mean

4.24E+04

3.51E+04

6.91E+04

1.11E+05

Std.

3.70E+04

Mean

1.29E+04

Std.

1.62E+04

1.63E+04

5.28E+03

1.16E+04

Mean

4.95E+02

5.10E+02

2.67E+02

3.17E+02

Std.

1.63E+02

1.84E+02

Mean

3.38E+02

3.42E+02

Std.

3.83E+01

Mean

2.84E+03

Std.

1.96E+03

2.15E+03

Mean

5.01E+02

5.03E+02

Std.

4.49E+01

Mean

5.58E+02

Std.

3.95E+01

Mean

3.93E+02

Std.

1.24E+01

Mean

2.84E+03

Std.

8.06E+02

+ + + +

3.96E+03

F27 F28 F29 F30

+ + +

= + + + −

3.47E+03

+ +

1.26E+04

4.60E+01 + +

2.67E+03

4.47E+01

+ = +

5.57E+02

+ + + +

1.06E+05 = + + + −

1.29E+03

+ +

2.68E+03

+



5.65E+03

+ +

+ + + =

8.79E+03

1.37E+04

+ +

8.84E+03

6.54E+01

1.39E+02

3.16E+02

2.77E+02

+

7.02E+02

+

1.27E+03

1.26E+02

1.64E+03

4.96E+02

4.90E+02

+

2.17E+01

4.05E+01

5.93E+02

5.70E+02

+

1.65E+01

5.33E+01

4.75E+02

3.93E+02

+

1.66E+01

1.21E+01

1.50E+01

2.74E+03

1.44E+03

1.49E+03

+

2.45E+01

2.32E+01

9.39E+00

2.02E+01

Mean

3.77E+02

3.92E+02

5.82E+02

3.68E+02

Std.

6.30E+01

6.17E+01

2.29E+01

5.98E+01

Mean

1.13E+03

1.12E+03

8.66E+02

7.18E+02

Std.

2.31E+02

Mean

9.29E+03

Std.

5.82E+03

1.74E+04

6.99E+04

1.18E+04

24/4/1

23/5/1

26/2/1

22/6/1

+

+ + +

2.41E+02

+

1.19E+04

5.68E+02

+ + +

9.55E+01 +

+

2.56E+04 1.11E+04

=

1.16E+04 9.50E+03

+

3.13E+03 3.09E+03

1.81E+05

+

5.33E+02

+

1.02E+04

3.51E+02 1.84E+02

+

9.04E+01 5.18E+01

=

8.75E+04 4.44E+04

+

4.09E+03 4.17E+03

+

1.60E+02 6.66E+01

+

2.34E+02 7.14E+00

+

2.28E+02 6.40E+02

+

3.83E+02 9.34E+00

+

4.49E+02 6.84E+00

=

3.92E+02 1.27E+01

+

1.44E+03 2.37E+02

+

5.15E+02 9.53E+00

+

3.29E+02 5.18E+01

+

1.58E+02 +

1.86E+03 2.75E+03

8.62E+02

5.43E+02

+

5.41E+02

=

1.37E+02

Std.

+

6.33E+01 3.32E+01

2.15E+01

3.99E+02

8.85E+02

+

7.62E+04

4.68E+01

=

2.92E+03 3.60E+02

4.99E+03

3.09E+01

+



2.25E+04

2.11E+04

5.11E+04 +

4.09E+03

Mean

Jo

+/=/−

+

2.22E+04

urn a

F26

6.82E+02

pro of

F12

4.04E+03

Std.

re-

F11

Mean

lP

F10

5.79E+02 8.80E+01

+

4.21E+03 1.79E+03

Jo

urn a

lP

re-

pro of

Journal Pre-proof

Fig. 8a. Boxplots of 30D CEC 2017 problems

lP

re-

pro of

Journal Pre-proof

Jo

urn a

Fig. 8b. Boxplots of 30D CEC 2017 problems

re-

pro of

Journal Pre-proof

Fig. 9. Convergence curve on part of CEC 2017 functions

lP

4.3 Tests on engineering and mechanical optimization problems

urn a

In order to verify the performance of the proposed algorithm in dealing with realistic optimizations, four widely-used engineering and mechanical design benchmark problems are tested. Details of these problems are shown in appendix. The parameter settings are the same as listed in Table 2, the termination criterion is the maximum number of function evaluation ( ). The maximum, minimum, average values and the standard deviation are used as evaluation criteria to estimate the performance. Since most engineering and mechanical cases include constrains, it is necessary to find a suitable way to transfer the constrained problem into an unconstrained one. A general constrained minimization problem can be given as follows:

min s.t.

f  X

G  X  0

(21)

where

Jo

In this paper, the constraints are handled by adding the penalty to the objective function. After adding the penalty, the original constrained problem can be transformed into:

F  X  f  X  

is the original objective function and

(22)

is the penalty form of constraints. m

  p   max  gi  X  , 0 

(23)

i 1

where is a large constant number, the constraint function matrix.

is the total number of constraint functions,

is

Journal Pre-proof Table 5 Experimental results for engineering problems

Welded beam design

Std.

WCA

3003.8049

2995.4206

2994.4711

2.80E+00

ERWCA

3007.4366

2995.2759

2994.4711

2.82E+00

WDE

3019.1821

3009.8863

3002.7228

3.56E+00

HFPSO

2994.4731

2994.4712

2994.4711

2.82E-04

HLWCA

2994.4712

2994.4711

2994.4711

1.40E-05

WCA

7319.0016

6359.8689

5885.3334

4.90E+02

ERWCA

7319.0027

6353.1118

5885.3329

4.64E+02

WDE

9446.6607

7514.0315

6425.4416

6.96E+02

HFPSO

7317.1254

6255.0019

5885.7745

3.51E+02

HLWCA

6872.5138

6084.6065

5885.3328

2.23E+02

WCA

0.0177732

0.0135614

0.0126653

1.40E-03

ERWCA

0.0148135

0.0130375

0.0126653

5.18E-04

WDE

0.0142455

0.0131668

0.0127186

3.32E-04

HFPSO

0.0144124

0.0129937

0.012678

3.20E-04

HLWCA

0.013314

0.0127525

0.0126741

1.23E-04

WCA

2.1965246

1.7532607

1.7248539

8.97E-02

ERWCA

2.0785743

1.7398782

1.7248609

5.27E-02

WDE

2.0509218

1.8791529

1.7524406

6.27E-02

HFPSO

1.7261287

1.7251208

1.7249118

2.71E-04

HLWCA

1.9432522

1.7332474

1.7248679

3.52E-02

pro of

Tension /compression spring design

Min

re-

Vessel design

Ave

lP

Speed reducer design

Max

Jo

urn a

The first design problem is the speed reducer design problem [35]. From Table 5, it can be seen that except the WDE, other algorithms can obtain similar results, which are better than those of WDE. But HLWCA can get better overall performance. Besides, the robustness (measured by standard deviation) of our algorithm improves largely, compared to the original WCA (1.40E−0.5 vs. 2.80E+0.0). The second problem is the vessel design problem suggested by Kannan and Kramer [36]. In this problem, our algorithm behaves remarkably and obtains the best results on four evaluation criteria after executing the same 20000 function evaluations. The tension/compression spring problem, which is proposed by Arora [37], is the third problem. As can be seen from the summarized results, HLWCA performs well in three criteria except the minimum value. The average result of HLWCA improves nearly 2% compared to HFPSO. The welded beam design problem proposed by Coello [38] is the last one. It can be seen that HFPSO performs best in this case, achieving all the best except the minimum. But we should also notice that our algorithm is the second place of all the algorithms. 4.4 Computational complexity

The computational complexity of the proposed HLWCA mainly stems from two parts: the first is the solution sorting of collection and optima searching within the rest collections; the second is the position updating of all solutions. As mentioned above, a distinctive character of the proposed HLWCA is that it divides the whole solutions into i+1 collections. Only the solutions in collection need to be sorted in each iteration, and the other solutions merely need to find a best solution within each collection rather than obtain a sorted sequence. For the solutions

Journal Pre-proof in , the sorting behavior is a sequence sorting process, so the worst computational complexity to sort the solutions in is:

pro of

where is the solution number of collection . For the solutions in the rest collections, they only need to find the best one within each collection. Therefore, the total number of necessary comparison in each collection is:

where m is the sum number of solutions. So that, that total computational complexity of the rest collections is: Considering the worst case, the computational complexity of the first part should be:

The position updating of solutions in HLWCA is same as the learning mechanisms in most metaheuristic algorithms. The computational complexity of the position updating process is:

re-

where d is the dimension of optimization problem. 4.5 Time complexity

According to the guidelines of CEC benchmark suite, the computational time complexity of WCA and HLWCA is presented. The parameters denotes the running time of the following program:

lP

x=0.55; for i=1:1000000

x=x+x; x=x/2; x=x*x; x=sqrt(x); x=log(x); x=exp(x); x=x/(x+2);

urn a

end

is the average time to compute Function 18 in benchmark suite for 200000 times evaluation with a certain dimension only. represents the time for the algorithm to completely execute 200000 evaluations of the same dimensional Function 18. is evaluated for five times, and the mean value is denoted as the algorithm is estimated by . Table 6

. The complexity of

Time complexity for WCA and HLWCA

Jo

(/s)

10D 30D 50D

0.080

(/s)

(/s)

WCA

HLWCA

WCA

HLWCA

WCA

HLWCA

0.296

0.100

2.957

0.806

33.268

8.825

0.671

0.467

3.472

1.559

35.018

13.650

1.276

1.089

4.151

2.472

35.938

17.288

Journal Pre-proof WCA HLWCA

Algorithm complexity 40

30

20

pro of

10

0 10D

30D

50D

Fig. 10. Algorithm complexity of WCA and HLWCA

The time complexities of WCA and HLWCA are illustrated in Table 6. The parameter depicts the basic computation capability, depicts the time for computing the function while depicts the total time for executing the optimization. Through the difference of and , the time of the searching process can be depicted. The ratio of illustrates the efficiency of the searching process, the smaller the ratio the higher the

lP

re-

searching efficiency. From Table 6, it can be seen that the computing time of WCA is 3.67 times of HLWCA’s for 10 dimensional case. WCA’s is 122.7% more and 67.9% more than that of HLWCA in case of 30 and 50 dimension respectively. Besides, from the Fig. 10, in which the ratio of is plotted, it can be seen that although the discrepancy of algorithm complexity is shrinking as the dimension increasing, the time complexity of WCA is obviously higher than that of HLWCA. The reason is that WCA utilizes a real-time and serial operation which means WCA will update each solution and compare with other solutions orderly. On the contrary, HLWCA updates the position and function value one collection by another, and only need compare the best solution in a collection with the exemplar solution. Therefore, the number of comparisons is reduced and HLWCA has a lower time complexity than WCA. 4.6 Further comparison analysis and discussion

Jo

urn a

In this section, we further compare HLWCA with some other widely-used metaheuristics algorithm on the CEC benchmark suite. These algorithms include: the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) [39], Social Spider Optimizer (SSO) [40], Grey Wolf Optimizer (GWO) [41], Whale Optimization Algorithm (WOA) [42] and Selfish Herd Optimizer (SHO) [43]. The parameter settings are the same as the corresponding papers and the detail of parameters of each algorithm is summarized in Table 2. Same as section 4.2, the discrepancy values are recorded and the better results are marked in bold. As can be seen from the results in Table 7, our proposed algorithm still perform well on both 10D and 30D problems. Among them, HLWCA expresses an overwhelming advantage over GWO and WOA. For SHO, HLWCA also has very better performance of 26/3/0 and 25/4/0 on 10D and 30D, respectively. From Table 7 and Table 8, we can conclude that the advantage of CMA-ES is mainly from dealing with the hybrid functions, especially when the dimension increases to 30. Similarly, SSO gets better results on composition functions. But the performance of HLWCA is relatively balanced, which gets a slight advantage over both two algorithms. Table 7 Experimental results of CMA-ES, SSO, GWO, WOA, SHO and HLWCA on 10D suite

F1 F3 F4

CMA-ES

SSO

GWO

WOA

SHO

Mean

0.00E+00

2.59E+03

1.06E+07

2.64E+05

2.75E+03

Std.

0.00E+00

3.22E+03

5.31E+07

5.27E+05

2.98E+03

Mean

0.00E+00

5.05E+03

6.57E+02

2.97E+02

1.45E-03

Std.

0.00E+00

Mean

0.00E+00

− =

+ +

4.92E+03 −

3.20E+00

+ +

1.18E+03 +

1.05E+01

+ +

3.03E+02 +

2.12E+01

HLWCA +

1.01E+03 +

1.33E-03 +

3.27E+00

9.02E+02 0.00E+00 0.00E+00

+

1.53E+00

Journal Pre-proof

F9 F10 F11 F12 F13 F14 F15 F16 F17 F18 F19 F20 F21

3.71E+01

1.47E+00

1.31E+01

4.56E+01

2.57E+01

Std.

1.18E+01

1.74E+00

7.25E+00

1.89E+01

7.69E+00

Mean

1.36E+00

1.47E-01

5.63E-01

2.81E+01

6.68E+00

Std.

3.80E+00

5.58E-02

7.79E-01

1.26E+01

5.13E+00

Mean

3.06E+01

1.86E+01

2.70E+01

7.18E+01

3.72E+01

Std.

2.34E+01

2.62E+00

9.66E+00

2.53E+01

1.11E+01

Mean

1.53E+01

8.55E+00

1.10E+01

3.87E+01

1.93E+01

Std.

8.93E+00

1.71E+00

4.87E+00

1.42E+01

7.80E+00

Mean

3.93E-01

3.64E+00

2.65E+00

3.75E+02

3.76E+01

Std.

9.49E-01

1.95E+00

6.45E+00

Mean

8.82E+02

3.43E+02

4.32E+02

Std.

3.87E+02

Mean

7.84E+01

Std.

4.09E+01

3.63E+00

2.09E+01

Mean

5.22E+02

2.53E+04

4.32E+05

Std.

2.26E+02

7.25E+04

6.53E+05

Mean

4.28E+02

2.58E+03

9.61E+03

Std.

2.64E+02

3.97E+03

6.15E+03

Mean

1.16E+02

1.36E+03

8.66E+02

Std.

1.16E+02

2.61E+03

1.52E+03

Mean

6.94E+02

1.37E+03

1.60E+03

Std.

4.32E+03

2.90E+03

1.73E+03

2.21E+03

9.27E+02

Mean

2.87E+02

6.54E+00

7.90E+01

2.09E+02

2.22E+02

Std.

1.95E+02

7.10E+00

1.01E+02

1.19E+02

1.28E+02

Mean

1.03E+02

9.91E+00

4.34E+01

8.34E+01

6.00E+01

Std.

7.80E+01

4.57E+00

1.41E+01

3.99E+01

3.63E+01

Mean

9.99E+01

7.24E+03

2.46E+04

1.54E+04

7.09E+03

Std.

7.21E+01

Mean

3.65E+01

Std.

2.80E+01

3.87E+03

5.41E+03

5.09E+04

4.06E+03

Mean

2.66E+02

4.42E+00

6.36E+01

1.48E+02

7.35E+01

Std.

1.61E+02

2.05E+00

5.36E+01

6.46E+01

5.26E+01

Mean

2.20E+02

1.12E+02

1.97E+02

2.12E+02

1.97E+02

Std. F22

Mean Std.

F23

Mean Std.

F24

Mean Std.

F25

Mean Std.

F27 F28 F29 F30

+/=/−

+ + + + +

+ + + + + +

8.57E+01 + − − + + + + −

7.78E+00

+ +

2.57E+03

+ + + + +

2.33E+02 + = = + + + = +

6.76E+03 +

+

2.47E+01

+ =

4.34E+03

+ + + +

2.42E+02

7.45E+01

9.95E+02

8.31E+02

+

3.03E+02

+ + + + + + + +

1.57E+04

+

+

9.94E+01

+

+ +

3.16E+01

8.33E+01

1.77E+01

3.43E+06

1.30E+04

+

4.34E+06

1.23E+04

1.58E+04

5.89E+03

+

1.42E+04

5.99E+03

2.05E+02

5.28E+02

+

4.73E+02

6.49E+02

2.21E+03

6.76E+02

+ + + +

1.24E+04 +

1.97E+04

+ + +

2.78E+03

1.48E+01

4.17E+01

5.66E+01

5.78E+01

6.43E+01

1.05E+02

1.43E+02

1.07E+02



+

+

6.24E+02

2.92E+01

1.65E+01

2.18E+02

1.13E+01

3.21E+02

2.99E+02

3.16E+02

3.44E+02

3.43E+02

+

+

+

+

2.05E+01

6.88E+01

8.55E+00

1.78E+01

1.71E+01

3.47E+02

1.45E+02

3.34E+02

3.60E+02

3.02E+02

+



+

+

7.67E+00

3.16E+01

4.54E+01

6.80E+01

1.18E+02

4.34E+02

3.09E+02

4.36E+02

4.26E+02

4.30E+02

=



=

+

2.02E+01

9.18E+01

1.55E+01

6.10E+01

2.26E+01

5.75E+02

1.82E+02

3.58E+02

6.05E+02

3.59E+02

+



+

1.82E+02

+

Std.

4.46E+02

Mean

4.16E+02

Std.

2.98E+01

1.36E+00

8.63E+00

3.14E+01

1.60E+01

Mean

6.69E+02

4.20E+02

5.33E+02

5.17E+02

4.70E+02

Std.

1.25E+02

5.30E+01

1.05E+02

1.60E+02

1.52E+02

Mean

3.58E+02

2.70E+02

2.78E+02

4.27E+02

3.37E+02

Std.

9.90E+01

1.90E+01

2.94E+01

8.89E+01

7.04E+01

Mean

1.70E+05

2.94E+03

5.41E+05

4.85E+05

1.63E+05

Std.

3.80E+05

3.56E+03

7.38E+05

6.39E+05

3.66E+05

20/3/6

18/5/6

28/1/0

29/0/0

26/3/0

1.13E+02 + + + −

3.76E+02

− = + −

3.95E+02

3.75E+02 + + + +

4.22E+00 1.80E+00

+

0.00E+00 0.00E+00

+

1.37E+01 1.11E+00

+

3.76E+00 1.45E+00

+

0.00E+00

4.22E+02

0.00E+00

+ +

+ + +

4.02E+02

1.53E+00 1.37E+00

=

1.63E+04 3.08E+04

+

8.95E+02 1.24E+03

+

2.61E+01 8.67E+00

+

1.97E+01 1.29E+01

+

2.11E+00 2.56E+00

+

1.69E+01 9.37E+00

+

1.05E+03 8.91E+02

+

2.08E+01 3.15E+01

+

1.85E+00 4.69E+00

+

1.50E+02 5.25E+01

+

9.62E+01 1.60E+01

+

3.00E+02 4.29E+01

+

3.24E+02 4.58E+01

=

4.26E+02 2.34E+01

+

1.12E+02 +

3.17E+02 1.10E+02

6.31E+03

4.89E+02

=

+

3.06E+02

7.05E+00

Jo

F26

Mean

+

2.44E-01

pro of

F8

1.14E+01

8.68E+00

re-

F7

1.77E+00

1.65E+01

lP

F6

0.00E+00

Mean

urn a

F5

Std.

3.17E+02 3.04E+01

+

3.90E+02 1.17E+00

=

4.30E+02 1.42E+02

+

2.47E+02 8.82E+00

+

7.09E+04 2.21E+05

Journal Pre-proof

Table 8 Experimental results of CMA-ES, SSO, GWO, WOA, SHO and HLWCA on 30D suite

F6 F7 F8 F9 F10 F11 F12 F13 F14 F15 F16 F17

SHO

1.92E+06

3.65E+03

Std.

0.00E+00

1.03E+02

7.76E+08

1.28E+06

5.27E+03

Mean

8.10E+03

8.00E+04

2.85E+04

1.51E+05

7.56E+02

Std.

4.24E+04

2.79E+04

7.04E+03

5.99E+04

7.52E+02

Mean

5.35E+01

6.10E+01

1.53E+02

1.50E+02

7.98E+01

Std.

1.92E+01

1.76E+01

5.85E+01

Mean

5.62E+01

7.94E+01

8.40E+01

Std.

1.58E+01

9.68E+00

2.89E+01

Mean

3.70E+00

1.37E-03

4.48E+00

Std.

3.46E+00

3.59E-04

2.76E+00

Mean

9.84E+01

1.09E+02

1.34E+02

Std.

7.08E+01

7.67E+00

3.55E+01

Mean

5.71E+01

8.25E+01

7.17E+01

Std.

1.56E+01

9.55E+00

2.52E+01

Mean

1.41E+00

1.34E+03

4.67E+02

Std.

2.04E+00

Mean

3.95E+03

Std.

1.20E+03

2.54E+02

7.97E+02

Mean

1.72E+02

7.22E+01

3.36E+02

Std.

6.23E+01

1.82E+01

3.68E+02

9.34E+01

3.45E+01

Mean

1.46E+03

1.14E+06

4.67E+07

4.11E+07

3.21E+05

Std.

3.54E+02

4.62E+05

5.89E+07

2.91E+07

1.55E+05

Mean

2.28E+03

8.09E+03

1.01E+06

1.47E+05

2.23E+04

Std.

1.06E+03

4.33E+03

4.50E+06

1.18E+05

1.23E+04

Mean

2.03E+02

1.70E+04

1.56E+05

7.57E+05

6.86E+03

Std.

5.85E+01

2.22E+04

2.71E+05

8.17E+05

5.50E+03

Mean

3.08E+02

9.22E+02

1.66E+05

7.05E+04

2.28E+03

Std.

1.38E+02

9.63E+02

5.46E+05

4.30E+04

2.69E+03

Mean

7.32E+02

5.12E+02

7.26E+02

1.84E+03

1.25E+03

Std.

2.30E+02

1.20E+02

1.95E+02

4.13E+02

2.85E+02

3.61E+02

1.29E+02

2.60E+02

8.15E+02

5.89E+02

Mean Std.

F18

Mean Std.

F19

Mean Std.

F20

Mean Std.

F21

Mean

F22 F23 F24 F25 F26

= = + + + + −

− + + + = + + +

5.19E+02 + + − − − − + +

1.95E+02

2.30E+02

3.70E+03

1.56E+05

+ +

+ = + =

2.80E+03

+ = + + +

+ + + + +

+ +

2.66E+02

= + + + + + + +

+

1.69E+02

5.91E+01

3.03E+01

6.52E+01

4.26E+01

+

8.63E+00

6.39E+00

5.27E+02

3.07E+02

+

9.62E+01

5.21E+01

2.07E+02

1.18E+02

+

4.85E+01

1.97E+01

7.28E+03

2.82E+03

+

5.17E+03

+

4.03E+03

7.85E+02

7.36E+02

3.55E+02

1.14E+02

+ + + + + + +

2.61E+06

+

7.92E+04

3.93E+05

2.59E+06

4.91E+04

2.16E+02

9.71E+02

2.01E+06

2.49E+06

4.37E+03



+

+

8.85E+01

2.10E+03

6.51E+06

1.84E+06

3.70E+03

6.89E+02

1.79E+02

3.09E+02

7.24E+02

5.82E+02

+

=

+

+

2.91E+02

4.93E+01

1.27E+02

1.69E+02

1.97E+02

2.62E+02

2.61E+02

2.77E+02

4.48E+02

3.60E+02

+

+

+

+

1.41E+01

4.53E+01

2.83E+01

6.68E+01

2.96E+01

Mean

3.70E+03

2.18E+02

2.05E+03

4.24E+03

4.21E+02

Std.

6.95E+02

3.19E+01

1.51E+03

2.04E+03

1.12E+03

Mean

4.08E+02

4.41E+02

4.36E+02

7.47E+02

7.14E+02

Std.

1.52E+01

1.11E+01

3.49E+01

9.53E+01

9.32E+01

Mean

4.74E+02

5.45E+02

5.17E+02

7.64E+02

7.61E+02

Std.

1.68E+01

6.60E+01

5.57E+01

8.06E+01

1.01E+02

Mean

3.87E+02

3.79E+02

4.57E+02

4.53E+02

4.11E+02

Std.

7.04E-01

Mean

1.59E+03

Std.

2.24E+02

+ + + −

+ + + −

1.17E+03 2.47E+02

+ + +

2.12E+01

3.10E-01 +

+



1.84E+03 3.46E+02

+ + + +

3.31E+01 +

+

0.00E+00 0.00E+00

+

4.63E+01 3.14E+01

4.69E+03 1.16E+03

+ +

3.47E+03 2.09E+03

6.75E-03 1.37E-02

+

6.65E+01 9.19E+00

+

3.16E+01 7.53E+00

+

2.36E+01 3.95E+01

+

2.92E+03 3.60E+02

+

6.33E+01 3.32E+01

+

2.56E+04 1.11E+04

+

1.16E+04 9.50E+03

+

3.13E+03 3.09E+03

=

1.86E+03 2.75E+03

+

3.51E+02 1.84E+02

+

9.04E+01 5.18E+01

=

8.75E+04 4.44E+04

=

4.09E+03 4.17E+03

+

1.60E+02 6.66E+01

+

2.34E+02 7.14E+00

+

2.28E+02 6.40E+02

+

3.83E+02 9.34E+00

+

4.49E+02 6.84E+00

+

2.26E+01 +

3.56E+01 7.30E+00

2.67E+02

8.17E+04



2.41E+03 2.47E+03

6.16E+02

2.67E+02 +

=

2.30E+01

2.15E+03

1.39E+02 4.94E+05

+

4.89E+01

3.98E+02

6.07E+01



+

8.77E+01

Jo

Std.



HLWCA

pro of

F5

WOA

9.62E+08

re-

F4

GWO

8.07E+01

lP

F3

SSO

0.00E+00

urn a

F1

CMA-ES Mean

3.92E+02 1.27E+01

+

1.44E+03 2.37E+02

Journal Pre-proof

F28 F29 F30

Mean

5.39E+02

Std.

1.68E+01

1.01E-04

1.62E+01

6.69E+01

1.08E+02

Mean

3.60E+02

4.94E+02

5.35E+02

5.04E+02

3.98E+02

Std.

6.68E+01

4.69E+00

5.71E+01

2.91E+01

2.79E+01

Mean

9.51E+02

4.42E+02

7.85E+02

1.87E+03

1.24E+03

Std.

1.95E+02

7.40E+01

1.55E+02

3.87E+02

2.83E+02

Mean

2.51E+03

3.09E+02

4.16E+06

9.77E+06

5.52E+03

Std.

2.17E+02

3.93E+01

4.67E+06

7.52E+06

1.60E+03

17/2/10

17/5/7

28/1/0

29/0/0

25/4/0

+/=/−

+ + + −

5.00E+02

− + − −

5.34E+02

+ + + +

6.51E+02

+ + + +

6.81E+02

+

5.15E+02 9.53E+00

+

3.29E+02 5.18E+01

+

5.79E+02 8.80E+01

+

4.21E+03 1.79E+03

pro of

F27

5. Conclusion

lP

re-

In this work, a Hierarchical Learning Water Cycle Algorithm (HLWCA) is proposed. In HLWCA, solutions are divided into several collections and one of the collections has a higher level than others. The solutions in the higher level collection use an updating mechanism with exploration tendency while other solutions are more inclined to exploit the searched area. HLWCA enhances the global searching ability while retains the advantage of fast convergence and strong local searching ability. The performance of the proposed algorithm is examined on the benchmark suite of IEEE CEC 2017. Furthermore, four well-known engineering and mechanical design problems are employed to investigate the capability of handling constrained optimization problems. The obtained results show that our modifications on original WCA enhance the performance on complex problems, reduce the computational time. It is also proved that the proposed HLWCA exhibits a competitive and good performance in solving real world optimization problems.

Acknowledgment

Appendix CEC 2017 suite

urn a

This work is supported by the National Natural Science Foundation of China (Grant No. 51875466 and 51805436).

Jo

The IEEE CEC 2017 Benchmark suite consists of 29 unconstrained continuous optimization problems with varying difficulty levels (the original suite contains 30 problems, but in the latest version, Function 2 has been excluded because of its unstable behavior). In the suite, Functions 1 and 3 are unimodal, Functions 4 to 10 are simple multimodal, Functions 11 to 20 are hybrid functions, and Functions 21 to 30 are composition functions. All functions are shifted and rotated, so that their characteristics become more complicated than basic numerical benchmark problems and they can be much closer to the real-world optimization problems. The search range for each function is where D is the dimension of the problem, is the true global optimum value of each function. Summary of CEC 2017 expensive benchmark problems No.

Description

Fi*

Unimodal

1

Shifted and Rotated Bent Cigar Function

100

Functions

2

Shifted and Rotated Sum of Different Power Function*

200

3

Shifted and Rotated Zakharov Function

300

Journal Pre-proof 4

Shifted and Rotated Rosenbrock’s Function

400

Multimodal

5

Shifted and Rotated Rastrigin’s Function

500

Functions

6

Shifted and Rotated Expanded Scaffer’s F6 Function

600

7

Shifted and Rotated Lunacek Bi_Rastrigin Function

700

8

Shifted and Rotated Non-Continuous Rastrigin’s Function

800

9

Shifted and Rotated Levy Function

900

10

Shifted and Rotated Schwefel’s Function

1000

Hybrid

11

Hybrid Function 1 (N=3)

1100

Functions

12

Hybrid Function 2 (N=3)

1200

13

Hybrid Function 3 (N=3)

14

Hybrid Function 4 (N=4)

15

Hybrid Function 5 (N=4)

16

Hybrid Function 6 (N=4)

17

Hybrid Function 6 (N=5)

18

Hybrid Function 6 (N=5)

19

Hybrid Function 6 (N=5)

20

Hybrid Function 6 (N=6)

Composition

21

Composition Function 1 (N=3)

Functions

22

Composition Function 2 (N=3)

2200

23

Composition Function 3 (N=4)

2300

24

Composition Function 4 (N=4)

2400

25

Composition Function 5 (N=5)

2500

26

Composition Function 6 (N=5)

2600

27

Composition Function 7 (N=6)

2700

28

Composition Function 8 (N=6)

2800

29

Composition Function 9 (N=3)

2900

30

Composition Function 10 (N=3)

3000

lP

re-

pro of

Simple

1300 1400 1500 1600 1700 1800 1900 2000 2100

urn a

* F2 has been excluded because it shows unstable behavior especially for higher dimensions, and significant performance variations for the same algorithm implemented in MATLAB, C

Speed reducer problem

Jo

The goal of speed reducer design problem is to minimize the total weight of two-stage speed reducer. The constraints include the bending stress of the gear teeth, surface stress, transverse deflections of the shafts, and stresses in the shafts. Seven variables are considered in the problem, they are: the face width ( ), the module of teeth ( ), the number of teeth in the pinion ( ), the length of the first shaft between bearings ( ), the length of the second shaft between bearings ( ), the diameter of the first shaft ( ), and the diameter of the second shaft ( ). The target function is constrained by eleven inequalities. The mathematical model is defined as follows.

Subject to

,

,

,

,

,

,

lP

where,

re-

pro of

Journal Pre-proof

Pressure vessel problem

where,

Jo

Subject to

urn a

The goal of vessel design problem is to minimize the total cost which includes the cost of material, forming and welding. Four variables are considered in the problem, they are: the thickness of the shell ( ), the thickness of the head ( ), the inner radius ( ), and the length of the cylindrical section of the vessel ( ). The target function is constrained by four inequalities. The mathematical model is defined as follows.

and

Tension/compression spring problem The goal of tension/compression spring problem is to minimize the weight of the spring while subjecting to the constraints of deflection, shear stress, surge frequency and outsider diameter. Three independent variables are considered in this problem: the wire diameter ( ), the mean coil ( ), and the number of active coils ( ). The target function is constrained by four inequalities. The mathematical model is defined as follows.

Journal Pre-proof

where,

,

pro of

Subject to

,

Welded beam problem

re-

The goal of welded beam problem is to minimize the total cost subject to constraints on shear stress, bending stress, buckling load, end deflection and side displacement. There are four variables which need to be optimized: ( ), ( ), ( ), ( ). The target function is constrained by seven inequalities. The mathematical model is defined as follows.

urn a

and

Jo

where,

lP

Subject to

Journal Pre-proof

The above material properties and constraint values are given as follows: , , , ,

,

pro of

References

,

[1] S.Binitha, S.S. Sathya, A survey of bio inspired optimization algorithms, Int. J. Soft Comput. Eng. 2 (2) (2012) 137-151. [2] R. Rajakumar, P. Dhavachelvan, T. Vengattaraman. A survey on nature inspired meta-heuristic algorithms with its domain specifications. 2016 International Conference on Communication and Electronics Systems (ICCES).

[3] K. Hussain, M.N.M. Salleh, S. Cheng, et al. Metaheuristic research: a comprehensive survey. Artificial Intelligence Review, 2018(5):1-43.

[4] F. Fausto, A. Reyna-Orta, E. Cuevas, et al. From ants to whales: metaheuristics for all tastes. Artificial Intelligence Review, 2019.

re-

[5] P. Agarwal, S. Methta, Nature-inspired algorithms: state-of-art, problems and prospects, Int. J. Comput. Appl. 100 (2014) 14-21. [6] L. kang, Y. Chen, Evolutionary computation, Journal of Unmerical Methods & Applications. 5 (1) (1995) 401-423. [7] J.H. Holland, Genetic algorithms, Sci Am. 267 66-72.

[8] D. Goldberg. Genetic algorithms in search, optimization and machine learning. Reading, MA Publisher: Addison-Wesley, 1989. [9] S. Das, P.N. Suganthan, Differential Evolution: A Survey of the State-of-the-Art, IEEE Trans. Evolut. Comput. 15 (1) (2011)

lP

4-31.

[10] D. Karaboga, B. Basturk. A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. J. Global Optimization, 2007, 39.

[11] D. Karaboga, B. Basturk. On the performance of artificial bee colony (ABC) algorithm. Applied Soft Computing Journal, 2008, 8(1):687-697.

urn a

[12] J. Kennedy, R. Eberhart, Particle swarm optimization, Proceedings of IEEE International Conference on Neural Networks, vol. 4, IEEE, 1995, 1942-1948.

[13] S. Kirkpatrick, C.D. Gelatt, M.P. Vecchi, et al., Optimization by simulated annealing, Science 220 (4589) (1983) 671-680. [14] Z.W.Geem, J.H. Kim, G. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2) (2001) 60-68. [15] H. Eskandar, A. Sadollah, A. Bahreinineja, M. Hamdi, Water cycle algorithm – a novel metaheuristic optimization method for solving constrained engineering optimization problems, Comput. Struct. 110-111 (2012) 151-166. [16] A. Sadollah, H. Eskandar, A. Bahreininejad, J.H. Kim, Water cycle algorithm with evaporation rate for solving constrained and unconstrained optimization problems. Appl. Soft Comput. 30 (C) (2015) 58-71.

Jo

[17] A. Sadollah, H. Eskandar, A. Bahreininejad, J.H. Kim, Water cycle, mine blast and improved mine blast algorithms for discrete sizing optimization of truss structures, Comput. Struct. 149 (2015) 1–16. [18] A.A. Heidari, R.A. Abbaspour, A.R. Jordehi, An efficient chaotic water cycle algorithm for optimization tasks, Neural Comput. Appl. 28 (2017) 57-85.

[19] S. Khalilpourazari, S. Khalilpourazary, An efficient hybrid algorithm based on Water Cycle and Moth-Flame Optimization algorithms for solving numerical and constrained engineering optimization problems, Soft Comput. 2013 (2017) 1-24. [20] A. Sadollah, H. Eskandar, J.H. Kim, Water cycle algorithm for solving constrained multi-objective optimization problems, Appl. Soft Comput. 27 (2015) 279-298.

Journal Pre-proof [21] A. Sadollah, H. Eskandar, A. Bahreininejad, Water cycle algorithm for solving multi-objective optimization problems, Soft Comput. 19 (2015) 2587-2603. [22] S.M.A Pahnehkolaei, A. Alfi, A. Sadollah, J.H. Kim, Gradient-based Water Cycle Algorithm with evaporation rate applied to chaos suppression, Appl. Soft Comput. 53 (2017) 420-440. [23] A.A. Heidari, R.A. Abbaspour, A.R. Jordehi, Gaussian bare-bones water cycle algorithm for optimal reactive power dispatch in electrical power systems. Appl. Soft Comput. 57 (2017) 657–671. [24] S. Qiao, Y. Zhou, Y. Zhou, R. Wang, A simple water cycle algorithm with percolation operator for clustering analysis, Soft Comput. 1 (2018) 1-15.

pro of

[25] Y. Xu, Y. M, A modified water cycle algorithm for long-term multi-reservoir optimization, Appl. Soft Comput. 71 (2018) 317-332.

[26] O.B. Haddad, M. Moravej, H.A.Loaicifa, Application of the water cycle algorithm to the optimal operation of reservoir systems, J. Irrig. Drain. Eng. 141 (5) (2014) 04014064.

[27] Y. Kong, Y. Mei, W. Li, X. Wang, B. Yue, An enhanced water cycle algorithm for optimization of multi-reservoir systems. Ieee/acis, International Conference on Computer and Information Science. IEEE. 2017

[28] F. Alba, B. Dorronsoro, The exploration/exploitation tradeoff in dynamic cellulargenetic algorithms, IEEE Trans. Evolut. Comput. 9 (2005) 126–142.

Knowl. Intel. Eng. Sys. 13 (2009) 185–206.

re-

[29] S.H. Liu, M. Mernik, B.R. Bryant, To explore or to exploit: an entropy-drivenapproach for evolutionary algorithms, Int. J. [30] H. Liu, M. Mernik, D. Hrnˇciˇc, M.ˇCrepinˇsek, A parameter control method of evolutionary algorithms using exploration and exploitation measures with apractical application for fitting Sovova’s mass transfer model, Appl. Soft Comput. 13 (2013) 3792–3805.

[31] R. Tanabe, A. Fukunaga, Success-history based parameter adaptation for differential evolution, Proceedings of IEEE Congress

lP

on Evolutionary Computation (2013) 71-78.

[32] N.H. Awad, M.Z. Ali, J.J. Liang, B.Y. Qu, P.N. Suganthan, Problem Definitions and Evaluation Criteria for the CEC 2017 Session and Competition on Single Objective Real-Parameter Numerical Optimization, 2017. [33] P. Civicioglu, E. Besdok, M.A. Gunen, U.H. Atasever. Weighted differential evolution algorithm for numerical function optimization: a comparative study with cuckoo search, artificial bee colony, adaptive differential evolution, and back tracking

urn a

search optimization algorithms, Neural Computing and Applications (2018) 1-15. [34] I.B. Aydilek. A Hybrid Firefly and Particle Swarm Optimization Algorithm for Computationally Expensive Numerical Problems. Applied Soft Computing, 66 (2018) 232-249.

[35] E.M. Montes, C.A.C Coello. Useful Infeasible Solutions in Engineering Optimization with Evolutionary Algorithms. MICAI 2005: Advances in Artificial Intelligence. MICAI 2005. Lecture Notes in Computer Science, vol 3789. Springer, Berlin, Heidelberg.

[36] B.K. Kannan, S.N. Kramer. An augmented lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J Mech Des 1994; 116:405–11.

Jo

[37] J.S. Arora. Introduction to optimum design. New York: McGraw-Hill; 1989. [38] C.A.C. Coello. Use of a self-adaptive penalty approach for engineering optimization problems. Computers in Industry, 41(2) (2000) 113–127.

[39] N. Hansen, A. Ostermeier. Completely Derandomized Self-Adaptation in Evolution Strategies. Evolutionary Computation, (2001) 9(2):159-195.

[40] E. Cuevas, M. Cienfuegos, D. Zaldivar, et al. A swarm optimization algorithm inspired in the behavior of the social-spider. Expert Systems with Applications, 2014, 40(16):6374-6384. [41] S. Mirjalili, S.M. Mirjalili, A. Lewis. Grey Wolf Optimizer. Advances in Engineering Software, 2014, 69:46-61. [42] S. Mirjalili, A. Lewis. The Whale Optimization Algorithm. Advances in Engineering Software, 2016, 95:51-67.

Journal Pre-proof [43] F. Fausto, E. Cuevas, A. Valdivia, et al. A Global Optimization Algorithm Inspired In the Behavior of Selfish Herds. Biosystems,

Jo

urn a

lP

re-

pro of

2017, 160: 39-55.

*Declaration of Interest Statement

Journal Pre-proof

Conflict of interest statement

pro of

We declare that we have no financial and personal relationships with other people or organizations that can inappropriately influence our work, there is no professional or other personal interest of any nature or kind in any product, service and/or company that could be construed as influencing the position presented in, or the review of, the manuscript entitled.

Caihua Chen [email protected]

Huachao Dong [email protected]

Jo

urn a

lP

Xinjing Wang [email protected]

re-

Peng Wang [email protected]