
Improving exploration and exploitation via a Hyperbolic Gravitational Search Algorithm

Danilo Pelusi a,∗, Raffaele Mascella a, Luca Tallini a, Janmenjoy Nayak b, Bighnaraj Naik c, Yong Deng d

a Faculty of Communications Sciences, University of Teramo, Italy
b Department of Computer Science and Engineering, Aditya Institute of Technology and Management, Tekkali, K Kotturu, India
c Department of Computer Application, Veer Surendra Sai University of Technology, Odisha, India
d Institute of Fundamental and Frontier Science, University of Electronic Science and Technology of China, Chengdu, China

Article history: Received 28 May 2019; Received in revised form 14 December 2019; Accepted 17 December 2019; Available online xxxx

Keywords: Evolutionary algorithms; Gravitational search algorithm; Optimization algorithms; Exploration and exploitation; Hyperbolic functions

Abstract

Finding the best compromise between the exploration and exploitation phases of a search algorithm is a hard task. The Gravitational Search Algorithm (GSA) is an evolutionary algorithm based on Newton's universal law of gravitation. Many studies show that GSA suffers from slow exploitation, which generates premature convergence. This paper proposes a Hyperbolic Gravitational Search Algorithm (HGSA) able to find an optimal balance between exploration and exploitation. The main contributions of this work are: the definition of suitable hyperbolic acceleration coefficients, the dynamic regulation of the gravitational constant coefficient through a hyperbolic function, and the definition of a decreasing hyperbolic function for the best agents which attract the other agents. The proposed algorithm is compared with well-known search algorithms on classical benchmark functions, the CEC-06 2019 benchmark functions and the three-bar truss design problem. The results show that HGSA is better than the other algorithms in terms of optimization and convergence performance.

1. Introduction

The challenge of optimizing complex problems with suitable optimization algorithms is a hot topic. On the other hand, there is no algorithm able to solve all optimization problems [1]. This means that a technique may work well on one set of problems and fail on a different one. In order to address this issue, Evolutionary Algorithms (EAs) and population-based optimization methods have been widely used. The key point is to find the best compromise between exploration and exploitation to assure an optimal solution. Exploration is used to locate new solutions and avoid being trapped in local optima, whereas exploitation is used to find the best solution. However, many EAs work well in only one of the two phases. There are EAs with good exploration search but weak exploitation search. On the other hand, exploration searches for the optimum over the whole search

space, whereas exploitation searches for the best solution by exploiting the promising solutions found in the exploration phase. Among heuristic evolutionary algorithms, the Gravitational Search Algorithm (GSA) is inspired by the interaction between masses in nature [2]. Although GSA is not a recent algorithm, new versions of GSA for engineering applications continue to be proposed. In the field of renewable energy power generation, a maximum power point tracking method with an improved GSA has been proposed [3]. GSA is also applied in the data classification research area for investigating the existence of non-informative features in feature sets [4]. Chaudhary and Kumar [5] proposed a hybrid genetic-GSA algorithm for load scheduling in cloud computing. Moreover, GSA is able to handle multiple conflicting objectives such as the minimization of makespan, load balancing and energy consumption [6]. Khatibinia and Yazdani [7] designed an accelerated multi-gravitational search algorithm for the size optimization of truss structures. Another GSA research field regards renewable energy sources. In particular, the maximum exergy efficiency of a geothermal power plant has been optimized by GSA [8]. An improved GSA has been applied to water resource management, where streamflow prediction plays a significant role [9]. Non-convex combined heat and power economic dispatch problems have been solved through the application of GSA [10]. Moreover, the short term hydrothermal coordination problem has been addressed with the use of GSA [11]. Constrained versions of GSA


have been proposed to solve the simple and hydropower operation problems of a reservoir [12]. GSA has also been used to automatically find the number of clusters in a given dataset [13]. GSA variants for face recognition have been introduced in [14]. The main variants of this approach are the automated selection of projection vectors within GSA and a stochastic local neighborhood-based search. Kang et al. [15] proposed a hybrid GSA to mitigate premature convergence and the loss of particle information in a Swarm Intelligence based tracker. In GSA, each solution is matched to an agent with its mass, position and velocity. The quality of a solution depends on the agents' masses. Moreover, the GSA optimization process shows weakness during the exploitation phase. Therefore, many studies have attempted to improve GSA starting from the exploitation phase and investigating a good trade-off between exploration and exploitation. This fact makes the research on GSA an interesting topic. In order to achieve a good exploration–exploitation balance, an Accelerated Multi-Gravitational Search Algorithm (AMGSA) has been designed [7]. In this approach, the simplex crossover and the mutation operator of the breeder genetic algorithm are incorporated into the Multi-Gravitational Search Algorithm (MGSA). The algorithm calculates the current velocity of an agent considering the memory of the ith agent, the current global best position and the position of a randomly-selected agent. However, AMGSA suffers from computational complexity problems. Mirjalili and Hashim [16] proposed integrating the exploitation ability of Particle Swarm Optimization (PSO) [17] with the exploration ability of GSA to synthesize the strengths of both algorithms. Saving and using the location of the best mass to speed up the exploitation phase is the basic idea proposed by Mirjalili and Lewis [18]. A memory-based approach has been introduced in GSA by storing the best position of any agent as the agent's personal best position [19]. Bohat and Arya [20] emphasized the movement of search agents toward the group-best (gbest) solution among the current population of agents. Nonlinear time-varying methods have been introduced in GSA to update the velocity of agents [21]. In order to solve the problem of premature convergence of GSA due to the rapid reduction of diversity, the updating position of the bird flock response is used [22]. A modified GSA with crossover (CROGSA) has been proposed by Yin et al. [23], with the insertion of promising information into the new position to obtain a new trial solution. A diversity-enhanced and memory-based multi-objective gravitational search algorithm has been designed by Sun et al. [24]. In this approach, the leaders of a particle include its personal best particle, the global best particle, and all the current population particles. Moreover, the diversity-enhancement mechanism is based on particle velocity and position control. In order to control the balance between exploration and exploitation, Li et al. [25] proposed a time-varying maximum velocity equation. A Levy flight disturbance has been incorporated in the GSA search process with the aim of improving the search performance [26]. In particular, the approach proposed in [26] also exploits the combination of Differential Evolution (DE) [27] and evolutionary algorithms to enhance the search ability and maintain the population diversity.
Moreover, the authors introduced a self-adaptive mechanism which gets information from the current population to control the value of the gravitational constant G through changes of its attenuation factor. In fact, an important role in GSA exploration and exploitation is played by the gravitational constant attenuation factor α. Sun et al. [28] proposed adjusting the value of α according to the agents' positions and fitness feedback. Moreover, this algorithm restricts the value of α to preserve the agents' stable trajectories. In the revised GSA proposed by Das et al. [29], the parameter α is computed at each iteration. To achieve a balance

between exploration and exploitation during the search process, an adaptive gravitational constant attenuation factor has been defined [30]. A time-linearly decreasing gravitational constant has been introduced in GSA for extending the solution search space [31]. In order to flexibly control the decreasing rate of the gravitational constant, Li et al. [32] proposed a piecewise function. Ji et al. [33] adjusted the gravitational constant by redefining the parameters G0 and α. In that work, G0 is computed as the median of the ordered distances between agents, whereas α is calculated by means of a Gaussian distribution. Moreover, the proposed algorithm uses a modified Chaotic Local Search (CLS) to define the best agent position. A similar approach has been proposed by Jiang et al. [34] with an algorithm which employs a chaotic perturbation operator and a memory strategy to avoid premature convergence. Chaotic local search procedures are applied in GSA to search for the optimal solution around the current best solution [25]. Moreover, the piecewise linear chaotic map, gauss map, sinusoidal map and sinus map have been added to GSA [35]. Some chaotic maps have been embedded into the gravitational constant G in an attempt to improve the exploration of GSA [36]. The results show that the sinusoidal map helps GSA to avoid local minima better than the other chaotic maps. An adaptive gravitational constant with individual optimum fitness feedback is dynamically calculated in the algorithm proposed in [37]. Wang et al. [38] designed a chaotic binary coded gravitational search algorithm (CBGSA), where the position is updated so that the current bit value is changed with a probability calculated according to the mass velocity. The piecewise linear chaotic map and sequential quadratic programming have been applied in GSA for feature subset selection in machine learning [39]. Gouthamkumar et al. [40] integrated the disruption operator into GSA to increase its exploration and exploitation abilities. Li et al. [41] modified GSA by introducing a suitable population diversity definition. On the other hand, Minarova et al. [42] introduced a suitable function in the calculation of the product between masses in the interaction force formula. Li et al. [43] combined the Cauchy and Gaussian mutations with the traditional GSA to improve the search ability of the algorithm. In GSA, each agent learns from all the agents stored in the same elite group. The novelty of [44] is the introduction of a locally informed learning strategy where each agent learns from its unique neighborhood of suitable size. The Kbest model has been replaced with a dynamic neighborhood strategy [45]. The basic idea proposed in [46] consists of two phases of neighborhood selection for each individual. In the first one, the neighborhoods are formed by the nearest neighbors to explore the search space. The second one uses an improved application of the hill valley algorithm [47] in order to merge the species from the previous phase and exploit the found optima. On the other hand, selecting the size of the neighborhood is a critical problem. Small values of the neighborhood size lead the algorithm to produce many virtual niches, whereas large values lead the algorithm to lose some existing niches. Yazdani et al. [48] defined an adaptive parameter in GSA to compute the size of the neighborhood. Mittal et al. [49] introduced a chaotic model in the parameter Kbest of GSA to balance exploration and exploitation.
An exponentially decreasing Kbest has been defined to find the optimal thresholds in a 2D histogram [50]. The aim of this paper is to improve the balance between exploration and exploitation of GSA without increasing its computational complexity. The proposed approach contains three novel contributions. The first novelty is the introduction of acceleration coefficients which depend on suitable hyperbolic sine functions. A dynamic regulation of the gravitational constant coefficient through a hyperbolic function is the second contribution. The third novelty is a hyperbolic-based definition of the parameter Kbest which assures a decreasing trend. Therefore, a Hyperbolic Gravitational Search Algorithm (HGSA) is designed.


The remainder of the paper is organized as follows. GSA is described in Section 2. Section 3 illustrates the proposed algorithm. The discussion of results is contained in Section 4. Section 5 contains HGSA results to solve a real-life problem. The conclusions are given in Section 6.

2. Gravitational search algorithm

GSA is a heuristic algorithm based on Newton's laws of gravity and motion [2]. In GSA, each particle attracts every other one by means of the gravitational force. The magnitude of the gravitational force is proportional to the product of the particle masses and inversely proportional to the square of their distance. The acceleration of a particle is proportional to the force acting on it and inversely proportional to its mass. In GSA, particles are considered as agents whose performances are defined by their masses. The masses are determined by using a fitness function. Moreover, each agent position represents a candidate solution to the search problem. The main steps of GSA are the following. Let n be the number of agents. The position X_i of the ith agent in the search space of dimension D is

X_i = (x_i^1, ..., x_i^d, ..., x_i^D)

where d denotes the dth dimension and i = 1, 2, ..., n. GSA starts with a random generation of the particle positions in the D-dimensional search space. The fitness of each particle is evaluated by using the objective function fit. Such fitness function depends on the position of the ith agent X_i, i.e. fit = fit(X_i). Once the current population fitness is evaluated, the active M_ai and passive M_pi gravitational masses and the inertial mass M_ii are computed as follows:

M_ai = M_pi = M_ii = M_i,   i = 1, 2, ..., n    (1)

m_i(t) = (fit(X_i(t)) − worst(t)) / (best(t) − worst(t))    (2)

M_i(t) = m_i(t) / Σ_{j=1}^{n} m_j(t)    (3)

where the active gravitational mass M_ai is a measure of the strength of the gravitational field due to a particular particle, the passive gravitational mass M_pi is a measure of the strength of a particle's interaction with the gravitational field, and the inertial mass M_ii is a measure of a particle's resistance to changing its state of motion when a force is applied. In Eq. (2), best(t) and worst(t) indicate respectively the best and the worst particle with regard to their fitness values, where t stands for the tth iteration. In minimization problems, these quantities may be defined as in (4) and (5):

best(t) = min_{j∈{1,...,n}} fit_j(t)    (4)

worst(t) = max_{j∈{1,...,n}} fit_j(t)    (5)

Note that, from (2), it follows that the particle with the higher mass is the more efficient agent. Hence, more efficient agents have greater attractions and move more slowly. Once the mass has been properly defined, the next step is the computation of the forces exerted on each particle by all the other ones. The quantity F_ij^d(t) in (6) represents the force applied on mass i by mass j at time t, where M_aj and M_pi are the active and passive gravitational masses related to agents j and i respectively, and ϵ is a small value. R_ij(t), as defined in (7), is the Euclidean distance between agents i and j:

F_ij^d(t) = G(t) · (M_pi(t) × M_aj(t)) / (R_ij(t) + ϵ) · (x_j^d(t) − x_i^d(t))    (6)

R_ij(t) = ∥X_i(t), X_j(t)∥_2    (7)

The gravitational constant G(t) is computed by means of (8), where G_0 is set to 100 and α to 20 (see [2]), and T is the total number of iterations, with t = 1, 2, ..., T:

G(t) = G_0 exp(−α t/T)    (8)

The total force acting on the ith particle in dimension d, F_i^d(t), is calculated as in (9):

F_i^d(t) = Σ_{j∈Kbest, j≠i} rand_j F_ij^d(t)    (9)

where Kbest is the set of the best agents, which attract the other agents, and rand_j is a uniformly distributed random variable in [0, 1]. By using (9) and according to the law of motion, the acceleration of agent i at time t in the dth dimension, a_i^d(t), is given by (10), with M_ii the inertial mass of the ith agent:

a_i^d(t) = F_i^d(t) / M_ii(t)    (10)

The velocity v and the position x of an agent at time t + 1 are calculated by (11) and (12) respectively, with Δt = 1, rand_i a uniformly distributed random variable in [0, 1] and a_i^d(t) defined by (10):

v_i^d(t + 1) = rand_i × v_i^d(t) + a_i^d(t) Δt    (11)

x_i^d(t + 1) = x_i^d(t) + v_i^d(t + 1) Δt    (12)

The steps of GSA are described in Algorithm 1.

Algorithm 1 Gravitational Search Algorithm
1: Random initialization of the population in the search space
2: for t from 1 to T do
3:   Evaluate the fitness fit of agents
4:   Update G(t), best(t), worst(t) and M_i(t) for i = 1, ..., n by means of Eqs. (8), (4), (5) and (3) respectively
5:   Compute the total force in different directions based on Eqs. (6) and (9)
6:   Compute the acceleration through Eq. (10) and the velocity by (11)
7:   Update the position of agents by (12)
8: end for
9: Supply the best agent position
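To make the above update rules concrete, the following is a minimal Python sketch of one GSA iteration for a minimization problem, assuming NumPy arrays; the function name gsa_step and the array layout are illustrative choices, not part of the original formulation.

```python
import numpy as np

def gsa_step(X, V, fitness, t, T, G0=100.0, alpha=20.0, eps=1e-12, kbest=5):
    """One GSA iteration (Eqs. (1)-(12)) for a minimization problem.

    X, V: (n, D) arrays of positions and velocities; fitness: (n,) array.
    """
    n, D = X.shape
    best, worst = fitness.min(), fitness.max()        # Eqs. (4)-(5)
    m = (fitness - worst) / (best - worst + eps)      # Eq. (2)
    M = m / m.sum()                                   # Eqs. (1), (3)
    G = G0 * np.exp(-alpha * t / T)                   # Eq. (8)

    # Indices of the Kbest fittest (heaviest) agents, which apply force.
    K = np.argsort(fitness)[:kbest]

    F = np.zeros_like(X)
    for i in range(n):
        for j in K:
            if j == i:
                continue
            R = np.linalg.norm(X[i] - X[j])           # Eq. (7)
            # Eq. (6), weighted by rand_j and accumulated as in Eq. (9).
            F[i] += np.random.rand() * G * M[i] * M[j] / (R + eps) * (X[j] - X[i])

    a = F / (M[:, None] + eps)                        # Eq. (10)
    V = np.random.rand(n, 1) * V + a                  # Eq. (11), with dt = 1
    X = X + V                                         # Eq. (12)
    return X, V
```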

3. Proposed algorithm

Using adaptive coefficients to balance exploration and exploitation in GSA is a good method for improving GSA search performance. The first contribution of this work is the definition of suitable acceleration coefficients able to improve the exploitation phase of GSA. In GSA, the gravitational constant G decreases progressively with the iterations (see (8)). Thus, the gravitational force tends to decrease, and so does the acceleration of the particles (see (6) and (10)). Moreover, this acceleration is further reduced for heavy particles. Therefore, heavy masses with slow speed and attractive force intensity degrade the convergence speed. To solve this problem, Mirjalili and Lewis [18] proposed saving and using the location of the best mass to speed up the exploitation phase. In particular, the gbest element applies an additional velocity component through the last known location of the best mass. This approach can be formalized by means of Eq. (13):

v_i^d(t + 1) = rand_i × v_i^d(t) + c1(t) × a_i^d(t) Δt + c2(t) × (gbest − x_i^d(t))/Δt    (13)

where v_i^d(t) is the velocity of the ith agent in the dth dimension at iteration t, a_i^d(t) is the acceleration of the ith agent in the dth dimension at iteration t, gbest is the position of the best solution acquired so far and c1, c2 are the acceleration coefficients. Mirjalili and Lewis [18] proposed an increasing function for c2 and a decreasing function for c1 over the cube of the iteration number, i.e. c1, c2 ∝ t^3. The same approach has been used in [30] for the parameter identification of a hydraulic turbine governing system. Bohat and Arya [20] defined the acceleration coefficients as linear functions of t, i.e. c1, c2 ∝ t. Adaptive adjustment methods provided a suitable definition of c1 and c2 as functions of t^(1/6) [45]. An exponential relationship has been introduced in [21] with c1, c2 ∝ exp(−t^2). Our idea is a novel definition of these acceleration coefficients. From Eq. (13), it follows that if c1 is bigger than c2 the search procedure tends to perform exploration, while if c2 is greater than c1 the search procedure tends to execute exploitation. The main problem is to find a gradual transition between these two phases. Mirjalili and Lewis [18] achieved good results by setting the exploration–exploitation transition at the threshold t_th = (4/5)T, with T the maximum number of iterations. Our idea is to bring this transition forward and increase the changing rate of the acceleration coefficients. A greater speed of change of c1 and c2 than in [18] assures better performance in the search phases. In fact, a faster variation of the acceleration coefficients produces a smaller acceleration of the heavy masses and a greater acceleration of the other particles. Thus, the center of mass of the particles tends to reach optimal solutions more quickly than in [18,20,21] and [45]. A faster exploration phase allows a longer exploitation phase, which exploits more gbest positions than [18]. On the basis of the previous concepts, the acceleration coefficients c1 and c2 are defined as in (14) and (15) respectively:

c1(t) = [sinh(2) + sinh(−2t/T)] / sinh(2)    (14)

c2(t) = sinh(2t/T) / sinh(2)    (15)
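As a quick illustration, the following Python fragment (a sketch, with a 1000-iteration horizon assumed for concreteness) evaluates Eqs. (14) and (15) and locates the iteration at which c2 overtakes c1:

```python
import numpy as np

T = 1000                                  # assumed maximum number of iterations
t = np.arange(1, T + 1)
c1 = (np.sinh(2) + np.sinh(-2 * t / T)) / np.sinh(2)   # Eq. (14): decreases from 1 to 0
c2 = np.sinh(2 * t / T) / np.sinh(2)                   # Eq. (15): increases from 0 to 1
t_switch = t[np.argmax(c2 > c1)]          # first iteration with c2 > c1
print(t_switch / T)                       # about 0.68, i.e. before (4/5)T
```

Solving sinh(2t/T) = sinh(2)/2 gives t ≈ 0.68 T, so the crossover indeed falls before the (4/5)T threshold used in [18], which is the anticipated transition discussed below.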

The graph of the acceleration coefficients c1 and c2 over the iteration number t is shown in Fig. 1. Note that the transition between exploration and exploitation occurs before the threshold t_th, i.e. t'_th < (4/5)T (where t'_th is the threshold of the proposed c1 and c2). Finally, the proposed acceleration coefficients depend on suitable hyperbolic sine functions.

Fig. 1. Acceleration coefficients.

GSA uses the distance R between different particles for computing the acceleration (see (6) and (10)). Large values of R between agents produce smaller force values, so the acceleration of the agents is small. When the distances between agents become small, the gravitational force increases and consequently the acceleration of the agents increases. This behavior causes premature convergence, because particles with slow acceleration are not able to escape from local optima. Moreover, a huge acceleration at the end of the iterations does not help the fine regulation of the current search area. The gravitational constant G(t) controls the impact of the gravitational force, which depends on R. Therefore, the balance between exploration and exploitation is dictated by the value of G(t). A larger G(t) makes GSA move toward global exploration, while a smaller gravitational constant makes GSA move toward fine regulation of the current search area. In order to improve the exploration and exploitation of GSA, the gravitational constant G(t) must decrease over t. However, the gravitational constant has to satisfy specific conditions to ensure the stable convergence of GSA [34]. On the other hand, G(t) depends on the parameter α (see (8)). The idea is to adjust α dynamically with the aim of increasing the convergence rate and alleviating premature convergence. Since G(t) must decrease, α must increase with t. Therefore, we define α as in (16):

α(t) = (α_max − α_min)/sinh(2) · sinh(2t/T) + α_min    (16)

where α_max and α_min define the variation range of α. These values satisfy the boundary constraint conditions on α defined in [28]. Note that α is defined as a hyperbolic function of t. Similar approaches have been proposed in [30] and [41]. However, in both cases, the proposed definition of α(t) depends on scale and shift factors. Conversely, in our case, α(t) only depends on quantities computed inside GSA (the maximum number of iterations T, the extreme values α_min, α_max of α and the iteration number t). The key idea of our approach is to define a suitable, fast-increasing function of t that avoids any dependence on shape parameters. The parameter Kbest of GSA in (9) is the set of the first k agents with the best fitness values and greatest masses. It decreases linearly from n to 1. This means that, at the beginning, all particles apply force and, at the end, just one particle applies force to the others. Thus, the Kbest particles which attract the others regulate the exploration and exploitation of GSA. Conversely to the GSA approach, Sun et al. [44] proposed an algorithm where each agent learns from its unique neighborhood formed by k local


neighbors and the historically global best agent. Moreover, a dynamic neighborhood learning strategy has been introduced in GSA [45]. However, in both approaches, the computational complexity of the proposed algorithms is high. Mittal et al. [50] defined Kbest as an exponentially decreasing function of t, i.e. Kbest ∝ (1/n)^(t/T). On the other hand, Pelusi et al. [51] defined Kbest ∝ exp(−5t/T). The target is to find the best compromise between exploration and exploitation by defining Kbest suitably. The idea is to increase the decreasing rate, so that the transition from exploration to exploitation is anticipated. The proposed definition of Kbest is shown in (17), where fp is the percentage of agents applying force on the others:

Kbest(t) = n + (fp − n)/sinh(−2) · sinh(−2t/T)    (17)

The use of hyperbolic functions provides a greater convergence speed to 1 than the Kbest proposals of [50] and [51]. This fact assures a suitable trade-off between exploration and exploitation. The provided contributions are all based on hyperbolic functions, therefore a Hyperbolic Gravitational Search Algorithm (HGSA) is designed. The steps of HGSA are described in Algorithm 2.

Algorithm 2 Hyperbolic Gravitational Search Algorithm
1: Random initialization of the population in the search space
2: for t from 1 to T do
3:   Evaluate the fitness fit of agents
4:   Compute the acceleration coefficient α(t) by (16)
5:   Calculate Kbest by means of (17)
6:   Update G(t), best(t), worst(t) and M_i(t) for i = 1, ..., n by means of (8), (4), (5) and (3) respectively
7:   Compute the total force in different directions based on Eqs. (6) and (9)
8:   Compute the acceleration through Eq. (10)
9:   Calculate c1(t) and c2(t) with (14) and (15) respectively
10:  Compute the velocity by (13)
11:  Update the position of agents by (12)
12: end for
13: Supply the best agent position
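For completeness, steps 4 and 5 of Algorithm 2 can be sketched in Python as follows; α_min = 10 and α_max = 50 are the values used later in Section 4, while the helper names are hypothetical.

```python
import numpy as np

def alpha_t(t, T, a_min=10.0, a_max=50.0):
    """Eq. (16): increasing hyperbolic schedule for the attenuation factor."""
    return (a_max - a_min) / np.sinh(2) * np.sinh(2 * t / T) + a_min

def kbest_t(t, T, n, fp):
    """Eq. (17): decreasing hyperbolic schedule for the number of attracting agents.

    n  : population size
    fp : final percentage of agents applying force on the others
    """
    k = n + (fp - n) / np.sinh(-2) * np.sinh(-2 * t / T)
    return max(1, int(round(k)))

# G(t) then follows Eq. (8) with the time-varying alpha:
# G = G0 * np.exp(-alpha_t(t, T) * t / T)
```

Note that kbest_t starts at n for t = 0 and reaches fp at t = T, with a rate of change that grows toward the end of the run, in line with the hyperbolic sine shape of Eq. (17).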

The computational complexity of an algorithm is a suitable metric to assess its performance. To analyze the computational complexity of HGSA, the approaches proposed by Dhiman and Kumar [52,53] are considered. Let n_o and n_p be the number of objectives and the population size, respectively. The population initialization process of the proposed algorithm costs O(n_o × n_p). The complexity of computing the particles' fitness is O(T × o_f), where o_f represents the cost of the objective function for a given problem. The time required to simulate the whole procedure is O(N). Moreover, HGSA needs O(f^2) time to calculate the force component. Finally, the proposed algorithm has the following time complexity:

T_c = O(N × f^2 × T × n_o × n_p × o_f)

The space complexity S_c of HGSA is the maximum amount of space used at any time, which is that of its initialization process, i.e. S_c = O(n_o × n_p).

4. Experimental results and comparisons

In order to assess the performance of the proposed algorithm, HGSA is tested on the well-known benchmark functions shown in Tables 1–3. In these test functions, D is the dimension and S the range of the search space.

Table 1. Unimodal functions.
Function                                                          S
F1(X) = Σ_{i=1}^D x_i²                                            [−100; 100]^D
F2(X) = Σ_{i=1}^D |x_i| + Π_{i=1}^D |x_i|                         [−10; 10]^D
F3(X) = Σ_{i=1}^D (Σ_{j=1}^i x_j)²                                [−100; 100]^D
F4(X) = max_i {|x_i|, 1 ≤ i ≤ D}                                  [−100; 100]^D
F5(X) = Σ_{i=1}^{D−1} [100(x_{i+1} − x_i²)² + (x_i − 1)²]         [−30; 30]^D
F6(X) = Σ_{i=1}^D ([x_i + 0.5])²                                  [−100; 100]^D
F7(X) = Σ_{i=1}^D i·x_i⁴ + random[0, 1)                           [−1.28; 1.28]^D

Table 2. Multimodal functions with variable dimension.
Function                                                          S
F8(X) = Σ_{i=1}^D −x_i sin(√|x_i|)                                [−500; 500]^D
F9(X) = Σ_{i=1}^D [x_i² − 10 cos(2πx_i) + 10]                     [−5.12; 5.12]^D
F10(X) = −20 exp(−0.2 √((1/D) Σ_{i=1}^D x_i²))
         − exp((1/D) Σ_{i=1}^D cos(2πx_i)) + 20 + e               [−32; 32]^D
F11(X) = (1/4000) Σ_{i=1}^D x_i² − Π_{i=1}^D cos(x_i/√i) + 1      [−600; 600]^D
F12(X) = (π/D) {10 sin²(πy_1) + Σ_{i=1}^{D−1} (y_i − 1)² [1 + 10 sin²(πy_{i+1})]
         + (y_D − 1)²} + Σ_{i=1}^D u(x_i, 10, 100, 4)             [−50; 50]^D
  where y_i = 1 + (x_i + 1)/4 and
  u(x_i, a, k, m) = k(x_i − a)^m if x_i > a; 0 if −a < x_i < a; k(−x_i − a)^m if x_i < −a
F13(X) = 0.1 {sin²(3πx_1) + Σ_{i=1}^D (x_i − 1)² [1 + sin²(3πx_i + 1)]
         + (x_D − 1)² [1 + sin²(2πx_D)]} + Σ_{i=1}^D u(x_i, 5, 100, 4)   [−50; 50]^D

The main features of the benchmark functions (global optimum value, optimum location, specific parameter settings) can be found in [2]. Table 1 shows the unimodal functions, whereas Tables 2 and 3 contain the more complex multimodal functions. The experimental results are compared with the following well-known heuristic optimization algorithms: Firefly Algorithm (FA) [54], GSA, Multi-Verse Optimizer (MVO) [55], Bat Algorithm (BA) [56], Harmony Search (HS) [57] and Moth-Flame Optimization (MFO). FA reproduces the flashing light patterns of fireflies found in nature. Its configuration consists of procedures which update the distance between two considered fireflies. Fireflies are unisexual, so one firefly will be attracted to other fireflies regardless of their sex. Moreover, the attractiveness between a couple of fireflies is proportional to their brightness. Both the attractiveness and the brightness decrease as the distance between fireflies increases. This behavior is similar to that of GSA particles, thus FA is chosen as a comparison algorithm. To emphasize the improvements of HGSA with respect to GSA, GSA is chosen as one of the competitor algorithms. The choice of MVO as a competitor algorithm depends on its physical inspiration. In fact, it is inspired by the multiverse theory in physics, which involves three main concepts: white holes, black holes and wormholes. On the other hand, GSA is also inspired by physical phenomena because it is based on gravitation theory. Moreover, HGSA tends to find the best balance between exploration and exploitation. In the same way, MVO uses the concepts of white and black holes for exploration and wormholes for exploitation.


Table 3. Multimodal functions with fixed dimension.
Function                                                          S
F14(X) = (1/500 + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^2 (x_i − a_ij)⁶))^{−1}        [−65.53; 65.53]²
F15(X) = Σ_{i=1}^{11} [a_i − x_1(b_i² + b_i x_2)/(b_i² + b_i x_3 + x_4)]²   [−5; 5]⁴
F16(X) = 4x_1² − 2.1x_1⁴ + (1/3)x_1⁶ + x_1 x_2 − 4x_2² + 4x_2⁴              [−5; 5]²
F17(X) = (x_2 − (5.1/4π²)x_1² + (5/π)x_1 − 6)² + 10(1 − 1/(8π)) cos x_1 + 10   [−5; 10] × [0; 15]
F18(X) = [1 + (x_1 + x_2 + 1)²(19 − 14x_1 + 3x_1² − 14x_2 + 6x_1x_2 + 3x_2²)]
         × [30 + (2x_1 − 3x_2)²(18 − 32x_1 + 12x_1² + 48x_2 − 36x_1x_2 + 27x_2²)]   [−5; 5]²
F19(X) = −Σ_{i=1}^4 c_i exp(−Σ_{j=1}^3 a_ij(x_j − p_ij)²)                   [0; 1]³
F20(X) = −Σ_{i=1}^4 c_i exp(−Σ_{j=1}^6 a_ij(x_j − p_ij)²)                   [0; 1]⁶
F21(X) = −Σ_{i=1}^5 [(X − a_i)(X − a_i)^T + c_i]^{−1}                       [0; 10]⁴
F22(X) = −Σ_{i=1}^7 [(X − a_i)(X − a_i)^T + c_i]^{−1}                       [0; 10]⁴
F23(X) = −Σ_{i=1}^{10} [(X − a_i)(X − a_i)^T + c_i]^{−1}                    [0; 10]⁴

Table 4. Comparison of HGSA with some optimization algorithms for the test functions F1–F7.
            FA          GSA         MVO         BA          HS          MFO         HGSA
F1   mean   1.839e−16   5.749e−18   3.485e−03   3.512e+04   2.045e+02   5.448e−30   4.673e−37
     std    2.663e−17   2.837e−18   1.510e−03   1.228e+04   1.154e+02   1.252e−29   1.088e−36
F2   mean   5.837e−09   6.213e−09   1.871e−02   7.402e+06   8.552e−01   6.667e−01   3.350e−18
     std    4.179e−10   1.238e−09   7.885e−03   2.735e+07   4.441e−01   2.537e+00   5.812e−18
F3   mean   8.045e−09   4.151e−05   3.072e−02   9.488e+04   1.670e+03   5.000e+02   5.002e−32
     std    2.050e−08   1.727e−04   2.134e−02   4.262e+04   6.953e+02   1.526e+03   1.267e−31
F4   mean   3.599e+00   1.666e−09   4.284e−02   6.825e+01   2.166e+01   2.994e+00   7.875e−18
     std    3.993e+00   3.673e−10   1.228e−02   9.302e+00   5.268e+00   6.015e+00   1.370e−17
F5   mean   2.963e+01   5.567e+00   1.059e+02   8.087e+07   1.117e+04   3.084e+03   1.227e+01
     std    2.092e+01   2.719e−01   3.096e+02   3.669e+07   9.991e+03   1.642e+04   2.571e+01
F6   mean   1.837e−16   0.000e+00   3.362e−03   3.673e+04   1.879e+02   3.776e−30   0.000e+00
     std    2.476e−17   0.000e+00   1.544e−03   9.979e+03   1.059e+02   7.850e−30   0.000e+00
F7   mean   2.067e−03   9.394e−03   1.606e−03   6.973e+01   6.028e−02   5.800e−03   4.642e−03
     std    9.255e−04   3.993e−03   1.072e−03   3.333e+01   2.788e−02   3.668e−03   2.815e−03
Friedman value   21   18   25   48   41   27   10
Friedman rank     3    2    4    7    6    5    1

BA is designed by idealizing some of the echolocation characteristics of micro-bats. All bats use echolocation to sense distance and they fly randomly with velocity v at position x with a fixed frequency. This frequency is adjusted according to the proximity of their target. Similarly, in GSA, the position of the particles depends on their acceleration. In both cases, two physical quantities are involved. Therefore, BA is chosen as a comparison algorithm. HS is a population-based algorithm that maintains a set of solutions in the Harmony Memory. It is an efficient algorithm for finding a set of solutions in a large search region. This is the main reason for choosing HS as a competitor algorithm. MFO is based on the transverse orientation of moths with respect to the moon. GSA particle positions depend on their acceleration, whereas MFO moth positions are defined by a specific function (the logarithmic spiral). This fact makes the comparison between HGSA and MFO very interesting. The study of single objective optimization algorithms is the basis of the research on more complex optimization algorithms such as multi-objective optimization algorithms [58]. All novel evolutionary and swarm algorithms are tested on single objective benchmark problems. Therefore, the proposed algorithm is also applied to the CEC-06 2019 single objective optimization benchmark functions, from CEC01 to CEC10 [59]. The comparison algorithms are the same as in the previous case, that is, FA, GSA, MVO, BA, HS and MFO.

The parameter values for the comparison are the following: number of agents n equal to 30, search space dimension D equal to 10 and maximum number of iterations T = 1000. Moreover, to alleviate stochastic errors and obtain statistical results, each algorithm has been repeated 30 times independently (as in [2]). Regarding HGSA, since α is set to 20 in GSA, the bounds α_min and α_max in (16) are set to 10 and 50 respectively.

4.1. Comparison of HGSA for unimodal functions

HGSA is applied to the unimodal functions shown in Table 1. Table 4 shows the performance of the proposed and comparison algorithms on the benchmark functions F1–F7. The mean and standard deviation of the best solutions are calculated over 30 runs. The results show that HGSA is able to find the best solution for F1–F4 and F6. Note that, for F3, the improvement is of 23 orders of magnitude with respect to the second-best algorithm, FA. For the test function F5, the proposed algorithm is the second-best algorithm, and it is the third-best for F7. Finally, the results reveal that the performances of HGSA are better than those of the other algorithms.


Table 5. Comparison of HGSA with some optimization algorithms for the test functions F8–F13.
            FA           GSA          MVO          BA           HS           MFO          HGSA
F8   mean   −6.354e+03   −1.468e+03   −2.982e+03   −7.044e+03   −3.922e+03   −3.170e+03   −2.418e+03
     std     5.667e+02    2.131e+02    3.180e+02    4.647e+03    1.284e+02    3.749e+02    3.165e+02
F9   mean    5.797e+01    6.003e+00    1.141e+01    3.656e+02    1.017e+01    2.635e+01    7.628e+00
     std     1.748e+01    2.619e+00    4.354e+00    3.548e+01    2.479e+00    1.726e+01    3.595e+00
F10  mean    3.154e−09    3.213e−09    1.566e−01    1.976e+01    5.859e+00    3.850e−02    5.507e−15
     std     2.112e−10    7.535e−10    4.023e−01    8.010e−01    1.489e+00    2.109e−01    1.656e−15
F11  mean    4.270e−03    2.404e−02    3.187e−01    3.117e+02    2.693e+00    1.423e−01    1.477e−02
     std     6.028e−03    4.311e−02    1.398e−01    9.425e+01    8.829e−01    7.944e−02    1.005e−02
F12  mean    9.683e−02    1.220e−19    1.511e−04    1.404e+08    1.922e+01    1.037e−01    4.712e−32
     std     1.964e−01    5.993e−20    1.105e−04    8.849e+07    4.450e+01    3.092e−01    1.670e−47
F13  mean    7.850e−18    6.083e−19    2.950e−03    3.510e+08    4.519e+03    1.225e−01    1.350e−32
     std     1.086e−18    2.675e−19    4.409e−03    1.655e+08    1.442e+04    6.563e−01    5.567e−48
Friedman value   18   18   26   36   30   27   13
Friedman rank     2    2    3    6    5    4    1

Table 6. Comparison of HGSA with some optimization algorithms for the test functions F14–F23.
            FA           GSA          MVO          BA           HS           MFO          HGSA
F14  mean    1.162e+00    3.646e+00    9.980e−01    1.572e+01    3.704e+00    2.707e+00    2.217e+00
     std     9.002e−01    2.771e+00    1.311e−11    8.279e+00    3.129e+00    2.441e+00    1.768e+00
F15  mean    7.383e−04    2.372e−03    5.935e−03    4.756e−02    7.268e−03    1.757e−03    7.380e−04
     std     1.670e−04    1.607e−03    8.851e−03    4.458e−02    8.476e−03    3.536e−03    2.317e−04
F16  mean   −1.032e+00   −1.032e+00   −1.032e+00   −1.032e+00   −1.032e+00   −1.032e+00   −1.032e+00
     std     4.879e−16    5.376e−16    1.733e−07    4.659e−01    7.476e−05    6.775e−16    5.684e−16
F17  mean    –            3.979e−01    3.979e−01    5.764e−01    3.980e−01    3.979e−01    3.979e−01
     std     –            0.000e+00    1.453e−07    1.214e−03    3.979e−01    0.000e+00    0.000e+00
F18  mean    3.000e+00    3.000e+00    3.000e+00    1.511e+01    1.226e+01    3.000e+00    3.000e+00
     std     2.177e−15    3.475e−15    9.700e−07    2.037e+01    1.262e+01    2.202e−15    2.960e−15
F19  mean   −3.863e+00   −3.863e+00   −3.863e+00   −3.513e+00   −3.863e+00   −3.863e+00   −3.863e+00
     std     2.257e−15    2.372e−15    4.633e−07    3.450e−01    8.429e−04    1.439e−03    2.452e−15
F20  mean   −3.263e+00   −3.322e+00   −3.278e+00   −1.928e+00   −3.280e+00   −3.233e+00   −3.274e+00
     std     6.046e−02    1.473e−15    5.879e−02    5.604e−01    4.707e−02    8.282e−02    5.924e−02
F21  mean   −8.710e+00   −6.143e+00   −7.115e+00   −7.949e−01   −4.504e+00   −5.972e+00   −8.733e+00
     std     2.535e+00    3.622e+00    2.769e+00    5.465e−01    3.182e+00    3.173e+00    2.684e+00
F22  mean   −9.194e+00   −9.993e+00   −8.582e+00   −1.014e+00   −5.670e+00   −7.453e+00   −1.040e+01
     std     2.758e+00    1.563e+00    2.896e+00    5.970e−01    3.639e+00    3.486e+00    1.043e−15
F23  mean   −9.911e+00   −1.019e+01   −9.109e+00   −1.239e+00   −5.218e+00   −7.520e+00   −9.924e+00
     std     1.919e+00    1.405e+00    2.691e+00    6.104e−01    3.561e+00    3.773e+00    1.902e+00
Friedman value   26   21   23   50   41   29   16
Friedman rank     4    2    3    7    6    5    1

4.2. Comparison of HGSA for high dimension multimodal functions

The multimodal functions with high dimension provide a tough path for each optimization algorithm because they contain large numbers of local optima. These functions are shown in Table 2. The comparison results between HGSA and the other algorithms are shown in Table 5. Except for the benchmark functions F8, F9 and F11, the proposed algorithm achieved the best results. In particular, on the multimodal functions F12 and F13, HGSA improves on the second-best competitor algorithm (GSA) by 13 orders of magnitude. Moreover, HGSA is the second-best algorithm for the functions F9 and F11. Therefore, the proposed algorithm globally achieves the best results with respect to the competitor algorithms.

4.3. Comparison of HGSA for low dimension multimodal functions

The multimodal functions with low dimensions are suitable to assess the algorithms' robustness. They are referred to as F14–F23 in Table 3. The simulation results are shown in Table 6. Note that HGSA is an efficient optimizer for F15–F19, F21 and F22 and very competitive in the other test cases. As an example, on benchmark function F23, HGSA is the second-best algorithm.

Table 7. Overall Friedman rank statistical results.
                 FA   GSA   MVO   BA    HS    MFO   HGSA
Friedman value   65   57    74    134   112   83    39
Friedman rank    3    2     4     7     6     5     1

4.4. Comparison of HGSA on CEC-06 2019 benchmark functions

In order to assess HGSA on recent benchmark functions, the proposed algorithm is tested on the CEC-06 2019 single objective optimization benchmark functions, referred to as CEC01, CEC02, ..., CEC10. Table 8 shows the best solutions obtained by the competitor algorithms and HGSA in terms of average and standard deviation. Note that the proposed algorithm is the best algorithm for the test functions CEC03, CEC05, CEC06 and CEC07. Moreover, HGSA is the second-best optimizer on the benchmark functions CEC08, CEC09 and CEC10. Competitive results are achieved by HGSA on the test functions CEC01, CEC02 and CEC04. Finally, the results of Table 8 confirm the effectiveness of the proposed algorithm.


Table 8. Results obtained by FA, GSA, MVO, BA, HS, MFO and HGSA on the CEC-06 2019 benchmark functions [59].
              FA           GSA          MVO          BA           HS           MFO          HGSA
CEC01  mean    4.303e+07    3.783e+12    2.407e+09    3.456e+12    1.801e+11    2.082e+10    1.666e+11
       std     3.426e+07    2.386e+12    1.930e+09    3.578e+12    1.519e+11    4.532e+10    1.927e+11
CEC02  mean    1.734e+01    1.608e+04    1.802e+01    1.370e+04    1.469e+03    1.734e+01    2.888e+01
       std     6.382e−09    4.037e+03    3.497e−01    3.950e+03    8.002e+02    7.227e−15    6.193e+01
CEC03  mean    1.270e+01    1.270e+01    1.270e+01    1.270e+01    1.270e+01    1.270e+01    1.270e+01
       std     3.761e−15    3.613e−15    1.370e−09    1.358e−03    2.398e−05    2.832e−05    3.613e−15
CEC04  mean    5.671e+00    7.330e+00    2.723e+01    2.371e+04    6.641e+02    4.212e+02    7.529e+00
       std     2.204e+00    3.481e+00    9.889e+00    8.479e+03    3.712e+02    7.114e+02    3.580e+00
CEC05  mean    1.034e+00    1.018e+00    1.274e+00    6.279e+00    1.920e+00    1.221e+00    1.017e+00
       std     1.980e−02    1.146e−02    1.246e−01    1.577e+00    1.657e−01    2.493e−01    1.634e−02
CEC06  mean    6.860e+00    1.002e+00    6.947e+00    1.313e+01    1.249e+01    6.122e+00    1.002e+00
       std     3.178e+00    3.015e−05    1.345e+00    1.184e+00    7.925e−01    1.914e+00    1.055e−02
CEC07  mean    1.955e+02    2.434e+02    2.393e+02    1.591e+03    9.328e+02    3.475e+02    1.583e+02
       std     1.453e+02    9.519e+01    1.901e+02    3.815e+02    2.899e+02    2.226e+02    7.366e+01
CEC08  mean    2.363e+00    5.308e+00    5.129e+00    7.663e+00    6.740e+00    5.298e+00    5.073e+00
       std     7.911e−01    4.715e−01    1.085e+00    4.638e−01    4.208e−01    7.781e−01    5.920e−01
CEC09  mean    2.338e+00    3.039e+00    2.401e+00    4.354e+03    7.585e+01    2.745e+00    2.382e+00
       std     8.359e−05    3.229e−01    3.410e−02    1.545e+03    5.868e+01    2.399e−01    2.905e−02
CEC10  mean    1.553e+01    1.864e+01    2.005e+01    2.089e+01    2.062e+01    2.014e+01    1.799e+01
       std     8.714e+00    5.068e+00    4.061e−02    1.704e−01    1.624e−01    1.608e−01    6.098e+00
Friedman value   15   35   31   60   51   34   21
Friedman rank     1    5    3    7    6    4    2

4.5. Statistical testing

The comparison of algorithms through mean and standard deviation alone does not guarantee the efficiency and superiority of the proposed algorithm. Generally, comparing algorithms over 30 independent runs does not compare each of the runs, because it is still possible that the superiority occurs by chance, despite its low probability over 30 runs. Therefore, to compare the results of each run and decide on the significance of the results, the Friedman rank test [60,61] is performed. Observing the results for each class of test functions in Tables 4, 5 and 6, note that HGSA is at the first position in the Friedman rank test. The overall results on all the benchmark functions (see Table 7) show that HGSA is significantly superior to the other algorithms. The Friedman rank test is also applied to the CEC-06 2019 benchmark functions (see Table 8). There, the results show that the proposed algorithm is the second-best algorithm.
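As a sketch of how such a ranking test can be run, per-function results of each algorithm can be passed to a Friedman test, here via SciPy; the three sample vectors below reuse the F1–F3 mean values of Table 4 purely as placeholders, and a full comparison would include all algorithms and functions.

```python
from scipy.stats import friedmanchisquare

# One measurement per benchmark function (placeholder values from Table 4, F1-F3).
fa   = [1.839e-16, 5.837e-09, 8.045e-09]
gsa  = [5.749e-18, 6.213e-09, 4.151e-05]
hgsa = [4.673e-37, 3.350e-18, 5.002e-32]

stat, p = friedmanchisquare(fa, gsa, hgsa)
print(stat, p)  # a low p-value indicates that at least one algorithm ranks differently
```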

4.6. Convergence analysis

In order to understand the behavior of HGSA, a convergence analysis is needed. Figs. 2–7 show the convergence curves on the test functions F1, F3, F4, F10, F12 and F13, respectively. Note that, in the initial stage of the optimization process, the curve changes rapidly. Moreover, the HGSA curve slope for t ≤ 500 is less than the curve slope for t > 500 (recall that t represents the tth iteration). This behavior means that HGSA has good exploration when t ≤ 500 and suitable exploitation for t > 500. The optimal value is achieved by HGSA within the last 100 iterations. These convergence behaviors reveal that HGSA provides a good balance between exploration and exploitation. Observing Figs. 2–7, it follows that the proposed algorithm converges more quickly than the other algorithms and provides the best minimum. The convergence curves of the algorithms on the benchmark functions CEC06 and CEC07 are shown in Figs. 8 and 9 respectively. For the test function CEC06, both GSA and HGSA provide the same optimal solution. However, HGSA converges more quickly to this minimum than GSA (see Fig. 8). In particular, HGSA improves on GSA over the iteration range [415, 660]. Such convergence behavior means that a good trade-off between exploration and exploitation is achieved. The CEC07 convergence curve shows that HGSA converges to the optimal value during the early stage of optimization. This means that the proposed algorithm provides excellent exploration. In view of the above, the convergence analysis confirms the superiority of HGSA with respect to the comparison algorithms.

Fig. 2. Best-so-far solution trend for function F1.
Fig. 3. Best-so-far solution trend for function F3.
Fig. 4. Best-so-far solution trend for function F4.

5. HGSA for real-life problems

In order to further show the efficiency of the proposed algorithm, HGSA is applied to a real-life problem. In particular, a constrained non-linear engineering design problem is chosen: the three-bar truss design problem. In this problem, the volume of the truss structure has to be minimized subject to stress constraints. The design problem is characterized by two variables and three constraints. Let x1, x2 ∈ [0, 1] be the parameters to optimize, with x1 = A1 and x2 = A2. The goal is to minimize the function

f(x) = (2√2 x1 + x2) · l

subject to the constraints

g1(x) = (√2 x1 + x2) / (√2 x1² + 2 x1 x2) · P − σ ≤ 0
g2(x) = x2 / (√2 x1² + 2 x1 x2) · P − σ ≤ 0
g3(x) = 1 / (√2 x2 + x1) · P − σ ≤ 0

where l = 100 cm, P = 2 kN/cm² and σ = 2 kN/cm². HGSA is compared (as for the benchmark functions F1–F23) with FA, GSA, MVO, BA, HS and MFO. The algorithm parameters are: number of agents equal to 30 and number of iterations equal to 1000. The comparison of the best solutions found by HGSA and the competitor algorithms is shown in Table 9. Among the algorithms, the proposed HGSA gives the optimal solution at A1 = 0.788649, A2 = 0.408235 with a corresponding fitness value of 263.891491. The results show that both HGSA and FA are the best algorithms for the three-bar truss design problem. The statistical comparison of

HGSA with the other algorithms is shown in Table 10. Fig. 10 shows the convergence curve of the best solution obtained by HGSA. Note that HGSA converges quickly to the optimal value in the early stage.

Table 9. Comparison results for the three-bar truss problem.
        A1         A2         Optimum cost
HGSA    0.788649   0.408235   263.891491
MFO     0.788530   0.408573   263.891502
HS      0.788267   0.409290   263.891995
BA      0.793637   0.394306   263.909464
MVO     0.788699   0.408096   263.891495
GSA     0.788654   0.408230   263.891529
FA      0.788649   0.408235   263.891491
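A minimal Python check of the reported solution against the model above (the helper names truss_volume and truss_constraints are hypothetical):

```python
import math

def truss_volume(x1, x2, l=100.0):
    """Objective f(x) = (2*sqrt(2)*x1 + x2) * l."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * l

def truss_constraints(x1, x2, P=2.0, sigma=2.0):
    """Stress constraints g1, g2, g3, all required to be <= 0."""
    s2 = math.sqrt(2.0)
    den = s2 * x1 ** 2 + 2.0 * x1 * x2
    g1 = (s2 * x1 + x2) / den * P - sigma
    g2 = x2 / den * P - sigma
    g3 = 1.0 / (s2 * x2 + x1) * P - sigma
    return g1, g2, g3

print(truss_volume(0.788649, 0.408235))       # objective value at HGSA's solution (about 263.89)
print(truss_constraints(0.788649, 0.408235))  # all three values are <= 0, with g1 nearly active
```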


Fig. 5. Best-so-far solution trend for function F10.
Fig. 6. Best-so-far solution trend for function F12.
Fig. 7. Best-so-far solution trend for function F13.

Table 10. Statistical results obtained by HGSA and other algorithms for the three-bar truss design problem.
        min          mean         worst        std
HGSA    263.891491   263.892221   263.898746   0.001778
MFO     263.891502   263.931887   264.151920   0.070307
HS      263.891995   264.424404   265.635091   0.506609
BA      263.909464   267.960654   279.006503   4.570163
MVO     263.891495   263.891720   263.893792   0.000415
GSA     263.891529   263.942693   264.150253   0.074134
FA      263.891491   263.891491   263.891491   0.000000

6. Conclusions

A hyperbolic-functions-based Gravitational Search Algorithm able to find an optimal trade-off between exploration and exploitation has been designed. For this task, three new contributions have been introduced. The first is the use of hyperbolic functions to define the acceleration coefficients. The dynamic regulation of the gravitational constant coefficient by means of an increasing hyperbolic function is the second novelty. The third contribution is the new definition of the parameter Kbest as a decreasing hyperbolic function. The proposed algorithm has been compared with other optimization algorithms on 23 well-known benchmark functions, the CEC-06 2019 test functions and the three-bar truss design problem. The results show that HGSA performs better than the other algorithms in terms of global optimum search and convergence performance. Future research will focus on the design of a multi-objective gravitational search algorithm for engineering problems.


Fig. 8. Best-so-far solution trend for function CEC06.

Fig. 9. Best-so-far solution trend for function CEC07.

Fig. 10. Convergence analysis of HGSA for three-bar truss design problem.

CRediT authorship contribution statement

Danilo Pelusi: Investigation, Supervision, Writing - review & editing. Raffaele Mascella: Data curation. Luca Tallini: Formal analysis. Janmenjoy Nayak: Validation. Bighnaraj Naik: Validation. Yong Deng: Validation.

Acknowledgment

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

References


IEEE Trans. Evol. Comput 1 (1997) 67–82. [2] E. Rashedi, H. Nezamabadi-pour, S. Saryazdi, GSA: A gravitational search algorithm, Inform. Sci. 179 (2009) 2232–2248. [3] Ling-Ling Li, Guo-Qian Lin, Ming-Lang Tseng, Kimhua Tan, Ming K. Lim, A maximum power point tracking method for PV system with improved gravitational search algorithm, Appl. Soft Comput. 65 (2018) 333–348. [4] A. Mahanipour, H. Nezamabadi-pour, A multiple feature construction method based on gravitational search algorithm, Expert Syst. Appl. 127 (2019) 199–209. [5] D. Chaudhary, B. Kumar, Cost optimized hybrid genetic-gravitational search algorithm for load scheduling in cloud computing, Appl. Soft Comput. J. 83 (2019) 105627. [6] T. Biswas, P. Kuil, A.K. Ray, M. Sarkara, Gravitational search algorithm based novel workflow scheduling for heterogeneous computing systems, Simul. Model. Pract. Theory 96 (2019) 101932. [7] M. Khatibinia, H. Yazdani, Accelerated multi-gravitational search algorithm for size optimization of truss structures, Swarm Evol. Comput. 38 (2018) 109–119. [8] O. Ozkaraca, A. Kecebas, Performance analysis and optimization for maximum exergy efficiency of a geothermal power plant using gravitational search algorithm, Energy Convers. Manage. 185 (2019) 155–168. [9] W. Niu, Z. Feng, M. Zeng, B. Feng, Y. Min, C. Cheng, J. Zhou, Forecasting reservoir monthly runoff via ensemble empirical mode decomposition and extreme learning machine optimized by an improved gravitational search algorithm, Appl. Soft Comput. J. 82 (2019) 105589. [10] S. Derafshi Beigv, H. Abdi, M. La Scala, Combined heat and power economic dispatch problem using gravitational search algorithm, Electr. Power Syst. Res. 133 (2016) 160–172. [11] S. Ozyon, C. Yasar, Gravitational search algorithm applied to fixed head hydrothermal power system with transmission line security constraints, Energy 155 (2018) 392–407. [12] R. Moeini, M. Soltani-nezhad, M. Daei, Constrained gravitational search algorithm for large scale reservoir operation optimization problem, Eng. Appl. Artif. Intell. 62 (2017) 222–233. [13] V. Kumar, J.K. Chhabra, D. Kumar, Automatic cluster evolution using gravitational search algorithm and its application on image segmentation, Eng. Appl. Artif. Intell. 29 (2014) 93–103. [14] T. Chakraborti, K.D. Sharma, A. Chatterjee, A novel local extrema based gravitational search algorithm and its application in face recognition using one training image per class, Eng. Appl. Artif. Intell. 34 (2014) 13–22. [15] K. Kang, C. Baeb, Y.Y. Yeung H.W.F, A hybrid gravitational search algorithm with swarm intelligence and deep convolutional feature for object tracking optimization, Appl. Soft Comput. 66 (2018) 319–329. [16] S. Mirjalili, S.Z.M. Hashim, A new hybrid PSOGSA algorithm for function optimization, in: Proc. of International Conference on Computer and Information Application, 2010, pp. 374–377. [17] J. Kennedy, R.C. Eberhart, Particle swarm optimization, in: Proceedings of IEEE International Conference on Neural Networks, Vol. 4, 1995, pp. 1942-1948. [18] S. Mirjalili, A. Lewis, Adaptive gbest-guided gravitational search algorithm, Neural Comput. Appl. 25 (7–8) (2014) 1569–1584. [19] S. Darzi, T.S. Kiong, M.T. Islam, H.R. Soleymanpour, S. Kibria, A memorybased gravitational search algorithm for enhancing minimum variance distortionless response beamforming, Appl. Soft Comput. 47 (2016) 103–118. [20] V.K. Bohat, K.V. 
Arya, An effective gbest-guided gravitational search algorithm for real-parameter optimization and its application in training of feedforward neural networks, Knowl.-Based Syst. 143 (2018) 192–207. [21] J. Xiao, Y. Niu, P. Chen, S.C.H. Leung, F. Xing, An improved gravitational search algorithm for green partner selection in virtual enterprises, Neurocomputing 217 (2016) 103–109. [22] X.H. Han, L. Quan, X.Y. Xiong, M. Almeter, J. Xiang, Y. Lan, A novel data clustering algorithm based on modified gravitational search algorithm, Eng. Appl. Artif. Intell. 61 (2017) 1–7. [23] B. Yin, Z. Guo, Z. Liang, X. Yue, Improved gravitational search algorithm with crossover, Comput. Electr. Eng. 66 (2018) 505–516. [24] G. Sun, A. Zhang, X. Jia, X. Li, S. Ji, Z. Wang, DMMOGSA: Diversity-enhanced and memory-based multi-objective gravitational search algorithm, Inform. Sci. 363 (2016) 52–71. [25] C. Li, J. Zhou, J. Xiao, H. Xiao, Parameters identification of chaotic system by chaotic gravitational search algorithm, Chaos Solitons Fractals 45 (2012) 539–547. [26] F. Zhao, F. Xue, Y. Zhang, W. Ma, C. Zhang, H. Song, A hybrid algorithm based on self-adaptive gravitational search algorithm and differential evolution, Expert Syst. Appl. 113 (2018) 515–530. [27] R. Storn, K. Price, Differential evolution - A simple and efficient heuristic for global optimization over continuous spaces, J. Global Optim. 11 (1997) 341–359. [28] G. Sun, P. Ma, J. Ren, A. Zhang, X. Jia, A stability constrained adaptive alpha for gravitational search algorithm, Knowl.-Based Syst. 139 (2018) 200–213.

[29] P.K. Das, H.S. Behera, B.K. Panigrahi, A hybridization of an improved particle swarm optimization and gravitational search algorithm for multi-robot path planning, Swarm Evol. Comput. 28 (2016) 14–28.
[30] N. Zhang, C. Li, R. Li, X. Lai, Y. Zhang, A mixed-strategy based gravitational search algorithm for parameter identification of hydraulic turbine governing system, Knowl.-Based Syst. 109 (2016) 218–237.
[31] X.H. Han, L. Quan, X.Y. Xiong, B. Wu, Facing the classification of binary problems with a hybrid system based on quantum-inspired binary gravitational search algorithm and K-NN method, Eng. Appl. Artif. Intell. 26 (2013) 2424–2430.
[32] C. Li, H. Li, P. Kou, Piecewise function based gravitational search algorithm and its application on parameter identification of AVR system, Neurocomputing 124 (2014) 139–148.
[33] J. Ji, S. Gao, S. Wang, Y. Tang, H. Yu, Y. Todo, Self-adaptive gravitational search algorithm with a modified chaotic local search, IEEE Access 5 (2017) 17881–17895.
[34] S. Jiang, Y. Wang, Z. Ji, Convergence analysis and performance of an improved gravitational search algorithm, Appl. Soft Comput. 24 (2014) 363–384.
[35] D. Shen, T. Jiang, W. Chen, Q. Shi, S. Gao, Improved chaotic gravitational search algorithms for global optimization, in: Proc. of IEEE Congress on Evolutionary Computation, 2015, pp. 1220–1226.
[36] S. Mirjalili, A.H. Gandomi, Chaotic gravitational constants for the gravitational search algorithm, Appl. Soft Comput. 53 (2017) 407–419.
[37] Z. Su, H. Wang, A novel robust hybrid gravitational search algorithm for reusable launch vehicle approach and landing trajectory optimization, Neurocomputing 162 (2015) 116–127.
[38] M. Wang, Y. Wan, Z. Ye, X. Gao, X. Lai, A band selection method for airborne hyperspectral image based on chaotic binary coded gravitational search algorithm, Neurocomputing 273 (2018) 57–67.
[39] X.H. Han, X.M. Chang, L. Quan, X.Y. Xiong, J.X. Li, Z.X. Zhang, Y. Liu, Feature subset selection by gravitational search algorithm optimization, Inform. Sci. 281 (2014) 128–146.
[40] N. Gouthamkumar, V. Sharma, R. Naresh, Disruption based gravitational search algorithm for short term hydrothermal scheduling, Expert Syst. Appl. 42 (2015) 7000–7011.
[41] C. Li, L. Chang, Z. Huang, Y. Liu, N. Zhang, Parameter identification of a nonlinear model of hydraulic turbine governing system with an elastic water hammer based on a modified gravitational search algorithm, Eng. Appl. Artif. Intell. 50 (2016) 177–191.
[42] M. Minarova, D. Paternain, A. Jurio, J. Ruiz-Aranguren, Z. Takac, H. Bustince, Modifying the gravitational search algorithm: A functional study, Inform. Sci. 430–431 (2018) 87–103.
[43] C. Li, N. Zhang, X. Lai, J. Zhou, Y. Xu, Design of a fractional-order PID controller for a pumped storage unit using a gravitational search algorithm based on the Cauchy and Gaussian mutation, Inform. Sci. 396 (2017) 162–181.
[44] G. Sun, A. Zhang, Z. Wang, Y. Yao, J. Ma, G.D. Couples, Locally informed gravitational search algorithm, Knowl.-Based Syst. 104 (2016) 134–144.
[45] A. Zhang, G. Sun, J. Ren, X. Li, Z. Wang, X. Jia, A dynamic neighborhood learning-based gravitational search algorithm, IEEE Trans. Cybern. 48 (1) (2018) 436–447.
[46] P. Haghbayan, H. Nezamabadi-pour, S. Kamyab, A niche GSA method with nearest neighbor scheme for multimodal optimization, Swarm Evol. Comput. 35 (2017) 78–92.
[47] R.K. Ursem, Multinational evolutionary algorithms, in: Proceedings of the Congress on Evolutionary Computation (CEC), Vol. 3, 1999, pp. 1633–1640.
[48] S. Yazdani, H. Nezamabadi-pour, S. Kamyab, A gravitational search algorithm for multimodal optimization, Swarm Evol. Comput. 14 (2014) 1–14.
[49] H. Mittal, R. Pal, A. Kulhari, M. Saraswat, Chaotic Kbest gravitational search algorithm (CKGSA), in: Proc. of International Conference on Contemporary Computing (IC3), 2016.
[50] H. Mittal, M. Saraswat, An optimum multi-level image thresholding segmentation using non-local means 2D histogram and exponential Kbest gravitational search algorithm, Eng. Appl. Artif. Intell. 71 (2018) 226–235.
[51] D. Pelusi, R. Mascella, L. Tallini, J. Nayak, B. Naik, A. Abraham, Neural network and fuzzy system for the tuning of gravitational search algorithm parameters, Expert Syst. Appl. 102 (2018) 234–244.
[52] G. Dhiman, V. Kumar, Emperor penguin optimizer: A bio-inspired algorithm for engineering problems, Knowl.-Based Syst. 159 (2018) 20–50.
[53] G. Dhiman, V. Kumar, Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems, Knowl.-Based Syst. 165 (2019) 169–196.
[54] X.S. Yang, Firefly algorithm, stochastic test functions and design optimization, Int. J. Bio-Inspired Comput. 2 (2) (2010) 78–84.
[55] S. Mirjalili, S.M. Mirjalili, A. Hatamlou, Multi-verse optimizer: a nature-inspired algorithm for global optimization, Neural Comput. Appl. 27 (2) (2016) 495–513.
[56] X.S. Yang, A new metaheuristic bat-inspired algorithm, in: J.R. Gonzalez, et al. (Eds.), Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), Springer, Berlin, 2010, pp. 65–74.

[57] Z.W. Geem, J.H. Kim, G.V. Loganathan, A new heuristic optimization algorithm: harmony search, Simulation 76 (2) (2001) 60–68.
[58] G. Dhiman, V. Kumar, Multi-objective spotted hyena optimizer: A multiobjective optimization algorithm for engineering problems, Knowl.-Based Syst. 150 (2018) 175–197.
[59] K.V. Price, N.H. Awad, M.Z. Ali, P.N. Suganthan, Problem Definitions and Evaluation Criteria for the 100-Digit Challenge Special Session and Competition on Single Objective Numerical Optimization, Technical Report, Nanyang Technological University, Singapore, 2018.

[60] R.V. Hogg, E.A. Tanis, Probability and Statistical Inference, seventh ed., Prentice Hall, 2006.
[61] W.W. Daniel, Friedman two-way analysis of variance by ranks, in: Applied Nonparametric Statistics, PWS-Kent, Boston, 1990.
