Get 20M+ Full-Text Papers For Less Than $1.50/day. Start a 14-Day Trial for You or Your Team.

Learn More →

A Novel Hybrid Bat Algorithm with Harmony Search for Global Numerical Optimization

A Novel Hybrid Bat Algorithm with Harmony Search for Global Numerical Optimization Hindawi Publishing Corporation Journal of Applied Mathematics Volume 2013, Article ID 696491, 21 pages http://dx.doi.org/10.1155/2013/696491 Research Article A Novel Hybrid Bat Algorithm with Harmony Search for Global Numerical Optimization 1,2 1 Gaige Wang and Lihong Guo Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China Graduate School of Chinese Academy of Sciences, Beijing 100039, China Correspondence should be addressed to Lihong Guo; guolh@ciomp.ac.cn Received 27 June 2012; Revised 27 November 2012; Accepted 15 December 2012 Academic Editor: Marco H. Terra Copyright © 2013 G. Wang and L. Guo. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. A novel robust hybrid metaheuristic optimization approach, which can be considered as an improvement of the recently developed bat algorithm, is proposed to solve global numerical optimization problems. The improvement includes the addition of pitch adjustment operation in HS serving as a mutation operator during the process of the bat updating with the aim of speeding up convergence, thus making the approach more feasible for a wider range of real-world applications. eTh detailed implementation procedure for this improved metaheuristic method is also described. Fourteen standard benchmark functions are applied to verify the eeff cts of these improvements, and it is demonstrated that, in most situations, the performance of this hybrid metaheuristic method (HS/BA) is superior to, or at least highly competitive with, the standard BA and other population-based optimization methods, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. eTh eeff ct of the HS/BA parameters is also analyzed. 1. Introduction for making balance between randomization (global search) and local search [2]. The process of optimization is searching a vector in a function Inspired by nature, these strong metaheuristic algorithms that produces an optimal solution. All of feasible values are applied to solve NP-hard problems, such as UCAV are available solutions, and the extreme value is optimal path planning [11–14], test-sheet composition [15], WSN solution. In general, optimization algorithms are applied to deployment [16] and water, and geotechnical and transport solve these optimization problems. A simple classification engineering [17]. Optimization algorithms cover all searching way for optimization algorithms is considering the nature for extreme value problems. These kinds of metaheuristic of the algorithms, and optimization algorithms can be algorithms carry out on a population of solutions and always divided into two main categories: deterministic algorithms find best solutions. During the 1950s and 1960s, computer sci- and stochastic algorithms. Deterministic algorithms using entists studied the possibility of conceptualizing evolution as gradients such as hill climbing have a rigorous move and an optimization tool, and this generated a subset of gradient will generate the same set of solutions if the iterations free approaches named genetic algorithms (GA) [18]. Since commence with the same initial starting point. On the other then, many other nature-inspired metaheuristic algorithms hand, stochastic algorithms without using gradients oen ft have emerged, such as differential evolution (DE) [ 10, 19, 20], generate different solutions even with the same initial value. particle swarm optimization (PSO) [21–23], biogeography- However, generally speaking, the na fi l values, though slightly based optimization (BBO) [24, 25], harmony search (HS) different, will converge to the same optimal solutions within a [26, 27], and more recently, the bat algorithm (BA) [28, 29] given accuracy [1]. Generally, stochastic algorithms have two that is inspired by echolocation behavior of bats in nature. types: heuristic and metaheuristic. Recently, nature-inspired metaheuristic algorithms perform powerfully and efficiently Firstly presented by Yang in 2010, the bat-inspired algo- in solving modern nonlinear numerical global optimization rithm or bat algorithm (BA) [30, 31]isametaheuristic problems. To some extent, all metaheuristic algorithms strive search algorithm, inspired by the echolocation behavior of 2 Journal of Applied Mathematics bats with varying pulse rates of emission and loudness. eTh 2. Preliminary primary purpose for a bat’s echolocation is to act as a signal To begin with, in this section we will provide a brief back- system to sense distance and hunt for food/prey. A com- ground on the optimization problem, harmony search (HS), prehensive review about swarm intelligence involving BA is and bat algorithm (BA). performed by Parpinelli and Lopes [32]. Furthermore, Tsai et al. proposed an evolving bat algorithm (EBA) to improve the 2.1. Optimization Problem. In computer science, mathemat- performance of BA with better efficiency [ 33]. ics, management science, and control theory, optimization FirstlyproposedbyGeemetal. in 2001,harmony search (also called mathematical optimization or mathematical pro- (HS) [26] is a new metaheuristic approach for minimizing gramming) means the selection of an optimal solution from possibly nondifferentiable and nonlinear functions in con- some set of feasible alternatives. In general, an optimization tinuous space. HS is inspired by the behavior of a musician’s problem includes minimizing or maximizing a function by improvisation process, where every musician attempts to systematically selecting input values from a given feasible improve its tune so as to create optimal harmony in a real- set and calculating the value of the function. More gener- world musical performance processes. HS algorithm origi- ally, optimization consists of ndin fi g the optimal values of nates in the similarity between engineering optimization and some objective function within a given domain, including a music improvisation, and the engineers search for a global number of different types of domains and different types of optimal solution as determined by an objective function, objective functions [34]. just like the musicians strive for ndin fi g aesthetic harmony A global optimization problem can be described as as determined by aesthetician. In music improvisation, each follows. musician chooses any pitch within the feasible range, together producing one harmony vector. If all the pitches produce a Given: a function 𝑓 :𝐷→𝑅 from some set 𝐷 to the good solution, that experience is reserved in each variable’s real numbers. memory, and the possibility of producing a better solution Sought: a parameter 𝑥 in 𝐷 such that 𝑓(𝑥 )≤𝑓(𝑥) is also increased next time. Furthermore, this new approach 0 0 for all 𝑥 in 𝐷 (“minimization”) or such that 𝑓(𝑥 )≥ requires few control variables, which makes HS easy to 0 𝑓(𝑥) for all 𝑥 in 𝐷 (“maximization”). implement, more robust, and very appropriate for parallel computation. Such a formulation is named a numerical optimization BA is a powerful algorithm in exploitation (i.e., local problem. Many theoretical and practical problems may be search), but at times it may trap into some local optima, so modeled in this general framework. In general, 𝐷 is some that it cannot perform global search well. For bat algorithm, 𝑛 subset of the Euclidean space 𝑅 , oeft n specified by a group the search depends completely on random walks, so a fast of constraints, equalities, or inequalities that the components convergence cannot be guaranteed. Firstly presented here, in of 𝐷 have to satisfy. The domain 𝐷 of 𝑓 is named the search order to increase the diversity of the population for BA so space, while the elements of 𝐷 are named feasible solutions as to avoid trapping into local optima, a main improvement or candidate solutions. In general, the function 𝑓 is called of adding pitch adjustment operation in HS serving as a an objective function, cost function (minimization), or utility mutation operator is made to the BA with the aim of speeding function (maximization). An optimal solution is an available up convergence, thus making the approach more feasible solution that is the extreme of (minimum or maximum) the for a wider range of practical applications while preserving objective function. the attractive characteristics of the basic BA. aTh t is to Conventionally, the standard formulation of an optimiza- say, we combine two approaches to propose a new hybrid tion problem is stated in terms of minimization. In general, metaheuristic algorithm according to the principle of HS and unless both the feasible region and the objective function are BA, and then this improved BA method is used to search convex in a minimization problem, there may be more than theoptimal objectivefunctionvalue.Theproposedapproach ∗ one local minima. A local minimum 𝑥 is defined as a point is evaluated on fourteen standard benchmark functions that for which the following expression have ever been applied to verify optimization algorithms on continuous optimization problems. Experimental results 𝑓 (𝑥 )≤𝑓 (𝑥 ) (1) show that the HS/BA performs more ecffi iently and accu- rately than basic BA, ACO, BBO, DE, ES, GA, HS, PSO, and holds. More details about the optimization problem can be SGA. found in [35]. eTh structureofthispaper is organizedasfollows: The branch of numerical analysis and applied mathe- Section 2 describes global numerical optimization problem, matics that investigates deterministic algorithms that can the HS algorithm, and basic BA in brief. Our proposed guarantee convergence in limited time to the true optimal approach HS/BA is presented in detail in Section 3.Subse- solution of a nonconvex problem is called global numer- quently, our method is evaluated through fourteen bench- ical optimization problems. A variety of algorithms have mark functions in Section 4. In addition, the HS/BA is been proposed to solve nonconvex problems. Among them, also compared with BA,ACO,BBO,DE, ES,GA, HS,PSO heuristics algorithms can evaluate approximate solutions to and SGA. Finally, Section 5 consists of the conclusion and some optimization problems, as described in introduction proposals for future work. [36]. Journal of Applied Mathematics 3 2.2. Harmony Search. Firstly developed by Geem et al. in localrange of pitchadjustment. Actually,wecan seethatthe 2001, harmony search (HS) [26]isarelatively newmeta- pitch adjustment (3)isarandom walk. heuristic optimization algorithm, and it is based on natural Pitch adjustment is similar to the mutation operator in musical performance processes that occur when a musician GA. Also, we must appropriately set the parameter PAR to searches for an optimal state of harmony. eTh optimization control the degree of the adjustment. If PAR nears 1 (too operators of HS algorithm are speciefi d as the harmony high), then the solution is always changing and the algorithm memory (HM), which keeps the solution vectors which are may not converge at all. If it is very close to 0 (too low), then allwithinthe search space, as shownin(2); the harmony there is very little change and the algorithm may premature. memory size HMS, which represents the number of solution eTh refore, we use PAR = 0.1 ∼ 0.5 in most simulations as vectors kept in the HM; the harmony memory consideration usual. rate (HMCR); the pitch adjustment rate (PAR); the pitch For the purpose of increasing the diversity of the solu- adjustment bandwidth (bw) tions, the randomization is needed in the third component. Although adjusting pitch has a similar role, it is conn fi ed 1 1 1 1 𝑥 𝑥 ... 𝑥 fitness (𝑥 ) to certain local pitch adjustment and thus corresponds to 1 2 𝐷 [ ] 2 2 2 2 𝑥 𝑥 ... 𝑥 fitness (𝑥 ) a local search. eTh usage of randomization can make the [ ] 1 2 𝐷 [ ] HM = . (2) system move further to explore multifarious regions with . . . . [ ] . . . . [ ] . . ... . . high solution diversity in order to search for the global HMS HMS HMS HMS 𝑥 𝑥 ... 𝑥 fitness (𝑥 ) optimal solution. In real-world engineering applications, [ 1 2 𝐷 ] HS has been applied to solve many optimization problems In more detail, we can explain the HS algorithm with including function optimization, water distribution network, the help of discussing the improvisation process by a music groundwater modeling, energy-saving dispatch, structural player. When a player is improvising, he or she has three design, and vehicle routing. possible options: (1) play any well-known piece of music (a eTh mainframeofthe basicHSalgorithm canbe series of pitches in harmony) exactly from his or her memory described as showninAlgorithm 1,where 𝐷 is the number of as the harmony memory consideration rate (HMCR); (2) decision variables and rand is a random real number in inter- play something similar to a known piece in player’s memory val(0,1)drawnfromuniformdistribution.FromAlgorithm 1, (thus adjusting the pitch slightly); or (3) play totally new it is clear that there are only two control parameters in HS, or random pitch from feasible ranges. If these three options which are HMCR and PAR. are formalized for optimization, we have three correspond- ing components: employment of harmony memory, pitch 2.3. Bat Algorithm. The bat algorithm is a novel metaheuristic adjusting, and randomization. Similarly, when each decision swarm intelligence optimization method developed for the variable selects one value in the HS algorithm, it can make use global numerical optimization, in which the search algorithm of one of the above three rules in the whole HS procedure. If a is inspired by social behavior of bats and the phenomenon of new harmony vector is better than the worst harmony vector echolocation to sense distance. in the HM, the new harmony vector takes the place of the In [28], for simplicity, bat algorithm is based on idealizing worst harmony vector in the HM. This procedure is repeated some of the echolocation characteristics of bats, which are the until a stopping criterion is satisfied. following approximate or idealized rules: eTh employment of harmony memory is signicfi ant, as it is analogous to select the optimal tfi individuals in the GA (genetic algorithm). This will make sure that the best (1) all bats apply echolocation to sensedistance, andthey harmonies will be kept on to the new harmony memory. For always “know” thesurroundingsinsomemagical the purpose of using this memory more effectively, we should way; properly set the value of the parameter HMCR ∈ [0, 1]. If this rate is very approaching to 1 (too high), almost all (2) bats yfl randomly with velocity V and a xfi ed fre- the harmonies are utilized in the harmony memory, then quency 𝑓 at position 𝑥 , varying wavelength 𝜆 ,and min 𝑖 other harmonies are not explored well, resulting in potentially loudness 𝐴 to hunt for prey. They can spontaneously wrong solutions. If this rate is very close to 0 (extremely low), accommodate the wavelength (or frequency) of their only few best harmonies are chosen, and it may have a slow emittedpulsesand adjust therateofpulse emission convergence rate. Therefore, generally, HMCR = 0.7 ∼ 0.95. 𝑟∈[0,1] , depending on the proximity of their target; To adjust thepitch slightly in thesecondcomponent,an appropriate approach is to be applied to adjust the frequency (3) although the loudness can change in different ways, it ecffi iently. In theory, we can adjust the pitch linearly or is supposed that the loudness varies from a minimum nonlinearly, but in fact, linear adjustment is utilized. If 𝑥 is constant (positive) 𝐴 to a large 𝐴 . old min 0 the current solution (or pitch), then the new solution (pitch) 𝑥 is generated by new Based on these approximations and idealization, the basic steps of the bat algorithm (BA) can be described as shown in 𝑥 =𝑥 + bw (2𝜀 − 1 ), (3) new old Algorithm 2. 𝑡 𝑡 In BA,eachbat is denfi edbyits position 𝑥 ,velocity V , where 𝜀 is arandomrealnumberdrawn from auniform 𝑖 𝑖 𝑡 𝑡 distribution [0, 1].Herebwisthe bandwidth, controllingthe frequency 𝑓 , loudness 𝐴 , and the emission pulse rate 𝑟 𝑖 𝑖 4 Journal of Applied Mathematics Begin Step 1: Set the parameters and initialize the HM. Step 2: Evaluate the tfi ness for each individual in HM Step 3: while the halting criteria is not satisfied do for 𝑑=1 :𝐷 do if rand < HMCR then // memory consideration 𝑥 (𝑑 )=𝑥 (𝑑 )where 𝑎∈ (1,2,..., HMS) new 𝑎 if rand < PAR then // pitch adjustment 𝑥 (𝑑 )=𝑥 (𝑑 )+ bw × (2× rand −1) new old endif else // random selection 𝑥 (𝑑) = 𝑥 + rand ×(𝑥 −𝑥 ) new min,𝑑 max,𝑑 min−𝑑 endif endfor 𝑑 Update the HM as 𝑥 =𝑥 ,if 𝑓(𝑥 )<𝑓(𝑥 )(minimization objective) 𝑤 new new 𝑤 Update the best harmony vector found so far Step 4: end while Step 5: Post-processing the results and visualization. End. Algorithm 1: Harmony search algorithm. Begin Step 1: Initialization. Set the generation counter 𝑡=1 ; Initialize the population of NP bats 𝑃 randomly and each bat corresponding to a potential solution to the given problem; define loudness 𝐴 , pulse frequency 𝑄 𝑖 𝑖 and the initial velocities V (𝑖 = 1,2,...,NP);set pulserate 𝑟 . 𝑖 𝑖 Step 2: While the termination criteria is not satisfied or 𝑡< MaxGeneration do Generate new solutions by adjusting frequency, and updating velocities and locations/solutions [(4)–(6)] if (rand >𝑟 ) then Select a solution among the best solutions; Generate a local solution around the selected best solution end if Generate a new solution by yfl ing randomly if (rand <𝐴 & 𝑓(𝑥 )<𝑓(𝑥 )) then 𝑖 𝑖 ∗ Accept the new solutions Increase 𝑟 and reduce 𝐴 𝑖 𝑖 end if Rank the bats and find the current best 𝑥 𝑡=𝑡 + 1 ; Step 3: end while Step 4: Post-processing the results and visualization. End. Algorithm 2: Bat algorithm. in a 𝐷 -dimensional search space. The new solutions 𝑥 and all the 𝑛 bats. Generally speaking, depending on the domain size of the problem of interest, the frequency 𝑓 is assigned velocities V at time step t are given by to 𝑓 =0 and 𝑓 = 100 in practical implementation. min max 𝑓 =𝑓 +(𝑓 −𝑓 )𝛽, (4) 𝑖 min max min Initially, each bat is randomly given a frequency which is drawn uniformly from [𝑓 , 𝑓 ]. 𝑡 𝑡−1 𝑡 min max V = V +(𝑥 −𝑥 )𝑓 , (5) ∗ 𝑖 𝑖 𝑖 𝑖 For the local search part, once a solution is selected among the current best solutions, a new solution for each bat 𝑡 𝑡−1 𝑡 𝑥 =𝑥 + V , (6) 𝑖 𝑖 𝑖 is generated locally using random walk where 𝛽∈[0,1] is arandomvectordrawn from auniform distribution. Here 𝑥 is the current global best location (solu- tion) which is located after comparing all the solutions among 𝑥 =𝑥 +𝜀𝐴 , (7) new old Journal of Applied Mathematics 5 where 𝜀 ∈ [−1, 1] is a scaling factor which is a random num- from good solutions. Secondly, the mutation operator can ber, while 𝐴 =⟨𝐴 ⟩ is the average loudness of all the bats at improve the exploration of the new search space. In this way, time step 𝑡 . the strong exploration abilities of the original HS and the eTh updateofthevelocitiesandpositionsofbatsissimilar exploitation abilities of the BA can be fully developed. to the procedure in the standard PSO [21], as 𝑓 controls the For bat algorithm, as the search relies entirely on random pace and range of the movement of the swarming particles in walks, a fast convergence cannot be guaranteed. Described essence. To some degree, BA can be considered as a balanced here for the first time, a main improvement of adding combination of the standard PSO and the intensive local mutation operator is made to the BA, including three minor search controlled by the loudness and pulse rate. improvements, which are made with the aim of speeding up Furthermore, the loudness 𝐴 and the rate 𝑟 of pulse convergence, thus making the method more practical for a 𝑖 𝑖 emission update accordingly as the iterations proceed as wider range of applications, but without losing the attractive shown in (8) features of the original method. The first improvement is that we use fixed frequency 𝑡+1 𝑡 𝐴 =𝛼𝐴 , 𝑓 and loudness 𝐴 instead of various frequency 𝑓 and 𝐴 . 𝑖 𝑖 𝑖 𝑖 (8) Similar to BA, in HS/BA, each bat is defined by its position 𝑥 , 𝑡+1 0 𝑡 𝑡 𝑟 =𝑟 [1 − exp (−𝛾)] 𝑡 , 𝑖 𝑖 velocity V ,the emission pulserate 𝑟 and the xfi ed frequency 𝑖 𝑖 𝑓 , and loudness 𝐴 in a 𝑑 -dimensional search space. The new where 𝛼 and 𝛾 are constants. In essence, 𝛼 is similar to 𝑡 𝑡 solutions 𝑥 and velocities V at time step 𝑡 are given by 𝑖 𝑖 the cooling factor of a cooling schedule in the simulated annealing (SA) [37]. For simplicity, we set 𝛼=𝛾=0.9 in 𝑡 𝑡−1 𝑡 this work. V = V +(𝑥 −𝑥 )𝑓, 𝑖 𝑖 𝑖 (9) 𝑡 𝑡−1 𝑡 𝑥 =𝑥 + V , 3. Our Approach: HS/BA 𝑖 𝑖 𝑖 Basedonthe introduction of HS andBAinprevioussection, we will explain how we combine the two approaches to form where 𝑥 is the current global best location (solution) which the proposed bat algorithm with harmony search (HS/BA) in is located after comparing all the solutions among all the 𝑛 this section, which modifies the solutions with poor tfi ness in bats. In our experiments, we make 𝑓=0.5 .Throughaseries order to add diversity of the population to improve the search of simulation experiments on benchmarks in Section 4.2,it efficiency. was found that setting the parameter of pulse rate 𝑟 to 0.6 and In general, the standard BA algorithm is adept at exploit- the loudness 𝐴 to 0.95 produced the best results. ing the search space, but at times it may trap into some The second improvement is to add mutation operator in local optima, so that it cannot perform global search well. an attempt to increase diversity of the population to improve For BA, the search depends completely on random walks, so the search efficiency and speed up the convergence to optima. a fast convergence cannot be guaranteed. Firstly presented Forthelocalsearchpart,onceasolutionisselectedamongthe here, in order to increase the diversity of the population current best solutions, a new solution for each bat is generated for BA so as to avoid trapping into local optima, a main locally using random walk by (7)when 𝜉 is larger than pulse improvement of adding pitch adjustment operation in HS rate 𝑟 ,thatis, 𝜉>𝑟 ,where 𝜉∈[0,1] is a random real number serving as a mutation operator is made to the BA with the drawn from a uniform distribution; while when 𝜉≤𝑟 ,we aim of speeding up convergence, thus making the approach use pitch adjustment operation in HS serving as a mutation more feasible for a wider range of practical applications while operator updating the new solution to increase diversity of preserving the attractive characteristics of the basic BA. In the population to improve the search ecffi iency, as shown in this paper, a hybrid metaheuristic algorithm by inducing the (3). Through testing benchmarks in Section 4.2,itwas found pitch adjustment operation in HS as a mutation operator that setting the parameter of harmony memory consideration into bat algorithm, so-called harmony search/bat algorithm rate HMCR to 0.95 and pitch adjustment rate PAR to 0.1 (HS/BA), is used to optimize the benchmark functions. The produced the best results. difference between HS/BA and BA is that the mutation The last improvement is the addition of elitism scheme operator is used to improve the original BA generating a new into the HS/BA. As with other population-based optimiza- solution for each bat. In this way, this method can explore the tion algorithms, we typically incorporate some sort of elitism new search space by the mutation of the HS algorithm and in order to retain the best solutions in the population. This exploit the population information with BA, and therefore prevents the best solutions from being corrupted by pitch can avoid trapping into local optima in BA. In the following, adjustment operator. Note that we use an elitism approach to we will show the algorithm HS/BA which is an improvement save the property of the bats that has the best solution in the of HS and BA. HS/BAprocess,soevenifpitch adjustment operationruins eTh critical operator of HS/BA is the hybrid harmony its corresponding bat, we have saved it and can revert back to search mutation operator, which composes the improvisation it if needed. of harmony in HS with the BA. eTh core idea of the proposed Based on the above-mentioned analyses, the mainframe hybrid mutation operator is based on two considerations. of the harmony search/bat algorithm (HS/BA) is presented Firstly, poor solutions can take in many new used features in Algorithm 3. 6 Journal of Applied Mathematics Begin Step 1: Initialization. Set the generation counter 𝑡=1 ; initialize the population of NP bats 𝑃 randomly and each bat corresponding to a potential solution to the given problem; define loudness 𝐴 ; set frequency 𝑄 , the initial velocities V,and pulserate 𝑟 ; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR and bandwidth bw; set maximum of elite individuals retained KEEP. Step 2: Evaluate the quality 𝑓 for each bat in 𝑃 determined by the objective function 𝑓(𝑥) . Step 3: While the termination criteria is not satisfied or 𝑡< MaxGeneration do Sort the population of bats 𝑃 from best to worst by order of quality 𝑓 for each bat. Store the KEEP best bats as KEEPBAT. for 𝑖=1 :NP (all bats) do 𝑡 𝑡−1 𝑡 V = V +(V −𝑥 )×𝑄 𝑖 𝑖 𝑖 𝑡 𝑡−1 𝑡 𝑥 =𝑥 + V 𝑖 𝑖 𝑖 if (rand >𝑟 ) then 𝑡 𝑡 𝑥 =𝑥 +𝛼𝜀 end if for 𝑗=1 :𝐷 (all elements) do //Mutate if rand < HMCR then ( ) 𝑟 = ⌈NP ∗ rand⌉ 𝑥 (𝑗) = 𝑥 (𝑗)where 𝑟 ∈ 1,2,..., HMS ( ) V 𝑟 1 if (rand < PAR) then 𝑥 (𝑗) = 𝑥 (𝑗) +bw × (2× rand −1) V V endif else 𝑥 (𝑗) = 𝑥 + rand ×(𝑥 −𝑥 ) V min,𝑗 max,𝑗 min−𝑗 endif endfor 𝑗 𝑡 𝑡 𝑡 Evaluate the tn fi ess for the offsprings 𝑥 , 𝑥 , 𝑥 𝑢 𝑖 V 𝑡 𝑡 𝑡 𝑡 Select the offspring 𝑥 with the best tn fi ess among the offsprings 𝑥 , 𝑥 , 𝑥 . 𝑘 𝑢 𝑖 V if (rand <𝐴 ) then 𝑡 𝑡 𝑥 =𝑥 ; 𝑟 𝑘 end if Replace the KEEP worst bats with the KEEP best bats KEEPBAT stored. end for 𝑖 𝑡=𝑡 + 1 ; Step 4: end while Step 5: Post-processing the results and visualization; End. Algorithm 3: eTh hybrid meta-heuristic algorithm of HS/BA. 4. Simulation Experiments areACO,BA, BBO, DE,ES, GA,HS, PSO, andSGA.ACO (ant colony optimization) [38, 39] is a swarm intelligence In this section, we test the performance of the proposed meta- algorithm for solving optimization problems which is based heuristic HS/BA to global numerical optimization through a on the pheromone deposition of ants. BBO (biogeography- series of experiments conducted on benchmark functions. based optimization) [24]isanewexcellent powerfuland To allow a fair comparison of running times, all the exper- efficient evolution algorithm, developed for the global opti- iments were conducted on a PC with a Pentium IV processor mization inspired by the immigration and emigration of running at 2.0 GHz, 512 MB of RAM and a hard drive species between islands (or habitats) in search of more of 160 Gbytes. Our implementation was compiled using compatible islands. DE (differential evolution) [ 19]isasimple MATLAB R2012a (7.14) running under Windows XP3. No but excellent optimization method that uses the dieff rence commercial BA or HS tools were used in the following between two solutions to probabilistically adapt a third experiments. solution. An ES (evolutionary strategy) [40] is an algorithm that generally distributes equal importance to mutation and 4.1. General Performance of HS/BA. In order to explore recombination and that allows two or more parents to the benefits of HS/BA, in this subsection we compared its reproduce an offspring. A GA (genetic algorithm) [ 18]isa performance on global numeric optimization problem with search heuristic that mimics the process of natural evolution. nine other population-based optimization methods, which PSO (particle swarm optimization) [21] is also a swarm Journal of Applied Mathematics 7 intelligence algorithm which is based on the swarm behavior normalizations in the tables are based on different scales, so of sfi hand bird schoolinginnature. Astudgenetic algorithm values cannot be compared between the two tables. (SGA) [41] is a GA that uses the best individual at each From Table 3, we see that, on average, HS/BA is the most generation for crossover. effective at finding objective function minimum on ten of In all experiments, we will use the same parameters for the fourteen benchmarks (F02, F03, F06–F11, and F13-F14). HS, BA and HS/BA that are loudness 𝐴 = 0.95 ,pulse rate ACO is the second most effective, performing the best on the 𝑟 = 0.6 ,scaling factor 𝜀 = 0.1 , the harmony memory benchmarks F04 and F05. BBO and SGA are the third most consideration rate HMCR = 0.95, and the pitch adjustment effective, performing the best on the benchmarks F12 and rate PAR = 0.1.For ACO, BBO, DE,ES, GA,PSO,and SGA, F01, respectively, when multiple runs are made. Table 4 shows we set the parameters as follows. For ACO, initial pheromone that HS/BA performs the best on thirteen of the fourteen value 𝜏 =1𝐸 − 6 , pheromone update constant 𝑄=20 , benchmarks (F02–F14), while BBO performs the second exploration constant 𝑞 =1, global pheromone decay rate 0 best at finding objective function minimum on the only 𝜌 = 0.9, local pheromone decay rate 𝜌 = 0.5, pheromone 𝑔 𝑙 benchmark (F01) when multiple runs are made. In addition, sensitivity 𝑠=1 , and visibility sensitivity 𝛽=5 ;for BBO, statistical analysis [42] on these values obtained by the ten habitat modification probability = 1, immigration probability methods on 14 benchmark functions based on the Friedman’s bounds per gene = [0, 1], step size for numerical integration of test [43] reveals that the dieff rences in the obtained average probabilities = 1, maximum immigration and migration rates and the best function minima are statistically signicfi ant ( 𝑃= −17 −17 for each island = 1, and mutation probability = 0.005; for DE, 1.66 ∗ 10 and 𝑃 = 7.25 ∗ 10 , resp.) at the confidence level aweighting factor 𝐹 = 0.5 and a crossover constant CR = 0.5; of 5%. for the ES, the number of offspring 𝜆=10 produced in Furthermore, the computational requirements of the ten each generation and standard deviation 𝜎=1 for changing optimization methods were similar. We collected the average solutions. For the GA, we used roulette wheel selection and computational time of the optimization methods as applied single-point crossover with a crossover probability of 1 and to the 14 benchmarks discussed in this section. eTh results are a mutation probability of 0.01. For PSO, we set an inertial showninTable 3. BA was the quickest optimization method. constant = 0.3, a cognitive constant = 1, and a social constant HS/BA was the third fastest of the ten algorithms. However, for swarm interaction = 1. For the SGA, we used single-point it should be noted that in the vast majority of real-world crossover with a crossover probability of 1 and a mutation applications, it is the tfi ness function evaluation that is by far probability of 0.01. the most expensive part of a population-based optimization Well-defined problem sets are favorable for evaluating algorithm. the performance of optimization methods proposed in this Furthermore, convergence graphs of ACO, BA, BBO, paper. Based on mathematical functions, benchmark func- DE, ES, GA, HS, HS/BA, PSO, and SGA are shown in tions can be applied as objective functions to perform such Figures 1–14 which represent the process of optimization. eTh tests. The properties of these benchmark functions can be values shown in Figures 1–14 are the mean objective function easily achieved from their definitions. Fourteen different optimumachievedfrom100 MonteCarlo simulations, which benchmark functions are applied to verify our proposed are the true objective function values, not normalized. metaheuristic algorithm HS/BA. Each of the functions in this Figure 1 showsthe resultsobtainedfor theten methods study has 20 independent variables (i.e., 𝐷=20 ). when the F01 Ackley function is applied. From Figure 1, The benchmark functions described in Table 1 are stan- clearly, we can draw the conclusion that BBO is significantly dard testing functions. eTh properties of the benchmark func- superior to all the other algorithms including HS/BA during tionsare giveninTable 2.Themodalitypropertymeans the the process of optimization, while HS/BA performs the sec- number of the best solutions in the search space. Unimodal ond best on this multimodal benchmark function. Here, all benchmark functions only have an optimum, which is the the other algorithms show the almost same starting point, and global optimum. Multimodal benchmark functions have at HS/BA has a faster convergence rate than other algorithms, least two optima in their search space, indicating that they while HS/BA is outperformed by BBO aeft r 13 generations. have more than one local optima except the global optimum. Although slower, SGA eventually finds the global minimum More details of all the benchmark functions can be found in close to HS/BA. Obviously, BBO, HS/BA, and SGA outper- [6]. form ACO, BA, DE, ES, GA, HS, and PSO during the whole We set population size = 50 ,anelitism parameter process of searching the global minimum. 𝑝𝐾𝑒𝑒 = 2 , and maximum generation𝑛=𝑔𝑒𝑥 50 for each algorithm. We ran 100 Monte Carlo simulations of each Figure 2 illustrates the optimization results for F02 Fletc- her-Powell function. In this multimodal benchmark problem, algorithm on each benchmark function to get representative performances. Tables 3 and 4 illustrate the results of the it is obvious that HS/BA outperforms all other methods dur- ing the whole progress of optimization (except the generation simulations. Table 3 shows the average minima found by each algorithm, averaged over 100 Monte Carlo runs. Table 4 20–33, BBO, and HS/BA performs approximately equally). shows the absolute best minima found by each algorithm Figure 3 shows the optimization results for F03 Griewank function. From Table 3 and Figure 3,wecan seethatHS/BA over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 performs thebestonthismultimodalbenchmark problem. shows the best performance of each algorithm. eTh best value Looking at Figure 3 carefully, HS/BAshows afasterconver- gencerateinitially than ACO; however, it is outperformed achieved for each test problem is shown in bold. Note that the 𝑀𝑎 𝑁𝑃 𝑖𝑗 𝑖𝑗 𝑖𝑗 𝑖𝑗 8 Journal of Applied Mathematics 𝑥𝑛 󵄨 󵄨 󵄨 󵄨 𝑖 𝑥 𝑖 󵄨 󵄨 󵄨 󵄨 𝑖 𝑖 𝑥 𝑥 𝑗 𝐷 𝑒 𝑛 𝑥 𝑖 𝑖 𝑈 𝑛 󵄨 󵄨 󵄨 󵄨 𝑖 𝑥 𝑥 𝑛 𝑗 𝑖 𝑖 𝑥 𝑥 󵄨 󵄨 󵄨 󵄨 𝑛 𝑛 𝑒 𝑖 𝑛 𝑛 𝑛 𝑛 ⃗ ⃗ ⃗ ⃗ 𝑥 𝑥 ⃗ ⃗ ⃗ ⃗ ⃗ ⃗ ⃗ 𝑓 𝐵 𝑥 𝑥 𝑓 𝑓 𝑓 𝑓 Table 1: Benchmark functions. No. Name Definition Source −0.2⋅ (1/𝑛) ∑ √ (1/𝑛) ∑ cos(2𝜋 ) F01 Ackley [3] 𝑖=1 𝑖=1 𝑓( 𝑥) = 20 + − 20 ⋅ −𝑒 ( )= ∑(𝐴 −𝐵 ),𝐴 = ∑ (𝑎 sin +𝑏 cos ) 𝑖=1 𝑗=1 [4] F02 Fletcher-Powell = ∑ (𝑎 sin +𝑏 cos ) 𝑗=1 [5] F03 Griewank 𝑓( 𝑥) = ∑ − ∏ cos ( )+1 4000 √ 𝑖=1 𝑖=1 𝑛−1 2 2 2 2 ( )= {10 sin (𝜋 )+∑ (𝑦 −1) ⋅[1+10sin (𝜋 )] +(𝑦 −1)} 1 𝑖+1 30 [6] 𝑖=1 F04 Penalty #1 +∑𝑢( , 10, 100, 4) , = 1 + 0.25 (𝑥 +1) 𝑖=1 𝑛−1 2 2 2 2 2 ( )= 0.1 { sin (3𝜋 )+∑ (𝑥 −1) ⋅[1+ sin (3𝜋 )] + ( −1) [1 + sin (2𝜋 )]} 1 𝑖+1 𝑖=1 [6] F05 Penalty #2 +∑𝑢( , 5, 100, 4) 𝑖=1 Quartic with 𝑓( 𝑥) = ∑(𝑖 ⋅ + (0, 1)) F06 [7] noise 𝑖=1 𝑓( 𝑥) = 10 ⋅ 𝑛+ ∑ (𝑥 −10⋅ cos (2𝜋 )) F07 Rastrigin [8] 𝑖=1 𝑛−1 F08 Rosenbrock 𝑓( 𝑥) = ∑ [100(𝑥 −𝑥 ) +( −1)] [7] 𝑖+1 𝑖=1 1/2 F09 Schwefel 2.26 [9] ( )= 418.9829 × − ∑ sin ( ) 𝑖=1 F10 Schwefel 1.2 [10] 𝑓( 𝑥) = ∑ ( ∑ ) 𝑖=1 𝑗=1 𝑓( 𝑥) = ∑|𝑥 |+ ∏|𝑥 | F11 Schwefel 2.22 [10] 𝑖=1 𝑖=1 ( )= max { , 1 ≤𝑖≤𝑛} F12 Schwefel 2.21 [6] Journal of Applied Mathematics 9 ⃗ ⃗ Table 1: Continued. No. Name Definition Source F13 Sphere 𝑓( 𝑥) = ∑ [7] 𝑖=1 𝑓( 𝑥) = 6 ⋅ 𝑛+ ∑ ⌊ ⌋ F14 Step [7] 𝑖=1 In benchmark function F02, the matrix elements a , b ∈(−100,100), ∈(−,) are drawn from uniform distribution [4]. 𝑛×𝑛 𝑛×𝑛 𝑛×1 In benchmark functions F04 and F05, the definition of the function 𝑢(𝑥 ,𝑎,𝑘,𝑚) is given by𝑢(𝑥 ,𝑎,𝑘,𝑚)=( −𝑎) ,𝑥 >;0,−≤ ≤;(−𝑥 −𝑎) ,𝑥 <− . 10 Journal of Applied Mathematics Table 2: Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point. No. Function lb ub opt Continuity Modality F01 Ackley −32.768 32.768 0 Continuous Multimodal F02 Fletcher-Powell −𝜋 𝜋 0Continuous Multimodal F03 Griewank −600 600 0 Continuous Multimodal F04 Penalty #1 −50 50 0 Continuous Multimodal F05 Penalty #2 −50 50 0 Continuous Multimodal F06 Quartic with noise −1.28 1.28 1 Continuous Multimodal F07 Rastrigin −5.12 5.12 0 Continuous Multimodal F08 Rosenbrock −2.048 2.048 0 Continuous Unimodal F09 Schwefel 2.26 −512 512 0 Continuous Multimodal F10 Schwefel 1.2 −100 100 0 Continuous Unimodal F11 Schwefel 2.22 −10 10 0 Continuous Unimodal F12 Schwefel 2.21 −100 100 0 Continuous Unimodal F13 Sphere −5.12 5.12 0 Continuous Unimodal F14 Step −5.12 5.12 0 Discontinuous Unimodal Table 3: Mean normalized optimization results in fourteen benchmark functions. eTh values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations. ACO BA BBO DEES GA HS HSBA PSO SGA F01 2.31 3.33 1.15 2.02 3.38 2.72 3.47 1.09 2.66 1.00 F02 24.58 25.82 1.58 8.94 24.35 5.45 15.69 1.00 13.96 1.33 F03 3.16 60.72 1.93 5.44 23.85 3.22 77.22 1.00 25.77 1.42 F04 1.00 3.0E38 4.0E32 5.6E33 2.7E38 3.1E32 1.4E39 2.3E32 4.1E36 9.6E31 F05 1.00 1.1E8 299.42 1.5E6 4.6E85.4E3 4.1E8 215.51 5.5E7 111.10 F06 489.01 6.8E3 35.32 308.29 1.8E4 274.83 1.5E4 1.00 2.5E3 10.09 F07 8.09 11.55 1.28 6.56 11.87 6.17 10.22 1.00 8.44 1.80 F08 42.25 29.01 2.29 7.59 59.99 9.05 47.85 1.00 12.04 2.15 F09 3.17 20.26 1.99 13.58 13.33 1.81 19.92 1.00 17.61 1.15 F10 1.75 3.73 1.38 2.95 4.93 1.25 4.22 1.00 2.48 1.48 F11 1.05 19.70 1.83 7.14 23.12 11.13 19.45 1.00 13.22 2.46 F12 1.86 4.03 1.00 2.99 3.91 1.92 3.74 1.38 2.38 1.12 F13 98.30 150.84 3.80 19.03 226.52 47.74 182.32 1.00 72.91 4.02 F14 7.73 120.48 3.93 13.31 102.56 11.53 146.55 1.00 63.44 3.28 TimE 2.74 1.00 1.32 1.64 1.67 1.79 2.33 1.43 2.03 1.76 The values are normalized so that the minimum in each row is 1.00. es Th e are not the absolute minima found by each algorithm, but the average minima found by each algorithm. by ACO after 5 generations. For other algorithms, although superior to all the other algorithms during the process of slower, BBO, GA, and SGA eventually find the global mini- optimization. Here, PSO shows a faster convergence rate mum close to HS/BA, while BA, DE, ES, HS, and PSO fail to initially than HS/BA; however, it is outperformed by HS/BA search the global minimum within the limited iterations. aer ft 3 generations. Figure 4 showsthe resultsfor F04Penalty #1 function. Figure 6 showsthe resultsachievedfor theten methods From Figure 4, it is obvious that HS/BA outperforms all other when using the F06 Quartic (with noise)function. From methods during the whole progress of optimization in this Table 3 and Figure 6, we can conclude that HS/BA performs multimodal benchmark function. Although slower, SGA and the best in this multimodal function. Looking carefully BBO perform the second and third best at n fi ding the global at Figure 6, PSO and HS/BA show the same initial fast minimum. convergence towards the known minimum, as the procedure Figure 5 shows the performance achieved for F05 Penalty proceeds, HS/BA gets closer and closer to the minimum, #2 function. For this multimodal function, similar to F04 while PSO comes into being premature and traps into the Penalty #1 function shown in Figure 4, HS/BA is significantly local minimum. BA, ES, GA, HS, and PSO do not manage Journal of Applied Mathematics 11 Table 4: Best normalized optimization results in fourteen benchm ark functions. eTh values shown are the minimum objective function values found by each algorithm. ACO BA BBO DEES GA HS HSBA PSO SGA F01 1.85 2.31 1.00 1.43 2.25 2.03 2.30 1.05 1.91 1.04 F02 10.42 14.48 1.09 4.22 10.39 4.03 9.91 1.00 8.28 1.33 F03 2.73 46.22 1.87 4.47 21.38 8.02 43.24 1.00 16.75 1.76 F04 4.4E65.0E61.2E31.9E42.2E63.0E4 4.2E6 1.00 3.755 3.22 F05 3.0E43.2E4 51.01 386.33 1.6E4 884.46 2.6E4 1.00 4.113 4.57 F06 58.51 992.55 6.50 24.69 808.48 59.24 746.91 1.00 189.71 2.02 F07 5.71 8.53 1.25 5.13 7.94 5.23 7.47 1.00 6.00 1.68 F08 26.42 25.03 1.48 3.70 33.52 6.16 21.50 1.00 7.75 1.45 F09 2.43 8.66 1.28 4.90 5.93 2.09 7.28 1.00 7.40 1.42 F10 1.89 4.94 1.18 2.66 3.00 2.08 2.76 1.00 2.01 1.74 F11 7.74 12.61 1.21 3.36 12.14 6.01 9.60 1.00 7.81 1.62 F12 1.33 2.33 1.44 1.69 2.08 1.74 2.11 1.00 1.70 1.21 F13 28.04 56.57 2.17 5.54 60.57 19.71 52.86 1.00 20.98 2.28 F14 4.28 54.02 1.97 4.85 33.61 10.60 49.11 1.00 19.72 1.85 The values are normalized so that the minimum in each row is 1.00. es Th e are the absolute best minima found by each algorithm. 0 5 10 15 20 25 30 35 40 45 50 Number of generations GA ACO BA HS 0 10 20 30 40 50 BBO HSBA Number of generations DE PSO GA ACO ES SGA HS BA HSBA Figure 2: Comparison of the performance of the different methods BBO DE PSO for the F02 Fletcher-Powell function. ES SGA Figure 1: Comparison of the performance of the different methods for the F01 Ackley function. GA, and PSO performing the second best; another group including BA, ES, and HS performing the worst. Figure 8 shows the results for F08 Rosenbrock function. From Table 3 and Figure 8,wecan seethatHS/BA is superior to succeed in this benchmark function within maximum to the other algorithms during the optimization process in number of generations. At last, BBO, DE, and SGA converge this relatively simple unimodal benchmark function. Also, to thevalue very closetoHS/BA. we see that, similar to the F06 Quartic (with noise)function Figure 7 shows the optimization results for the F07 shown in Figure 6, PSO shows a faster convergence rate Rastrigin function. Very clearly, HS/BA has the fastest con- initially than HS/BA; however, it is outperformed by HS/BA vergence rate at finding the global minimum and significantly aer ft 4 generations and is premature to the local minimum. outperforms all other approaches. Looking carefully at Fig- Figure 9 showsthe equivalent resultsfor theF09 Schwefel ure 7, approximately, we can divide all the algorithms into 2.26 function. From Figure 9,veryclearly,thoughHS/BA is three groups: one group including BBO, HS/BA, and SGA outperformed by BBO between the generations 10–30, it has performing the best; the other group including ACO, DE, the stable convergence rate at n fi ding the global minimum Benchmark function value Benchmark function value 12 Journal of Applied Mathematics 350 10 0 5 10 15 20 25 30 35 40 45 50 Number of generations ACO GA BA HS 0 10 20 30 40 50 BBO HSBA Number of generations DE PSO ES ACO GA SGA BA HS BBO HSBA Figure 5: Comparison of the performance of the different methods DE for the F05 Penalty #2 function. PSO ES SGA Figure 3: Comparison of the performance of the different methods for the F03 Griewank function. 10 5 0 5 10 15 20 25 30 35 40 45 50 0 0 10 20 30 40 50 Number of generations Number of generations ACO GA ACO GA BA HS BA HS BBO HSBA BBO HSBA DE PSO DE PSO ES SGA ES SGA Figure 4: Comparison of the performance of the different methods Figure 6: Comparison of the performance of the different methods for the F04 Penalty #1 function. for the F06 Quartic (with noise) function. and signicfi antly outperforms all other approaches in this multimodal benchmark function. divided into three groups: one group including BBO and Figure 10 shows the results for F10 Schwefel 1.2 function. HS/BA performing the best; the other group including ACO, From Figure 10,wecan seethatHS/BA performs farbetter GA, PSO, and SGA performing the second best; another than other algorithms during the optimization process in this group including DE, ES, and HS performing the worst. relative simple unimodal benchmark function. PSO shows Figure 11 shows the results for F11 Schwefel 2.22 function. a faster convergence rate initially than HS/BA; however, Very clearly, BBO shows a faster convergence rate initially it is outperformed by HS/BA aeft r 3 generations. Looking than HS/BA; however, it is outperformed by HS/BA aeft r 40 carefully at Figure 10, akin to F07 Rastrigin function shown in generations. At last, HS/BA reaches the optimal solution sig- Figure 7, all the algorithms except BA can be approximately nicfi antly superior to other algorithms. BBO is only inferior Benchmark function value Benchmark function value Benchmark function value Benchmark function value Journal of Applied Mathematics 13 250 6000 0 10 20 30 40 50 0 10 20 30 40 50 Number of generations Number of generations ACO GA ACO GA BA HS BA HS BBO HSBA BBO HSBA DE PSO DE PSO ES SGA ES SGA Figure 9: Comparison of the performance of the different methods Figure 7: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function. for the F07 Rastrigin function. 0 5 10 15 20 25 30 35 40 45 50 Number of generations ACO GA BA HS BBO HSBA 0 10 20 30 40 50 DE PSO Number of generations ES SGA ACO GA Figure 10: Comparison of the performance of the different methods BA HS for the F10 Schwefel 1.2 function. BBO HSBA DE PSO ES SGA Figure 13 shows the results for F13 Sphere function. From Figure 8: Comparison of the performance of the different methods Table 3 and Figure 13, HS/BA has the fastest convergence for the F08 Rosenbrock function. rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13,for BBOand SGA, we cansee that BBOhas afasterconvergence rate than SGA, to HS/BA and performs the second best in this unimodal but SGA does finally converge to the value of BBO that is function. approaching to HS/BA. Figure 12 showsthe resultsfor F12Schwefel2.21function. Figure 14 showsthe resultsfor F14Stepfunction. Appar- Veryclearly,HS/BAhasthefastestconvergencerateatfinding ently, HS/BA shows the fastest convergence rate at ndin fi g the global minimum and significantly outperforms all other the global minimum and significantly outperforms all other approaches. approaches. Looking carefully at Figure 14,for BBOand SGA, Benchmark function value Benchmark function value Benchmark function value Benchmark function value 14 Journal of Applied Mathematics 0 5 10 15 20 25 30 35 40 45 50 20 Number of generations ACO GA BA HS 0 10 20 30 40 50 BBO HSBA Number of generations DE PSO ES ACO GA SGA BA HS BBO HSBA Figure 11: Comparison of the performance of the different methods DE for the F11 Schwefel 2.22 function. PSO ES SGA Figure 13: Comparison of the performance of the different methods for the F13 Sphere function. 0 5 10 15 20 25 30 35 40 45 50 0 10 20 30 40 50 Number of generations Number of generations ACO GA ACO GA BA HS BA HS BBO HSBA BBO HSBA DE PSO DE PSO ES SGA ES SGA Figure 14: Comparison of the performance of the different methods Figure 12: Comparison of the performance of the different methods for the F14 Step function. for the F12 Schwefel 2.21 function. we can see that BBO has a faster convergence rate than SGA, convergence rate initially, while later it converges slower but SGA does finally converge to the value of BBO. and slower to the true objective function value. At last, From the above-analyses about Figures 1–14,wecan come we must point out that, in [24], Simon compared BBO to the conclusion that our proposed hybrid metaheuristic with seven state-of-the-art EAs over 14 benchmark func- algorithm HS/BA significantly outperforms the other nine tions and a real-world sensor selection problem. eTh results algorithms.Ingeneral,ACO,BBO,and SGAare only inferior demonstrated the good performance of BBO. It is also indi- to the HS/BA, and ACO, BBO, and SGA perform better rectly demonstrated that our proposed hybrid metaheuristic than HS/BA in the benchmarks F04-F05, the benchmark F12, method HS/BA is a more powerful and efficient optimiza- and the benchmark F01, respectively. Further, benchmarks tion algorithm than other population-based optimization F04, F05, F06, F08, and F10 illustrate that PSO has a faster algorithms. Benchmark function value Benchmark function value Benchmark function value Benchmark function value Journal of Applied Mathematics 15 Table 5: Best normalized optimization results in fourteen benchmark functions with different 𝐴 .Thenumbers shownare thebestresults found aeft r 100 Monte Carlo simulations HS/BA algorithm. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.66 1.33 1.55 1.40 1.00 1.33 1.21 1.22 1.16 1.13 1.14 F02 1.25E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 3.41 11.15 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F04 2.73E4 1.00 1.25E4 1.04 1.04 1.04 1.04 1.04 1.04 1.00 1.00 F05 5.03E5 5.30 1.33 1.68E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F06 1.00 2.56 5.47 4.73 17.81 9.90 2.89 6.74 9.90 2.60 5.57 F07 10.14 14.92 4.35 1.10 1.10 15.37 1.01 1.00 1.00 1.00 1.00 F08 38.49 1.08 14.06 1.08 1.08 1.08 99.01 1.01 1.00 1.00 1.00 F09 285.18 404.58 1.38 4.74 1.19 1.19 1.19 367.91 1.01 1.00 1.00 F10 665.22 4.68 1.69E3 15.27 1.18 1.18 1.18 1.18 1.21 1.00 1.00 F11 20.03 1.00 9.96 11.54 39.22 9.96 9.96 9.96 9.96 38.97 8.49 F12 10.32 1.00 13.22 3.23 14.69 2.79 2.79 2.79 2.79 2.79 19.31 F13 1.46 4.15 2.84 4.47 1.15 1.56 1.00 1.00 1.00 1.00 1.00 F14 527.52 13.57 3.95 1.01 1.15 12.91 1.00 1.00 1.00 1.00 1.00 14 2 2 3 3 5 6 7 10 10 4.2. Inu fl ence of Control Parameter. In [28], Dr. Yang con- 4.2.2. Pulse Rate: 𝑟 . Tables 7 and 8 recorded the results cluded that if the parameters in BA can be adjusted properly, performed on the benchmark problem with the pulse rate it can outperform GA, HS (harmony search) [26], and PSO. 𝑟 = 0, 0.1, 0.2, . . . , 0.9, 1.0 and xfi ed loudness 𝐴 = 0.95 , eTh choice of the control parameters is of vital importance harmony memory consideration rate HMCR = 0.95, and for different problems. To investigate the influence of the pitch adjustment rate PAR = 0.1. From Tables 7 and 8,we loudness 𝐴 ,pulse rate 𝑟 , harmony memory consideration rate can evidently conclude that HS/BA performs signicfi antly HMCR, and pitch adjustment rate PAR on the performance better on bigger 𝑟 (>0.5) than on smaller 𝑟 (<0.5), and HS/BA of HS/BA, we carry out this experiment in the benchmark performs the best when r is equal or very close to 0.6. So, we problem with different parameters. All other parameter set 𝑟 = 0.6 in other experiments. settings are kept unchanged (unless noted otherwise in the following paragraph). eTh results are recorded in Tables 5– 4.2.3. Harmony Memory Consideration Rate: 𝑅𝐶 . Tables 12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, 9 and 10 recorded the results performed in the bench- and 11 show the best minima found by HS/BA algorithm over mark problem with the harmony memory consideration rate 100 Monte Carlo runs. Tables 6, 8, 10,and 12 show the average HMCR = 0, 0.1, 0.2, . . . , 0.9, 1.0, xfi ed loudness 𝐴 = 0.95 , minima found by HS/BA algorithm, averaged over 100 Monte pulse rate 𝑟 = 0.6 ,and pitchadjustmentratePAR =0.1.From Carlo runs. In other words, Tables 5, 7, 9, 11 and Tables 6, Tables 9 and 10,wecan recognizethatthe function values 8, 10, 12 show the best and average performance of HS/BA evaluated by HS/BA are better/smaller on bigger HMCR algorithm, respectively. In each table, the last row is the total (>0.5) than on smaller r (<0.5), and most benchmarks reach number of functions on which HS/BA performs the best with minimum when HMCR is equal or very close to 1. So, we set some parameters. HMCR = 0.95 in other experiments. 4.2.1. Loudness: 𝐴 . Tables 5 and 6 recorded the results 4.2.4. Pitch Adjustment Rate: 𝑅 . Tables 11 and 12 recorded the results performed on the benchmark problems with the performed in the benchmark problem with the loudness 𝐴= pitch adjustment rate PAR = 0, 0.1, 0.2, . ..,0.9,1.0,fixed 0, 0.1, 0.2, . . . , 0.9, 1.0 and xe fi d pulse rate 𝑟 = 0.6 ,harmony memory consideration rate HMCR = 0.95, and pitch adjust- loudness 𝐴 = 0.95 ,pulse rate 𝑟 = 0.6 ,and harmony memory consideration rate HMCR =0.95.FromTables 11 ment rate PAR = 0.1. From Tables 5 and 6,obviously,itcan be and 12, we can recognize that the function values for HS/BA seen that (i) for the three benchmark functions F02, F03, and F04, HS/BA performs slightly differently, that is to say, these vary little with the increasing PAR, and HS/BA reaches optimum/minimum in most benchmarks when PAR is equal three benchmark functions are insensitive to the parameter 𝐴 .(ii) Forbenchmark functionsF01,F06,F11, andF12, or very close to 0.1. So, we set PAR =0.1 in other experiments. HS/BA performs better on smaller 𝐴 (<0.5). (iii) However, for functions F05, F07–10, F13, and F14, HS/BA performs better 4.3. Discussion. In the HS/BA, the bats yfl in the sky using on bigger 𝐴 (>0.5). In sum, HS/BA performs the best when 𝐴 echolocation to find food/prey (i.e., best solutions). Four is equal or very close to 1.0. So, we set 𝐴 = 0.95 ,which is very other parameters are the loudness (𝐴 ), therateofpulse close to 0.9 and 1.0 in other experiments. emission (𝑟 ), harmony memory consideration rate (HMCR), 𝑃𝐴 𝐻𝑀 16 Journal of Applied Mathematics Table 6: Mean normalized optimization results in fourteen benchmark functions with different A. eTh numbers shown are the minimum objective function values found by HS/BA algorithm, averaged over 100 Monte Carlo simulations. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.01 1.01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F02 1.52 1.00 1.46 1.17 1.08 1.37 1.88 1.65 1.67 1.79 1.48 F03 6.69 10.41 1.06 1.01 1.03 1.03 1.01 1.02 1.05 1.00 1.00 F04 201.31 1.26 4.10 4.21 3.33 2.72 1.00 5.19 2.91 1.00 3.11 F05 591.22 16.82 4.25 5.87 1.01 4.14 24.04 1.00 3.68 7.33 9.00 F06 1.00 4.90E4 637.99 258.71 4.95E4 2.18 2.17 2.17 2.18 2.17 2.17 F07 8.57 2.21E6 6.43 272.34 110.48 2.11E4 1.02 1.01 1.00 1.00 1.00 F08 49.80 1.14E41.72E4 161.60 224.59 91.19 1.74E4 1.03 1.11 1.00 1.00 F09 82.37 1.73E6 1.06 3.14 74.45 103.10 42.26 7.94E3 1.00 1.37 1.29 F10 90.31 1.45E32.45E52.32E3 23.34 22.34 31.38 12.89 2.34E31.40 1.00 F11 4.98 1.00 3.15E4 2.33 14.41 520.23 443.65 616.26 249.91 4.78E42.14 F12 3.69 2.10E4 4.57E4 1.00 2.12E4 266.35 233.13 198.80 276.14 112.01 2.14E4 F13 1.90 8.25 4.48E62.14E6 1.00 6.15 254.51 222.75 189.96 263.86 107.01 F14 66.91 1.95E51.17E41.24E3 21.04 1.86E329.40 23.92 1.00 18.35 24.75 12 1 2 2 1 2 2 4 5 5 Table 7: Best normalized optimization results in fourteen benchmark functions with different 𝑟 .Thenumbers shownare thebestresults found aeft r 100 Monte Carlo simulations of HS/BA algorithm. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.84 1.84 1.84 4.84 1.59 1.71 1.00 1.34 1.09 1.16 1.01 F02 9.44E3 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 3.73 5.46 1.86 1.86 1.86 1.86 1.00 1.72 1.21 1.61 1.01 F04 11.42 1.11 24.29 1.23 1.26 1.00 1.29 1.29 1.29 1.29 1.29 F05 1.77E5 2.30 1.15 1.09E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F06 62.44 29.96 48.58 15.60 26.34 7.80 1.82 1.00 35.20 2.39 1.55 F07 8.31 4.48 2.58 1.29 1.29 3.49 1.00 1.00 1.00 1.00 1.00 F08 9.61 1.18 4.20 1.37 1.37 1.37 6.95 1.01 1.01 1.00 1.00 F09 373.94 444.22 1.68 3.35 1.68 1.68 1.00 291.98 1.01 1.01 1.01 F10 386.93 3.16 13.97 4.63 1.58 1.58 1.00 1.58 309.31 1.58 1.01 F11 42.55 1.00 18.29 21.35 20.18 11.11 19.04 21.35 15.81 18.76 11.55 F12 114.40 1.00 141.84 44.48 125.47 44.48 44.48 44.48 44.48 44.48 69.43 F13 6.39 6.37 6.01 2.96 3.02 1.00 1.52 1.39 1.00 2.57 2.49 F14 120.52 4.04 2.33 1.00 1.16 3.40 1.00 1.00 1.16 1.16 1.16 03 1 2 2 4 8 54 4 4 and pitch adjustment rate (PAR). The appropriate update for searching globally at the same time. It succeeds in doing thepitchadjustmentrate(PAR)andtherateofpulseemission this due to the local search via harmony search algorithm (𝑟 ) balances the exploration and exploitation behavior of each and global search via bat algorithm concurrently. A similar bat, respectively, as the loudness usually decreases once a bat behavior may be performed in the PSO by using multiswarm has found its prey/solution, while the rate of pulse emission from a particle population initially [44]. However, HS/BA’s increases in order to raise the attack accuracy. advantages include performing simply and easily, and only For all of the standard benchmark functions that have having four parameters to regulate. eTh work carried out been considered, the HS/BA has been demonstrated to here demonstrates the HS/BA to be robust over all kinds of performbetterthanorbeequal to thestandardBAand other benchmark functions. acclaimed state-of-the-art population-based algorithms with Benchmark evaluation is a good way for verifying the the HS/BA performing significantly better in some functions. performanceofthe metaheuristicalgorithms, butitalsohas The HS/BA performs excellently and efficiently because of limitations. First, we did not make any special effort to tune its ability to simultaneously carry out a local search, still the optimization algorithms in this section. Different tuning Journal of Applied Mathematics 17 Table 8: Mean normalized optimization results in fourteen benchmark functions with different 𝑟 . eTh numbers shown are the minimum objective function values found by HS/BA algorithm, averaged over 100 Monte Carlo simulations. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.84 1.99 1.86 1.94 1.59 1.00 1.42 1.34 1.09 1.16 1.71 F02 3.31 5.38 4.57 3.08 3.82 1.14 1.00 2.89 2.79 4.02 3.11 F03 3.73 5.46 5.33 2.29 3.36 2.41 1.00 1.72 1.21 1.61 1.36 F04 11.42 1.11 24.29 1.49E41.13E33.94E32.78E419.66 175.22 1.14 1.00 F05 16.09 128.62 32.69 24.46 64.40 12.77 29.59 76.17 4.56 1.00 4.46 F06 62.44 26.96 48.58 15.60 26.34 7.80 1.82 1.00 35.20 2.39 1.55 F07 3.30 1.78 1.34 1.03 1.23 1.39 1.55 1.00 1.19 1.36 3.49 F08 4.88 3.93 3.94 1.94 2.19 3.83 3.53 4.33 1.00 2.27 3.14 F09 1.39 1.66 1.14 1.00 1.21 1.15 1.15 1.09 1.08 1.04 1.09 F10 7.21 9.94 8.75 3.60 8.46 6.11 4.83 4.14 5.76 1.00 2.71 F11 3.82 4.23 3.73 2.05 1.81 1.00 1.71 2.20 1.42 1.68 1.89 F12 1.64 2.17 2.04 1.47 1.80 1.86 1.00 1.21 1.13 1.34 1.56 F13 6.39 6.37 6.01 2.96 3.02 1.90 1.52 1.39 1.00 2.57 2.49 F14 5.74 15.43 7.42 6.82 6.44 3.17 1.00 4.85 2.21 1.74 2.09 00 0 1 0 2 4 22 2 1 Table 9: Best normalized optimization results in fourteen benchmark functions with different HMCR. The numbers shown are the best results found aer ft 100 Monte Carlo simulations of HS/BA algorithm. HMCR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.64 1.64 1.64 1.64 1.64 1.64 1.64 1.51 1.28 1.00 1.63 F02 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 19.82 26.49 3.71 3.71 3.71 3.71 3.71 3.71 3.71 2.06 1.00 F04 1.26E5 1.00 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F05 6.07E55.34 1.00 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F06 108.92 246.31 365.04 345.40 338.57 234.65 143.49 45.28 23.30 1.00 25.75 F07 9.22 14.28 5.34 1.00 1.00 9.30 1.00 1.00 1.00 1.00 1.00 F08 18.10 1.08 7.72 1.08 1.08 1.08 35.94 1.00 1.00 1.00 1.00 F09 280.04 391.61 1.27 6.81 1.27 1.27 1.27 227.95 1.00 1.00 1.00 F10 551.65 8.76 1.76E3 11.70 1.64 1.64 1.64 1.64 274.48 1.00 1.00 F11 14.67 1.00 6.18 6.18 13.99 6.18 6.18 6.18 4.40 2.69 6.16 F12 7.68 1.00 11.10 2.73 10.21 2.73 2.73 2.73 2.73 2.73 4.73 F13 7.72 26.75 15.90 17.76 6.12 10.40 6.12 5.95 2.55 1.00 2.25 F14 537.16 14.28 5.34 1.00 1.00 7.13 1.00 1.00 1.00 1.00 1.00 04 2 4 5 3 5 6 7 11 9 parameter values in the optimization algorithms might result here are promising for HS/BA and indicate that this novel in significant differences in their performance. Second, real- method might be able to nd fi a niche among the plethora of world optimization problems may not have much of a population-based optimization algorithms. relationship to benchmark functions. Third, benchmark tests We note that CPU time is a bottleneck to the implemen- might result in different conclusions if the grading criteria or tation of many population-based optimization algorithms. problem setup change. In our work, we examined the mean If an algorithm cannot converge fast, it will be impractical, and best results obtained with a certain population size and sinceitwould take toolongtofind an optimalorsuboptimal aer ft a certain number of generations. However, we might solution. HS/BA does not seem to require an unreasonable arrive at different conclusions if (for example) we change the amount of computational eo ff rt; of the ten optimization algo- generation limit, or look at how many generations it takes to rithms compared in this paper, HS/BA was the third fastest. reach a certain function value, or if we change the population Nevertheless, ndin fi g mechanisms to accelerate HS/BA could size. In spite of these caveats, the benchmark results shown be an important area for further research. 18 Journal of Applied Mathematics Table 10: Mean normalized optimization results in fourteen benchmark functions with different HMCR. The numbers shown are the best results found aer ft 100 Monte Carlo simulations of HS/BA algorithm. HMCR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.00 1.01 F02 747.56 7.87 5.25 3.43 5.09 7.10 3.07 1.87 1.60 1.00 1.01 F03 9.13 3.12E4 1.09 1.07 1.05 1.06 1.05 1.02 1.01 1.01 1.00 F04 2.10𝐸6 1.00𝐸4 3.95𝐸4 9.48𝐸3 5.97𝐸3 2.95𝐸3 1.51𝐸3 1.50𝐸3 46.94 1.00 2.37 F05 3.24𝐸6 3.27𝐸4 2.09𝐸4 4.14𝐸4 7.13𝐸3 1.54𝐸3 8.93𝐸3 1.53𝐸3 629.48 1.00 203.73 F06 1.00 5.94𝐸4 422.04 632.15 6.00𝐸4 1.92 1.91 1.91 1.91 1.91 1.91 F07 10.63 2.07𝐸6 9.00 216.72 324.55 3.07E4 1.04 1.03 1.02 1.01 1.00 F08 59.50 9.88𝐸3 3.04E4 141.80 217.01 324.68 3.07E4 1.07 1.07 1.03 1.00 F09 226.87 4.95𝐸6 3.08 8.91 114.22 173.60 259.41 2.45E41.45 1.00 1.71 F10 257.37 2.83𝐸4 9.35E51.37E4 97.09 66.56 99.25 149.40 1.39E4 1.00 2.73 F11 6.07 1.00 1.83E4 2.02 16.61 390.40 262.99 403.00 603.61 5.72E41.84 F12 3.00 2.79E43.60E4 1.00 2.82E4 270.95 194.14 130.77 200.40 300.16 2.84E4 F13 2.32 9.66 5.61E61.89E6 1.00 8.15 267.78 191.86 129.23 198.05 296.65 F14 184.66 3.84E51.17E41.85E3 1.00 5.71E3 25.11 55.39 39.60 26.57 41.49 1 1 0 1 2 000 0 63 Table 11: Best normalized optimization results in fourteen benchmar k functions with different PAR. eTh numbers shown are the best results found aeft r 100 Monte Carlo simulations of HS/BA algorithm. PAR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F02 3.91E3 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 1.00 4.25 1.51 2.20 2.20 2.12 2.20 1.99 2.20 1.28 1.80 F04 1.00 2.55 65.54 2.55 2.55 2.55 2.55 2.55 2.55 2.55 2.55 F05 2.52 1.00 1.71 1.69E3 1.71 1.71 1.71 1.71 1.71 1.71 1.71 F06 1.00 59.87 34.18 22.85 46.52 23.44 43.54 44.84 36.88 18.99 8.12 F07 8.07 1.00 1.48 2.55 2.55 13.06 2.55 2.55 2.55 2.55 2.55 F08 5.43 1.00 1.92 1.00 1.00 1.00 11.45 1.00 1.00 1.00 1.00 F09 76.72 2.52 1.17 1.00 1.71 1.71 1.71 155.56 1.71 1.71 1.71 F10 929.10 1.48 1.00 4.91 2.55 2.55 2.55 2.22 2.35E3 2.55 2.55 F11 3.18E3 1.00 5.82E33.99E33.39E35.82E3 4.91E35.82E35.82E39.58E35.82E3 F12 305.06 1.00 537.28 97.34 187.58 97.34 97.34 97.34 97.34 97.34 398.27 F13 1.92 4.22 5.87 10.07 12.98 4.66 1.48 4.01 3.17 1.00 4.09 F14 88.12 1.00 1.48 2.55 2.55 4.91 2.55 2.55 2.55 2.55 2.55 4 8 3 4 3 323 3 4 3 5. Conclusion and Future Work if it has better tfi ness. This selection scheme is rather greedy, which oeft n overtakes original HS and BA. The HS/BA This paper proposed a hybrid metaheuristic HS/BA method attempts to take merits of the BA and the HS in order to for optimization problem. We improved the BA by combining avoid all bats getting trapped in inferior local optimal regions. original harmony search (HS) algorithm and evaluating the The HS/BA enables the bats to have more diverse exemplars HS/BA on multimodal numerical optimization problems. to learn from, as the bats are updated each iteration and A novel type of BA model has been presented, and an also form new harmonies to search in a larger search space. improvement is applied to mutate between bats using har- This new method can speed up the global convergence rate mony search algorithmduringthe processofbatsupdating. without losing the strong robustness of the basic BA. From Using the original configuration of the bat algorithm, we the analysis of the experimental results, we observe that the generate the new harmonies based on the newly generated proposed HS/BA makes good use of the information in past bat each iteration aeft r bat’s position has been updated. eTh solutions more eeff ctively to generate better quality solutions new harmony vector substitutes the newly generated bat only frequently, when compared to the other population-based Journal of Applied Mathematics 19 Table 12: Mean normalized optimization results in fourteen benchmar k functions with different PAR. eTh numbers shown are the best results found aeft r 100 Monte Carlo simulations of HS/BA algorithm. PAR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F02 4.07 1.00 1.00 4.52 2.77 2.57 3.41 5.36 3.69 4.89 1.03 F03 1.00 1.27E4 1.34 1.35 1.35 1.35 1.34 1.35 1.34 1.34 1.34 F04 1.15 81.27 9.39E3 1.00 1.01 1.01 129 1.00 1.00 1.01 1.01 F05 345.73 50.61 86.19 9.07E3 25.01 1.08 1.01 1.91 3.33 1.00 1.18 F06 1.00 5.42E6 4.27E4 4.74E45.48E6 577.88 577.88 577.88 577.88 577.87 577.87 F07 4.86 1.53 1.00 95.43 105.96 1.22E4 1.34 1.36 1.32 1.33 1.35 F08 12.69 76.00 8.76E3 17.03 69.13 76.72 8.85E3 1.10 1.05 1.04 1.00 F09 63.45 230.47 1.82 1.67 12.78 48.60 53.49 6.09E3 1.11 1.30 1.00 F10 133.55 12.51 1.26 1.97E3 11.48 4.64 16.45 18.93 1.99E3 1.00 1.88 F11 63.02 1.00 6.39E3 79.44 59.24 3.96E31.42E35.81E3 6.45E37.45E579.81 F12 3.90 8.81E3 8.90E3 1.00 8.90E3 44.40 47.78 17.25 70.11 77.84 8.99E3 F13 1.00 21.66 2.07E3 6.69 5.80 4.30 271.67 282.46 105.39 429.26 476.62 F14 46.14 1.00 36.10 55.93 1.22 6.40E3 42.15 32.89 34.81 12.72 51.29 44 33 1 1 1 2 2 3 3 optimization algorithms such as ACO, BA, BBO, DE, ES, References GA, HS, PSO, and SGA. Based on the results of the ten [1] G. Wang, L. Guo, A. H. Gandomi et al., “Lev ´ y-flight krill herd approaches on the test problems, we can conclude that the algorithm,” Mathematical Problems in Engineering,vol.2013, HS/BA significantly improves the performances of the HS Article ID 682073, 14 pages, 2013. and the BA on most multimodal and unimodal problems. [2] A.H.Gandomi,X.S.Yang, S. Talatahari,and A. H. Alavi, In this work, 14 benchmark functions are used to evaluate Metaheuristic Applications in Structures and Infrastructures, the performance of our approach; we will test our approach Elsevier, Waltham, Mass, USA, 2013. on more problems, such as the high-dimensional (𝐷≥ [3] D. H. Ackley, “An empirical study of bit vector function 20) CEC 2010 test suit [45] and the real-world problems. optimization,” in Genetic Algorithms and Simulated Annealing, Moreover, we will compare HS/BA with other metaheuristic L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987. method, such as DLHS [46, 47], SGHS [48], adaptive DE [4] R. Fletcher and M. J. D. Powell, “A rapidly convergent descent [49], CLPSO [50], and DMS-PSO [51]. In addition, we only method for minimization,” The Computer Journal ,vol.6,no. 2, consider the unconstrained function optimization in this pp.163–168,1963. work. Our future work consists on adding the diversity rules [5] A. O. Griewank, “Generalized descent for global optimization,” into HS/BA for constrained optimization problems, such as JournalofOptimizationTheory andApplications ,vol.34, no.1, constrained real-parameter optimization CEC 2010 test suit pp.11–39,1981. [52]. [6] X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made In the eld fi of optimization, there are many issues worthy faster,” IEEE Transactions on Evolutionary Computation,vol.3, of further study, and efficient optimization method should be no.2,pp. 82–102,1999. developed depending on the analysis of specific engineering [7] K.A.DeJong, Analysis of the Behavior of a Class of Genetic problem. Our future work will focus on the two issues: on Adaptive Systems, University of Michigan, 1975. the one hand, we would apply our proposed approach HS/ [8] L. A. Rastrigin, Extremal Control Systems,Nauka,Moscow, Rus- BA to solve practical engineering optimization problems, sian, 1974, Theoretical Foundations of Engineering Cybernetics and, obviously, HS/BA can become a fascinating method for Series. real-world engineering optimization problems; on the other [9] S.-K. S. Fan and J.-M. Chang, “Dynamic multi-swarm particle hand, we would develop new meta-hybrid approach to solve swarm optimizer using parallel PC cluster systems for global optimization problem. optimization of large-scale multimodal functions,” Engineering Optimization,vol.42, no.5,pp. 431–451, 2010. [10] W. Gong, Z. Cai, and C. X. Ling, “DE/BBO: a hybrid differential Acknowledgments evolution with biogeography-based optimization for global numerical optimization,” Soft Computing ,vol.15, no.4,pp. 645– This work was supported by State Key Laboratory of Laser 665, 2010. Interaction with Material Research Fund under Grant no. [11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “A modified SKLLIM0902-01 and Key Research Technology of Electric- firefly algorithm for UCAV path planning,” International Jour- discharge Non-chain Pulsed DF Laser under Grant no. LXJJ- nalofHybridInformation Technology,vol.5,no. 3, pp.123–144, 11-Q80. 2012. 20 Journal of Applied Mathematics [12] G.Wang,L.Guo,H.Duan,L.Liu,andH.Wang,“Ahybridmeta- [30] X. S. Yang and A. H. Gandomi, “Bat algorithm: a novel approach heuristic DE/CS algorithm for UCAV path planning,” Journal of for global engineering optimization,” Engineering Computa- Information and Computational Science,vol.9,no. 16,pp. 1–8, tions,vol.29, no.5,pp. 464–483, 2012. [31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “A Bat algo- [13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, “Path rithm with mutation for UCAV path planning,” The Scientific planning for uninhabited combat aerial vehicle using hybrid World Journal,vol.2012, ArticleID418946, 15 pages, 2012. meta-heuristic DE/BBO algorithm,” Advanced Science, Engi- [32] R. Parpinelli and H. Lopes, “New inspirations in swarm intelli- neering and Medicine,vol.4,no. 6, pp.550–564,2012. gence: a survey,” International Journal of Bio-Inspired Computa- [14] G. Wang,L.Guo,H.Duan, H. Wang,L.Liu,and M. Shao,“A tion, vol. 3, no. 1, pp. 1–16, 2011. hybrid meta-heuristic DE/CS algorithm for UCAV three- [33] P. W. Tsai,J.S.Pan,B.Y.Liao, M. J. Tsai,and V. Istanda, “Bat dimension path planning,” The Scientific World Journal ,vol. algorithm inspired algorithm for solving numerical optimiza- 2012, Article ID 583973, 11 pages, 2012. tion problems,” AppliedMechanics andMaterials,vol.148,pp. [15] H.Duan, W. Zhao,G.Wang, andX.Feng, “Test-sheetcomposi- 134–137, 2012. tion using analytic hierarchy process and hybrid metaheuristic [34] Wikipedia, “Mathematical optimization,” http://en.wikipedia algorithm TS/BBO,” Mathematical Problems in Engineering,vol. .org/wiki/Numerical optimization. 2012,Article ID 712752,22pages,2012. [35] W. Y. Zhang, S. Xu, and S. J. Li, “Necessary conditions for weak [16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “Dynamic sharp minima in cone-constrained optimization problems,” deployment of wireless sensor networks by biogeography based Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 optimization algorithm,” Journal of Sensor and Actuator Net- pages, 2012. works,vol.1,no. 2, pp.86–96,2012. [36] Wikipedia, “Global optimization,” http://en.wikipedia.org/ [17] X. S. Yang,A.H.Gandomi,S.Talatahari, andA.H.Alavi, wiki/Global optimization. Metaheuristics in Water, Geotechnical and Transport Engineer- [37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, “Optimization ing, Elsevier, Waltham, Mass, USA, 2013. by simulated annealing,” Science,vol.220,no.4598,pp.671–680, [18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998. [38] M. Dorigo and T. Stutzle, Ant Colony Optimization, eTh MIT [19] R. Storn and K. Price, “Differential evolution—a simple and Press, Cambridge, Mass, USA, 2004. efficient heuristic for global optimization over continuous [39] M. Dorigo, Optimization, Learning and Natural Algorithms, spaces,” JournalofGlobalOptimization,vol.11, no.4,pp. 341– Politecnico di Milano, Italy, 1992. 359, 1997. [40] H. G. Beyer, The Theory of Evolution Strategies , Springer, New [20] S. Das and P. N. Suganthan, “Differential evolution: a survey of York, NY, USA, 2001. the state-of-the-art,” IEEE Transactions on Evolutionary Com- [41] W. Khatib and P. Fleming, “eTh stud GA: a mini revolution?” in putation,vol.15, no.1,pp. 4–31,2011. Parallel Problem Solving From Nature, pp. 683–691, 1998. [21] J. Kennedy and R. Eberhart, “Particle swarm optimization,” [42] J. Derrac, S. Garc´ıa,D.Molina, andF.Herrera,“Apractical in Proceedings of the IEEE International Conference on Neural tutorial on the use of nonparametric statistical tests as a Networks, pp. 1942–1948, Perth, Australia, December 1995. methodology for comparing evolutionary and swarm intelli- [22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, “Chaos- gence algorithms,” Swarm and Evolutionary Computation,vol. enhanced accelerated particle swarm optimization,” Communi- 1, no. 1, pp. 3–18, 2011. cations in Nonlinear Science and Numerical Simulation,vol.18, no. 2, pp. 327–340, 2013. [43] M. Friedman, “A comparison of alternative tests of significance for the problem of 𝑚 rankings,” The Annals of Mathematical [23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, Statistics,vol.11, no.1,pp. 86–92, 1940. “Optimal parameter estimation for muskingum model using hybrid CSS and PSO method,” Journal of Computational and [44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, “Locating Theoretical Nanoscience .Inpress. multiple optima using particle swarm optimization,” Applied [24] D. Simon, “Biogeography-based optimization,” IEEE Transac- Mathematics and Computation,vol.189,no. 2, pp.1859–1883, tions on Evolutionary Computation,vol.12, no.6,pp. 702–713, [45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, “Bench- [25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, mark functions for the CEC’2010 special session and competi- tion on large scale global optimization,” Nature Inspired Com- “Hybridizing harmony search with biogeography based opti- mization for global numerical optimization,” JournalofCom- putation and Applications Laboratory, USTC, Hefei, China, putational and eor Th etical Nanoscience .Inpress. 2010. [26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, “A new heuristic [46] Q.-K.Pan,P.N.Suganthan,J.J.Liang,and M. F. Tasgetiren,“A optimization algorithm: harmony search,” Simulation,vol.76, local-best harmony search algorithm with dynamic subpopula- no. 2, pp. 60–68, 2001. tions,” Engineering Optimization,vol.42, no.2,pp. 101–117, 2010. [27] G. Wang,L.Guo,H.Duan, H. Wang,L.Liu,and J. Li,“Incor- [47] Q.-K.Pan,P.N.Suganthan,J.J.Liang,and M. F. Tasgetiren, porating mutation scheme into krill herd algorithm for global “A local-best harmony search algorithm with dynamic sub- numerical optimization,” Neural Computing and Applications, harmony memories for lot-streaming flow shop scheduling problem,” Expert Systems with Applications,vol.38, no.4,pp. 3252–3259, 2011. [28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010. [48] Q.-K.Pan,P.N.Suganthan,M.F.Tasgetiren, andJ.J.Liang, [29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, “Bat “A self-adaptive global best harmony search algorithm for algorithm for constrained optimization tasks,” Neural Comput- continuous optimization problems,” Applied Mathematics and ing & Applications.Inpress. Computation,vol.216,no. 3, pp.830–848,2010. Journal of Applied Mathematics 21 [49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, “An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization,” IEEE Transactions on Systems, Man, and Cybernetics B,vol.42, no. 2, pp. 482–500, 2012. [50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, “Compre- hensive learning particle swarm optimizer for global optimiza- tion of multimodal functions,” IEEE Transactions on Evolution- ary Computation,vol.10, no.3,pp. 281–295, 2006. [51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasge- tiren, “Dynamic multi-swarm particle swarm optimizer with harmony search,” Expert Systems with Applications,vol.38, no. 4, pp. 3735–3742, 2011. [52] R. Mallipeddi and P. Suganthan, Problem Definitions and Eval- uation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological Univer- sity,Singapore,2010. Advances in Advances in Journal of Journal of Operations Research Decision Sciences Applied Mathematics Algebra Probability and Statistics Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 The Scientific International Journal of World Journal Die ff rential Equations Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 Submit your manuscripts at http://www.hindawi.com International Journal of Advances in Combinatorics Mathematical Physics Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 Journal of Journal of Abstract and Discrete Dynamics in Mathematical Problems Complex Analysis Mathematics in Engineering Applied Analysis Nature and Society Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 International Journal of Journal of Mathematics and Discrete Mathematics Mathematical Sciences Journal of International Journal of Journal of Function Spaces Stochastic Analysis Optimization Hindawi Publishing Corporation Hindawi Publishing Corporation Volume 2014 Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Journal of Applied Mathematics Unpaywall

A Novel Hybrid Bat Algorithm with Harmony Search for Global Numerical Optimization

Journal of Applied MathematicsJan 1, 2013

Loading next page...
 
/lp/unpaywall/a-novel-hybrid-bat-algorithm-with-harmony-search-for-global-numerical-T0AhyO0CDo

References

References for this paper are not available at this time. We will be adding them shortly, thank you for your patience.

Publisher
Unpaywall
ISSN
1110-757X
DOI
10.1155/2013/696491
Publisher site
See Article on Publisher Site

Abstract

Hindawi Publishing Corporation Journal of Applied Mathematics Volume 2013, Article ID 696491, 21 pages http://dx.doi.org/10.1155/2013/696491 Research Article A Novel Hybrid Bat Algorithm with Harmony Search for Global Numerical Optimization 1,2 1 Gaige Wang and Lihong Guo Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences, Changchun 130033, China Graduate School of Chinese Academy of Sciences, Beijing 100039, China Correspondence should be addressed to Lihong Guo; guolh@ciomp.ac.cn Received 27 June 2012; Revised 27 November 2012; Accepted 15 December 2012 Academic Editor: Marco H. Terra Copyright © 2013 G. Wang and L. Guo. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. A novel robust hybrid metaheuristic optimization approach, which can be considered as an improvement of the recently developed bat algorithm, is proposed to solve global numerical optimization problems. The improvement includes the addition of pitch adjustment operation in HS serving as a mutation operator during the process of the bat updating with the aim of speeding up convergence, thus making the approach more feasible for a wider range of real-world applications. eTh detailed implementation procedure for this improved metaheuristic method is also described. Fourteen standard benchmark functions are applied to verify the eeff cts of these improvements, and it is demonstrated that, in most situations, the performance of this hybrid metaheuristic method (HS/BA) is superior to, or at least highly competitive with, the standard BA and other population-based optimization methods, such as ACO, BA, BBO, DE, ES, GA, HS, PSO, and SGA. eTh eeff ct of the HS/BA parameters is also analyzed. 1. Introduction for making balance between randomization (global search) and local search [2]. The process of optimization is searching a vector in a function Inspired by nature, these strong metaheuristic algorithms that produces an optimal solution. All of feasible values are applied to solve NP-hard problems, such as UCAV are available solutions, and the extreme value is optimal path planning [11–14], test-sheet composition [15], WSN solution. In general, optimization algorithms are applied to deployment [16] and water, and geotechnical and transport solve these optimization problems. A simple classification engineering [17]. Optimization algorithms cover all searching way for optimization algorithms is considering the nature for extreme value problems. These kinds of metaheuristic of the algorithms, and optimization algorithms can be algorithms carry out on a population of solutions and always divided into two main categories: deterministic algorithms find best solutions. During the 1950s and 1960s, computer sci- and stochastic algorithms. Deterministic algorithms using entists studied the possibility of conceptualizing evolution as gradients such as hill climbing have a rigorous move and an optimization tool, and this generated a subset of gradient will generate the same set of solutions if the iterations free approaches named genetic algorithms (GA) [18]. Since commence with the same initial starting point. On the other then, many other nature-inspired metaheuristic algorithms hand, stochastic algorithms without using gradients oen ft have emerged, such as differential evolution (DE) [ 10, 19, 20], generate different solutions even with the same initial value. particle swarm optimization (PSO) [21–23], biogeography- However, generally speaking, the na fi l values, though slightly based optimization (BBO) [24, 25], harmony search (HS) different, will converge to the same optimal solutions within a [26, 27], and more recently, the bat algorithm (BA) [28, 29] given accuracy [1]. Generally, stochastic algorithms have two that is inspired by echolocation behavior of bats in nature. types: heuristic and metaheuristic. Recently, nature-inspired metaheuristic algorithms perform powerfully and efficiently Firstly presented by Yang in 2010, the bat-inspired algo- in solving modern nonlinear numerical global optimization rithm or bat algorithm (BA) [30, 31]isametaheuristic problems. To some extent, all metaheuristic algorithms strive search algorithm, inspired by the echolocation behavior of 2 Journal of Applied Mathematics bats with varying pulse rates of emission and loudness. eTh 2. Preliminary primary purpose for a bat’s echolocation is to act as a signal To begin with, in this section we will provide a brief back- system to sense distance and hunt for food/prey. A com- ground on the optimization problem, harmony search (HS), prehensive review about swarm intelligence involving BA is and bat algorithm (BA). performed by Parpinelli and Lopes [32]. Furthermore, Tsai et al. proposed an evolving bat algorithm (EBA) to improve the 2.1. Optimization Problem. In computer science, mathemat- performance of BA with better efficiency [ 33]. ics, management science, and control theory, optimization FirstlyproposedbyGeemetal. in 2001,harmony search (also called mathematical optimization or mathematical pro- (HS) [26] is a new metaheuristic approach for minimizing gramming) means the selection of an optimal solution from possibly nondifferentiable and nonlinear functions in con- some set of feasible alternatives. In general, an optimization tinuous space. HS is inspired by the behavior of a musician’s problem includes minimizing or maximizing a function by improvisation process, where every musician attempts to systematically selecting input values from a given feasible improve its tune so as to create optimal harmony in a real- set and calculating the value of the function. More gener- world musical performance processes. HS algorithm origi- ally, optimization consists of ndin fi g the optimal values of nates in the similarity between engineering optimization and some objective function within a given domain, including a music improvisation, and the engineers search for a global number of different types of domains and different types of optimal solution as determined by an objective function, objective functions [34]. just like the musicians strive for ndin fi g aesthetic harmony A global optimization problem can be described as as determined by aesthetician. In music improvisation, each follows. musician chooses any pitch within the feasible range, together producing one harmony vector. If all the pitches produce a Given: a function 𝑓 :𝐷→𝑅 from some set 𝐷 to the good solution, that experience is reserved in each variable’s real numbers. memory, and the possibility of producing a better solution Sought: a parameter 𝑥 in 𝐷 such that 𝑓(𝑥 )≤𝑓(𝑥) is also increased next time. Furthermore, this new approach 0 0 for all 𝑥 in 𝐷 (“minimization”) or such that 𝑓(𝑥 )≥ requires few control variables, which makes HS easy to 0 𝑓(𝑥) for all 𝑥 in 𝐷 (“maximization”). implement, more robust, and very appropriate for parallel computation. Such a formulation is named a numerical optimization BA is a powerful algorithm in exploitation (i.e., local problem. Many theoretical and practical problems may be search), but at times it may trap into some local optima, so modeled in this general framework. In general, 𝐷 is some that it cannot perform global search well. For bat algorithm, 𝑛 subset of the Euclidean space 𝑅 , oeft n specified by a group the search depends completely on random walks, so a fast of constraints, equalities, or inequalities that the components convergence cannot be guaranteed. Firstly presented here, in of 𝐷 have to satisfy. The domain 𝐷 of 𝑓 is named the search order to increase the diversity of the population for BA so space, while the elements of 𝐷 are named feasible solutions as to avoid trapping into local optima, a main improvement or candidate solutions. In general, the function 𝑓 is called of adding pitch adjustment operation in HS serving as a an objective function, cost function (minimization), or utility mutation operator is made to the BA with the aim of speeding function (maximization). An optimal solution is an available up convergence, thus making the approach more feasible solution that is the extreme of (minimum or maximum) the for a wider range of practical applications while preserving objective function. the attractive characteristics of the basic BA. aTh t is to Conventionally, the standard formulation of an optimiza- say, we combine two approaches to propose a new hybrid tion problem is stated in terms of minimization. In general, metaheuristic algorithm according to the principle of HS and unless both the feasible region and the objective function are BA, and then this improved BA method is used to search convex in a minimization problem, there may be more than theoptimal objectivefunctionvalue.Theproposedapproach ∗ one local minima. A local minimum 𝑥 is defined as a point is evaluated on fourteen standard benchmark functions that for which the following expression have ever been applied to verify optimization algorithms on continuous optimization problems. Experimental results 𝑓 (𝑥 )≤𝑓 (𝑥 ) (1) show that the HS/BA performs more ecffi iently and accu- rately than basic BA, ACO, BBO, DE, ES, GA, HS, PSO, and holds. More details about the optimization problem can be SGA. found in [35]. eTh structureofthispaper is organizedasfollows: The branch of numerical analysis and applied mathe- Section 2 describes global numerical optimization problem, matics that investigates deterministic algorithms that can the HS algorithm, and basic BA in brief. Our proposed guarantee convergence in limited time to the true optimal approach HS/BA is presented in detail in Section 3.Subse- solution of a nonconvex problem is called global numer- quently, our method is evaluated through fourteen bench- ical optimization problems. A variety of algorithms have mark functions in Section 4. In addition, the HS/BA is been proposed to solve nonconvex problems. Among them, also compared with BA,ACO,BBO,DE, ES,GA, HS,PSO heuristics algorithms can evaluate approximate solutions to and SGA. Finally, Section 5 consists of the conclusion and some optimization problems, as described in introduction proposals for future work. [36]. Journal of Applied Mathematics 3 2.2. Harmony Search. Firstly developed by Geem et al. in localrange of pitchadjustment. Actually,wecan seethatthe 2001, harmony search (HS) [26]isarelatively newmeta- pitch adjustment (3)isarandom walk. heuristic optimization algorithm, and it is based on natural Pitch adjustment is similar to the mutation operator in musical performance processes that occur when a musician GA. Also, we must appropriately set the parameter PAR to searches for an optimal state of harmony. eTh optimization control the degree of the adjustment. If PAR nears 1 (too operators of HS algorithm are speciefi d as the harmony high), then the solution is always changing and the algorithm memory (HM), which keeps the solution vectors which are may not converge at all. If it is very close to 0 (too low), then allwithinthe search space, as shownin(2); the harmony there is very little change and the algorithm may premature. memory size HMS, which represents the number of solution eTh refore, we use PAR = 0.1 ∼ 0.5 in most simulations as vectors kept in the HM; the harmony memory consideration usual. rate (HMCR); the pitch adjustment rate (PAR); the pitch For the purpose of increasing the diversity of the solu- adjustment bandwidth (bw) tions, the randomization is needed in the third component. Although adjusting pitch has a similar role, it is conn fi ed 1 1 1 1 𝑥 𝑥 ... 𝑥 fitness (𝑥 ) to certain local pitch adjustment and thus corresponds to 1 2 𝐷 [ ] 2 2 2 2 𝑥 𝑥 ... 𝑥 fitness (𝑥 ) a local search. eTh usage of randomization can make the [ ] 1 2 𝐷 [ ] HM = . (2) system move further to explore multifarious regions with . . . . [ ] . . . . [ ] . . ... . . high solution diversity in order to search for the global HMS HMS HMS HMS 𝑥 𝑥 ... 𝑥 fitness (𝑥 ) optimal solution. In real-world engineering applications, [ 1 2 𝐷 ] HS has been applied to solve many optimization problems In more detail, we can explain the HS algorithm with including function optimization, water distribution network, the help of discussing the improvisation process by a music groundwater modeling, energy-saving dispatch, structural player. When a player is improvising, he or she has three design, and vehicle routing. possible options: (1) play any well-known piece of music (a eTh mainframeofthe basicHSalgorithm canbe series of pitches in harmony) exactly from his or her memory described as showninAlgorithm 1,where 𝐷 is the number of as the harmony memory consideration rate (HMCR); (2) decision variables and rand is a random real number in inter- play something similar to a known piece in player’s memory val(0,1)drawnfromuniformdistribution.FromAlgorithm 1, (thus adjusting the pitch slightly); or (3) play totally new it is clear that there are only two control parameters in HS, or random pitch from feasible ranges. If these three options which are HMCR and PAR. are formalized for optimization, we have three correspond- ing components: employment of harmony memory, pitch 2.3. Bat Algorithm. The bat algorithm is a novel metaheuristic adjusting, and randomization. Similarly, when each decision swarm intelligence optimization method developed for the variable selects one value in the HS algorithm, it can make use global numerical optimization, in which the search algorithm of one of the above three rules in the whole HS procedure. If a is inspired by social behavior of bats and the phenomenon of new harmony vector is better than the worst harmony vector echolocation to sense distance. in the HM, the new harmony vector takes the place of the In [28], for simplicity, bat algorithm is based on idealizing worst harmony vector in the HM. This procedure is repeated some of the echolocation characteristics of bats, which are the until a stopping criterion is satisfied. following approximate or idealized rules: eTh employment of harmony memory is signicfi ant, as it is analogous to select the optimal tfi individuals in the GA (genetic algorithm). This will make sure that the best (1) all bats apply echolocation to sensedistance, andthey harmonies will be kept on to the new harmony memory. For always “know” thesurroundingsinsomemagical the purpose of using this memory more effectively, we should way; properly set the value of the parameter HMCR ∈ [0, 1]. If this rate is very approaching to 1 (too high), almost all (2) bats yfl randomly with velocity V and a xfi ed fre- the harmonies are utilized in the harmony memory, then quency 𝑓 at position 𝑥 , varying wavelength 𝜆 ,and min 𝑖 other harmonies are not explored well, resulting in potentially loudness 𝐴 to hunt for prey. They can spontaneously wrong solutions. If this rate is very close to 0 (extremely low), accommodate the wavelength (or frequency) of their only few best harmonies are chosen, and it may have a slow emittedpulsesand adjust therateofpulse emission convergence rate. Therefore, generally, HMCR = 0.7 ∼ 0.95. 𝑟∈[0,1] , depending on the proximity of their target; To adjust thepitch slightly in thesecondcomponent,an appropriate approach is to be applied to adjust the frequency (3) although the loudness can change in different ways, it ecffi iently. In theory, we can adjust the pitch linearly or is supposed that the loudness varies from a minimum nonlinearly, but in fact, linear adjustment is utilized. If 𝑥 is constant (positive) 𝐴 to a large 𝐴 . old min 0 the current solution (or pitch), then the new solution (pitch) 𝑥 is generated by new Based on these approximations and idealization, the basic steps of the bat algorithm (BA) can be described as shown in 𝑥 =𝑥 + bw (2𝜀 − 1 ), (3) new old Algorithm 2. 𝑡 𝑡 In BA,eachbat is denfi edbyits position 𝑥 ,velocity V , where 𝜀 is arandomrealnumberdrawn from auniform 𝑖 𝑖 𝑡 𝑡 distribution [0, 1].Herebwisthe bandwidth, controllingthe frequency 𝑓 , loudness 𝐴 , and the emission pulse rate 𝑟 𝑖 𝑖 4 Journal of Applied Mathematics Begin Step 1: Set the parameters and initialize the HM. Step 2: Evaluate the tfi ness for each individual in HM Step 3: while the halting criteria is not satisfied do for 𝑑=1 :𝐷 do if rand < HMCR then // memory consideration 𝑥 (𝑑 )=𝑥 (𝑑 )where 𝑎∈ (1,2,..., HMS) new 𝑎 if rand < PAR then // pitch adjustment 𝑥 (𝑑 )=𝑥 (𝑑 )+ bw × (2× rand −1) new old endif else // random selection 𝑥 (𝑑) = 𝑥 + rand ×(𝑥 −𝑥 ) new min,𝑑 max,𝑑 min−𝑑 endif endfor 𝑑 Update the HM as 𝑥 =𝑥 ,if 𝑓(𝑥 )<𝑓(𝑥 )(minimization objective) 𝑤 new new 𝑤 Update the best harmony vector found so far Step 4: end while Step 5: Post-processing the results and visualization. End. Algorithm 1: Harmony search algorithm. Begin Step 1: Initialization. Set the generation counter 𝑡=1 ; Initialize the population of NP bats 𝑃 randomly and each bat corresponding to a potential solution to the given problem; define loudness 𝐴 , pulse frequency 𝑄 𝑖 𝑖 and the initial velocities V (𝑖 = 1,2,...,NP);set pulserate 𝑟 . 𝑖 𝑖 Step 2: While the termination criteria is not satisfied or 𝑡< MaxGeneration do Generate new solutions by adjusting frequency, and updating velocities and locations/solutions [(4)–(6)] if (rand >𝑟 ) then Select a solution among the best solutions; Generate a local solution around the selected best solution end if Generate a new solution by yfl ing randomly if (rand <𝐴 & 𝑓(𝑥 )<𝑓(𝑥 )) then 𝑖 𝑖 ∗ Accept the new solutions Increase 𝑟 and reduce 𝐴 𝑖 𝑖 end if Rank the bats and find the current best 𝑥 𝑡=𝑡 + 1 ; Step 3: end while Step 4: Post-processing the results and visualization. End. Algorithm 2: Bat algorithm. in a 𝐷 -dimensional search space. The new solutions 𝑥 and all the 𝑛 bats. Generally speaking, depending on the domain size of the problem of interest, the frequency 𝑓 is assigned velocities V at time step t are given by to 𝑓 =0 and 𝑓 = 100 in practical implementation. min max 𝑓 =𝑓 +(𝑓 −𝑓 )𝛽, (4) 𝑖 min max min Initially, each bat is randomly given a frequency which is drawn uniformly from [𝑓 , 𝑓 ]. 𝑡 𝑡−1 𝑡 min max V = V +(𝑥 −𝑥 )𝑓 , (5) ∗ 𝑖 𝑖 𝑖 𝑖 For the local search part, once a solution is selected among the current best solutions, a new solution for each bat 𝑡 𝑡−1 𝑡 𝑥 =𝑥 + V , (6) 𝑖 𝑖 𝑖 is generated locally using random walk where 𝛽∈[0,1] is arandomvectordrawn from auniform distribution. Here 𝑥 is the current global best location (solu- tion) which is located after comparing all the solutions among 𝑥 =𝑥 +𝜀𝐴 , (7) new old Journal of Applied Mathematics 5 where 𝜀 ∈ [−1, 1] is a scaling factor which is a random num- from good solutions. Secondly, the mutation operator can ber, while 𝐴 =⟨𝐴 ⟩ is the average loudness of all the bats at improve the exploration of the new search space. In this way, time step 𝑡 . the strong exploration abilities of the original HS and the eTh updateofthevelocitiesandpositionsofbatsissimilar exploitation abilities of the BA can be fully developed. to the procedure in the standard PSO [21], as 𝑓 controls the For bat algorithm, as the search relies entirely on random pace and range of the movement of the swarming particles in walks, a fast convergence cannot be guaranteed. Described essence. To some degree, BA can be considered as a balanced here for the first time, a main improvement of adding combination of the standard PSO and the intensive local mutation operator is made to the BA, including three minor search controlled by the loudness and pulse rate. improvements, which are made with the aim of speeding up Furthermore, the loudness 𝐴 and the rate 𝑟 of pulse convergence, thus making the method more practical for a 𝑖 𝑖 emission update accordingly as the iterations proceed as wider range of applications, but without losing the attractive shown in (8) features of the original method. The first improvement is that we use fixed frequency 𝑡+1 𝑡 𝐴 =𝛼𝐴 , 𝑓 and loudness 𝐴 instead of various frequency 𝑓 and 𝐴 . 𝑖 𝑖 𝑖 𝑖 (8) Similar to BA, in HS/BA, each bat is defined by its position 𝑥 , 𝑡+1 0 𝑡 𝑡 𝑟 =𝑟 [1 − exp (−𝛾)] 𝑡 , 𝑖 𝑖 velocity V ,the emission pulserate 𝑟 and the xfi ed frequency 𝑖 𝑖 𝑓 , and loudness 𝐴 in a 𝑑 -dimensional search space. The new where 𝛼 and 𝛾 are constants. In essence, 𝛼 is similar to 𝑡 𝑡 solutions 𝑥 and velocities V at time step 𝑡 are given by 𝑖 𝑖 the cooling factor of a cooling schedule in the simulated annealing (SA) [37]. For simplicity, we set 𝛼=𝛾=0.9 in 𝑡 𝑡−1 𝑡 this work. V = V +(𝑥 −𝑥 )𝑓, 𝑖 𝑖 𝑖 (9) 𝑡 𝑡−1 𝑡 𝑥 =𝑥 + V , 3. Our Approach: HS/BA 𝑖 𝑖 𝑖 Basedonthe introduction of HS andBAinprevioussection, we will explain how we combine the two approaches to form where 𝑥 is the current global best location (solution) which the proposed bat algorithm with harmony search (HS/BA) in is located after comparing all the solutions among all the 𝑛 this section, which modifies the solutions with poor tfi ness in bats. In our experiments, we make 𝑓=0.5 .Throughaseries order to add diversity of the population to improve the search of simulation experiments on benchmarks in Section 4.2,it efficiency. was found that setting the parameter of pulse rate 𝑟 to 0.6 and In general, the standard BA algorithm is adept at exploit- the loudness 𝐴 to 0.95 produced the best results. ing the search space, but at times it may trap into some The second improvement is to add mutation operator in local optima, so that it cannot perform global search well. an attempt to increase diversity of the population to improve For BA, the search depends completely on random walks, so the search efficiency and speed up the convergence to optima. a fast convergence cannot be guaranteed. Firstly presented Forthelocalsearchpart,onceasolutionisselectedamongthe here, in order to increase the diversity of the population current best solutions, a new solution for each bat is generated for BA so as to avoid trapping into local optima, a main locally using random walk by (7)when 𝜉 is larger than pulse improvement of adding pitch adjustment operation in HS rate 𝑟 ,thatis, 𝜉>𝑟 ,where 𝜉∈[0,1] is a random real number serving as a mutation operator is made to the BA with the drawn from a uniform distribution; while when 𝜉≤𝑟 ,we aim of speeding up convergence, thus making the approach use pitch adjustment operation in HS serving as a mutation more feasible for a wider range of practical applications while operator updating the new solution to increase diversity of preserving the attractive characteristics of the basic BA. In the population to improve the search ecffi iency, as shown in this paper, a hybrid metaheuristic algorithm by inducing the (3). Through testing benchmarks in Section 4.2,itwas found pitch adjustment operation in HS as a mutation operator that setting the parameter of harmony memory consideration into bat algorithm, so-called harmony search/bat algorithm rate HMCR to 0.95 and pitch adjustment rate PAR to 0.1 (HS/BA), is used to optimize the benchmark functions. The produced the best results. difference between HS/BA and BA is that the mutation The last improvement is the addition of elitism scheme operator is used to improve the original BA generating a new into the HS/BA. As with other population-based optimiza- solution for each bat. In this way, this method can explore the tion algorithms, we typically incorporate some sort of elitism new search space by the mutation of the HS algorithm and in order to retain the best solutions in the population. This exploit the population information with BA, and therefore prevents the best solutions from being corrupted by pitch can avoid trapping into local optima in BA. In the following, adjustment operator. Note that we use an elitism approach to we will show the algorithm HS/BA which is an improvement save the property of the bats that has the best solution in the of HS and BA. HS/BAprocess,soevenifpitch adjustment operationruins eTh critical operator of HS/BA is the hybrid harmony its corresponding bat, we have saved it and can revert back to search mutation operator, which composes the improvisation it if needed. of harmony in HS with the BA. eTh core idea of the proposed Based on the above-mentioned analyses, the mainframe hybrid mutation operator is based on two considerations. of the harmony search/bat algorithm (HS/BA) is presented Firstly, poor solutions can take in many new used features in Algorithm 3. 6 Journal of Applied Mathematics Begin Step 1: Initialization. Set the generation counter 𝑡=1 ; initialize the population of NP bats 𝑃 randomly and each bat corresponding to a potential solution to the given problem; define loudness 𝐴 ; set frequency 𝑄 , the initial velocities V,and pulserate 𝑟 ; set the harmony memory consideration rate HMCR, the pitch adjustment rate PAR and bandwidth bw; set maximum of elite individuals retained KEEP. Step 2: Evaluate the quality 𝑓 for each bat in 𝑃 determined by the objective function 𝑓(𝑥) . Step 3: While the termination criteria is not satisfied or 𝑡< MaxGeneration do Sort the population of bats 𝑃 from best to worst by order of quality 𝑓 for each bat. Store the KEEP best bats as KEEPBAT. for 𝑖=1 :NP (all bats) do 𝑡 𝑡−1 𝑡 V = V +(V −𝑥 )×𝑄 𝑖 𝑖 𝑖 𝑡 𝑡−1 𝑡 𝑥 =𝑥 + V 𝑖 𝑖 𝑖 if (rand >𝑟 ) then 𝑡 𝑡 𝑥 =𝑥 +𝛼𝜀 end if for 𝑗=1 :𝐷 (all elements) do //Mutate if rand < HMCR then ( ) 𝑟 = ⌈NP ∗ rand⌉ 𝑥 (𝑗) = 𝑥 (𝑗)where 𝑟 ∈ 1,2,..., HMS ( ) V 𝑟 1 if (rand < PAR) then 𝑥 (𝑗) = 𝑥 (𝑗) +bw × (2× rand −1) V V endif else 𝑥 (𝑗) = 𝑥 + rand ×(𝑥 −𝑥 ) V min,𝑗 max,𝑗 min−𝑗 endif endfor 𝑗 𝑡 𝑡 𝑡 Evaluate the tn fi ess for the offsprings 𝑥 , 𝑥 , 𝑥 𝑢 𝑖 V 𝑡 𝑡 𝑡 𝑡 Select the offspring 𝑥 with the best tn fi ess among the offsprings 𝑥 , 𝑥 , 𝑥 . 𝑘 𝑢 𝑖 V if (rand <𝐴 ) then 𝑡 𝑡 𝑥 =𝑥 ; 𝑟 𝑘 end if Replace the KEEP worst bats with the KEEP best bats KEEPBAT stored. end for 𝑖 𝑡=𝑡 + 1 ; Step 4: end while Step 5: Post-processing the results and visualization; End. Algorithm 3: eTh hybrid meta-heuristic algorithm of HS/BA. 4. Simulation Experiments areACO,BA, BBO, DE,ES, GA,HS, PSO, andSGA.ACO (ant colony optimization) [38, 39] is a swarm intelligence In this section, we test the performance of the proposed meta- algorithm for solving optimization problems which is based heuristic HS/BA to global numerical optimization through a on the pheromone deposition of ants. BBO (biogeography- series of experiments conducted on benchmark functions. based optimization) [24]isanewexcellent powerfuland To allow a fair comparison of running times, all the exper- efficient evolution algorithm, developed for the global opti- iments were conducted on a PC with a Pentium IV processor mization inspired by the immigration and emigration of running at 2.0 GHz, 512 MB of RAM and a hard drive species between islands (or habitats) in search of more of 160 Gbytes. Our implementation was compiled using compatible islands. DE (differential evolution) [ 19]isasimple MATLAB R2012a (7.14) running under Windows XP3. No but excellent optimization method that uses the dieff rence commercial BA or HS tools were used in the following between two solutions to probabilistically adapt a third experiments. solution. An ES (evolutionary strategy) [40] is an algorithm that generally distributes equal importance to mutation and 4.1. General Performance of HS/BA. In order to explore recombination and that allows two or more parents to the benefits of HS/BA, in this subsection we compared its reproduce an offspring. A GA (genetic algorithm) [ 18]isa performance on global numeric optimization problem with search heuristic that mimics the process of natural evolution. nine other population-based optimization methods, which PSO (particle swarm optimization) [21] is also a swarm Journal of Applied Mathematics 7 intelligence algorithm which is based on the swarm behavior normalizations in the tables are based on different scales, so of sfi hand bird schoolinginnature. Astudgenetic algorithm values cannot be compared between the two tables. (SGA) [41] is a GA that uses the best individual at each From Table 3, we see that, on average, HS/BA is the most generation for crossover. effective at finding objective function minimum on ten of In all experiments, we will use the same parameters for the fourteen benchmarks (F02, F03, F06–F11, and F13-F14). HS, BA and HS/BA that are loudness 𝐴 = 0.95 ,pulse rate ACO is the second most effective, performing the best on the 𝑟 = 0.6 ,scaling factor 𝜀 = 0.1 , the harmony memory benchmarks F04 and F05. BBO and SGA are the third most consideration rate HMCR = 0.95, and the pitch adjustment effective, performing the best on the benchmarks F12 and rate PAR = 0.1.For ACO, BBO, DE,ES, GA,PSO,and SGA, F01, respectively, when multiple runs are made. Table 4 shows we set the parameters as follows. For ACO, initial pheromone that HS/BA performs the best on thirteen of the fourteen value 𝜏 =1𝐸 − 6 , pheromone update constant 𝑄=20 , benchmarks (F02–F14), while BBO performs the second exploration constant 𝑞 =1, global pheromone decay rate 0 best at finding objective function minimum on the only 𝜌 = 0.9, local pheromone decay rate 𝜌 = 0.5, pheromone 𝑔 𝑙 benchmark (F01) when multiple runs are made. In addition, sensitivity 𝑠=1 , and visibility sensitivity 𝛽=5 ;for BBO, statistical analysis [42] on these values obtained by the ten habitat modification probability = 1, immigration probability methods on 14 benchmark functions based on the Friedman’s bounds per gene = [0, 1], step size for numerical integration of test [43] reveals that the dieff rences in the obtained average probabilities = 1, maximum immigration and migration rates and the best function minima are statistically signicfi ant ( 𝑃= −17 −17 for each island = 1, and mutation probability = 0.005; for DE, 1.66 ∗ 10 and 𝑃 = 7.25 ∗ 10 , resp.) at the confidence level aweighting factor 𝐹 = 0.5 and a crossover constant CR = 0.5; of 5%. for the ES, the number of offspring 𝜆=10 produced in Furthermore, the computational requirements of the ten each generation and standard deviation 𝜎=1 for changing optimization methods were similar. We collected the average solutions. For the GA, we used roulette wheel selection and computational time of the optimization methods as applied single-point crossover with a crossover probability of 1 and to the 14 benchmarks discussed in this section. eTh results are a mutation probability of 0.01. For PSO, we set an inertial showninTable 3. BA was the quickest optimization method. constant = 0.3, a cognitive constant = 1, and a social constant HS/BA was the third fastest of the ten algorithms. However, for swarm interaction = 1. For the SGA, we used single-point it should be noted that in the vast majority of real-world crossover with a crossover probability of 1 and a mutation applications, it is the tfi ness function evaluation that is by far probability of 0.01. the most expensive part of a population-based optimization Well-defined problem sets are favorable for evaluating algorithm. the performance of optimization methods proposed in this Furthermore, convergence graphs of ACO, BA, BBO, paper. Based on mathematical functions, benchmark func- DE, ES, GA, HS, HS/BA, PSO, and SGA are shown in tions can be applied as objective functions to perform such Figures 1–14 which represent the process of optimization. eTh tests. The properties of these benchmark functions can be values shown in Figures 1–14 are the mean objective function easily achieved from their definitions. Fourteen different optimumachievedfrom100 MonteCarlo simulations, which benchmark functions are applied to verify our proposed are the true objective function values, not normalized. metaheuristic algorithm HS/BA. Each of the functions in this Figure 1 showsthe resultsobtainedfor theten methods study has 20 independent variables (i.e., 𝐷=20 ). when the F01 Ackley function is applied. From Figure 1, The benchmark functions described in Table 1 are stan- clearly, we can draw the conclusion that BBO is significantly dard testing functions. eTh properties of the benchmark func- superior to all the other algorithms including HS/BA during tionsare giveninTable 2.Themodalitypropertymeans the the process of optimization, while HS/BA performs the sec- number of the best solutions in the search space. Unimodal ond best on this multimodal benchmark function. Here, all benchmark functions only have an optimum, which is the the other algorithms show the almost same starting point, and global optimum. Multimodal benchmark functions have at HS/BA has a faster convergence rate than other algorithms, least two optima in their search space, indicating that they while HS/BA is outperformed by BBO aeft r 13 generations. have more than one local optima except the global optimum. Although slower, SGA eventually finds the global minimum More details of all the benchmark functions can be found in close to HS/BA. Obviously, BBO, HS/BA, and SGA outper- [6]. form ACO, BA, DE, ES, GA, HS, and PSO during the whole We set population size = 50 ,anelitism parameter process of searching the global minimum. 𝑝𝐾𝑒𝑒 = 2 , and maximum generation𝑛=𝑔𝑒𝑥 50 for each algorithm. We ran 100 Monte Carlo simulations of each Figure 2 illustrates the optimization results for F02 Fletc- her-Powell function. In this multimodal benchmark problem, algorithm on each benchmark function to get representative performances. Tables 3 and 4 illustrate the results of the it is obvious that HS/BA outperforms all other methods dur- ing the whole progress of optimization (except the generation simulations. Table 3 shows the average minima found by each algorithm, averaged over 100 Monte Carlo runs. Table 4 20–33, BBO, and HS/BA performs approximately equally). shows the absolute best minima found by each algorithm Figure 3 shows the optimization results for F03 Griewank function. From Table 3 and Figure 3,wecan seethatHS/BA over 100 Monte Carlo runs. In other words, Table 3 shows the average performance of each algorithm, while Table 4 performs thebestonthismultimodalbenchmark problem. shows the best performance of each algorithm. eTh best value Looking at Figure 3 carefully, HS/BAshows afasterconver- gencerateinitially than ACO; however, it is outperformed achieved for each test problem is shown in bold. Note that the 𝑀𝑎 𝑁𝑃 𝑖𝑗 𝑖𝑗 𝑖𝑗 𝑖𝑗 8 Journal of Applied Mathematics 𝑥𝑛 󵄨 󵄨 󵄨 󵄨 𝑖 𝑥 𝑖 󵄨 󵄨 󵄨 󵄨 𝑖 𝑖 𝑥 𝑥 𝑗 𝐷 𝑒 𝑛 𝑥 𝑖 𝑖 𝑈 𝑛 󵄨 󵄨 󵄨 󵄨 𝑖 𝑥 𝑥 𝑛 𝑗 𝑖 𝑖 𝑥 𝑥 󵄨 󵄨 󵄨 󵄨 𝑛 𝑛 𝑒 𝑖 𝑛 𝑛 𝑛 𝑛 ⃗ ⃗ ⃗ ⃗ 𝑥 𝑥 ⃗ ⃗ ⃗ ⃗ ⃗ ⃗ ⃗ 𝑓 𝐵 𝑥 𝑥 𝑓 𝑓 𝑓 𝑓 Table 1: Benchmark functions. No. Name Definition Source −0.2⋅ (1/𝑛) ∑ √ (1/𝑛) ∑ cos(2𝜋 ) F01 Ackley [3] 𝑖=1 𝑖=1 𝑓( 𝑥) = 20 + − 20 ⋅ −𝑒 ( )= ∑(𝐴 −𝐵 ),𝐴 = ∑ (𝑎 sin +𝑏 cos ) 𝑖=1 𝑗=1 [4] F02 Fletcher-Powell = ∑ (𝑎 sin +𝑏 cos ) 𝑗=1 [5] F03 Griewank 𝑓( 𝑥) = ∑ − ∏ cos ( )+1 4000 √ 𝑖=1 𝑖=1 𝑛−1 2 2 2 2 ( )= {10 sin (𝜋 )+∑ (𝑦 −1) ⋅[1+10sin (𝜋 )] +(𝑦 −1)} 1 𝑖+1 30 [6] 𝑖=1 F04 Penalty #1 +∑𝑢( , 10, 100, 4) , = 1 + 0.25 (𝑥 +1) 𝑖=1 𝑛−1 2 2 2 2 2 ( )= 0.1 { sin (3𝜋 )+∑ (𝑥 −1) ⋅[1+ sin (3𝜋 )] + ( −1) [1 + sin (2𝜋 )]} 1 𝑖+1 𝑖=1 [6] F05 Penalty #2 +∑𝑢( , 5, 100, 4) 𝑖=1 Quartic with 𝑓( 𝑥) = ∑(𝑖 ⋅ + (0, 1)) F06 [7] noise 𝑖=1 𝑓( 𝑥) = 10 ⋅ 𝑛+ ∑ (𝑥 −10⋅ cos (2𝜋 )) F07 Rastrigin [8] 𝑖=1 𝑛−1 F08 Rosenbrock 𝑓( 𝑥) = ∑ [100(𝑥 −𝑥 ) +( −1)] [7] 𝑖+1 𝑖=1 1/2 F09 Schwefel 2.26 [9] ( )= 418.9829 × − ∑ sin ( ) 𝑖=1 F10 Schwefel 1.2 [10] 𝑓( 𝑥) = ∑ ( ∑ ) 𝑖=1 𝑗=1 𝑓( 𝑥) = ∑|𝑥 |+ ∏|𝑥 | F11 Schwefel 2.22 [10] 𝑖=1 𝑖=1 ( )= max { , 1 ≤𝑖≤𝑛} F12 Schwefel 2.21 [6] Journal of Applied Mathematics 9 ⃗ ⃗ Table 1: Continued. No. Name Definition Source F13 Sphere 𝑓( 𝑥) = ∑ [7] 𝑖=1 𝑓( 𝑥) = 6 ⋅ 𝑛+ ∑ ⌊ ⌋ F14 Step [7] 𝑖=1 In benchmark function F02, the matrix elements a , b ∈(−100,100), ∈(−,) are drawn from uniform distribution [4]. 𝑛×𝑛 𝑛×𝑛 𝑛×1 In benchmark functions F04 and F05, the definition of the function 𝑢(𝑥 ,𝑎,𝑘,𝑚) is given by𝑢(𝑥 ,𝑎,𝑘,𝑚)=( −𝑎) ,𝑥 >;0,−≤ ≤;(−𝑥 −𝑎) ,𝑥 <− . 10 Journal of Applied Mathematics Table 2: Properties of benchmark functions; lb denotes lower bound, ub denotes upper bound, and opt denotes optimum point. No. Function lb ub opt Continuity Modality F01 Ackley −32.768 32.768 0 Continuous Multimodal F02 Fletcher-Powell −𝜋 𝜋 0Continuous Multimodal F03 Griewank −600 600 0 Continuous Multimodal F04 Penalty #1 −50 50 0 Continuous Multimodal F05 Penalty #2 −50 50 0 Continuous Multimodal F06 Quartic with noise −1.28 1.28 1 Continuous Multimodal F07 Rastrigin −5.12 5.12 0 Continuous Multimodal F08 Rosenbrock −2.048 2.048 0 Continuous Unimodal F09 Schwefel 2.26 −512 512 0 Continuous Multimodal F10 Schwefel 1.2 −100 100 0 Continuous Unimodal F11 Schwefel 2.22 −10 10 0 Continuous Unimodal F12 Schwefel 2.21 −100 100 0 Continuous Unimodal F13 Sphere −5.12 5.12 0 Continuous Unimodal F14 Step −5.12 5.12 0 Discontinuous Unimodal Table 3: Mean normalized optimization results in fourteen benchmark functions. eTh values shown are the minimum objective function values found by each algorithm, averaged over 100 Monte Carlo simulations. ACO BA BBO DEES GA HS HSBA PSO SGA F01 2.31 3.33 1.15 2.02 3.38 2.72 3.47 1.09 2.66 1.00 F02 24.58 25.82 1.58 8.94 24.35 5.45 15.69 1.00 13.96 1.33 F03 3.16 60.72 1.93 5.44 23.85 3.22 77.22 1.00 25.77 1.42 F04 1.00 3.0E38 4.0E32 5.6E33 2.7E38 3.1E32 1.4E39 2.3E32 4.1E36 9.6E31 F05 1.00 1.1E8 299.42 1.5E6 4.6E85.4E3 4.1E8 215.51 5.5E7 111.10 F06 489.01 6.8E3 35.32 308.29 1.8E4 274.83 1.5E4 1.00 2.5E3 10.09 F07 8.09 11.55 1.28 6.56 11.87 6.17 10.22 1.00 8.44 1.80 F08 42.25 29.01 2.29 7.59 59.99 9.05 47.85 1.00 12.04 2.15 F09 3.17 20.26 1.99 13.58 13.33 1.81 19.92 1.00 17.61 1.15 F10 1.75 3.73 1.38 2.95 4.93 1.25 4.22 1.00 2.48 1.48 F11 1.05 19.70 1.83 7.14 23.12 11.13 19.45 1.00 13.22 2.46 F12 1.86 4.03 1.00 2.99 3.91 1.92 3.74 1.38 2.38 1.12 F13 98.30 150.84 3.80 19.03 226.52 47.74 182.32 1.00 72.91 4.02 F14 7.73 120.48 3.93 13.31 102.56 11.53 146.55 1.00 63.44 3.28 TimE 2.74 1.00 1.32 1.64 1.67 1.79 2.33 1.43 2.03 1.76 The values are normalized so that the minimum in each row is 1.00. es Th e are not the absolute minima found by each algorithm, but the average minima found by each algorithm. by ACO after 5 generations. For other algorithms, although superior to all the other algorithms during the process of slower, BBO, GA, and SGA eventually find the global mini- optimization. Here, PSO shows a faster convergence rate mum close to HS/BA, while BA, DE, ES, HS, and PSO fail to initially than HS/BA; however, it is outperformed by HS/BA search the global minimum within the limited iterations. aer ft 3 generations. Figure 4 showsthe resultsfor F04Penalty #1 function. Figure 6 showsthe resultsachievedfor theten methods From Figure 4, it is obvious that HS/BA outperforms all other when using the F06 Quartic (with noise)function. From methods during the whole progress of optimization in this Table 3 and Figure 6, we can conclude that HS/BA performs multimodal benchmark function. Although slower, SGA and the best in this multimodal function. Looking carefully BBO perform the second and third best at n fi ding the global at Figure 6, PSO and HS/BA show the same initial fast minimum. convergence towards the known minimum, as the procedure Figure 5 shows the performance achieved for F05 Penalty proceeds, HS/BA gets closer and closer to the minimum, #2 function. For this multimodal function, similar to F04 while PSO comes into being premature and traps into the Penalty #1 function shown in Figure 4, HS/BA is significantly local minimum. BA, ES, GA, HS, and PSO do not manage Journal of Applied Mathematics 11 Table 4: Best normalized optimization results in fourteen benchm ark functions. eTh values shown are the minimum objective function values found by each algorithm. ACO BA BBO DEES GA HS HSBA PSO SGA F01 1.85 2.31 1.00 1.43 2.25 2.03 2.30 1.05 1.91 1.04 F02 10.42 14.48 1.09 4.22 10.39 4.03 9.91 1.00 8.28 1.33 F03 2.73 46.22 1.87 4.47 21.38 8.02 43.24 1.00 16.75 1.76 F04 4.4E65.0E61.2E31.9E42.2E63.0E4 4.2E6 1.00 3.755 3.22 F05 3.0E43.2E4 51.01 386.33 1.6E4 884.46 2.6E4 1.00 4.113 4.57 F06 58.51 992.55 6.50 24.69 808.48 59.24 746.91 1.00 189.71 2.02 F07 5.71 8.53 1.25 5.13 7.94 5.23 7.47 1.00 6.00 1.68 F08 26.42 25.03 1.48 3.70 33.52 6.16 21.50 1.00 7.75 1.45 F09 2.43 8.66 1.28 4.90 5.93 2.09 7.28 1.00 7.40 1.42 F10 1.89 4.94 1.18 2.66 3.00 2.08 2.76 1.00 2.01 1.74 F11 7.74 12.61 1.21 3.36 12.14 6.01 9.60 1.00 7.81 1.62 F12 1.33 2.33 1.44 1.69 2.08 1.74 2.11 1.00 1.70 1.21 F13 28.04 56.57 2.17 5.54 60.57 19.71 52.86 1.00 20.98 2.28 F14 4.28 54.02 1.97 4.85 33.61 10.60 49.11 1.00 19.72 1.85 The values are normalized so that the minimum in each row is 1.00. es Th e are the absolute best minima found by each algorithm. 0 5 10 15 20 25 30 35 40 45 50 Number of generations GA ACO BA HS 0 10 20 30 40 50 BBO HSBA Number of generations DE PSO GA ACO ES SGA HS BA HSBA Figure 2: Comparison of the performance of the different methods BBO DE PSO for the F02 Fletcher-Powell function. ES SGA Figure 1: Comparison of the performance of the different methods for the F01 Ackley function. GA, and PSO performing the second best; another group including BA, ES, and HS performing the worst. Figure 8 shows the results for F08 Rosenbrock function. From Table 3 and Figure 8,wecan seethatHS/BA is superior to succeed in this benchmark function within maximum to the other algorithms during the optimization process in number of generations. At last, BBO, DE, and SGA converge this relatively simple unimodal benchmark function. Also, to thevalue very closetoHS/BA. we see that, similar to the F06 Quartic (with noise)function Figure 7 shows the optimization results for the F07 shown in Figure 6, PSO shows a faster convergence rate Rastrigin function. Very clearly, HS/BA has the fastest con- initially than HS/BA; however, it is outperformed by HS/BA vergence rate at finding the global minimum and significantly aer ft 4 generations and is premature to the local minimum. outperforms all other approaches. Looking carefully at Fig- Figure 9 showsthe equivalent resultsfor theF09 Schwefel ure 7, approximately, we can divide all the algorithms into 2.26 function. From Figure 9,veryclearly,thoughHS/BA is three groups: one group including BBO, HS/BA, and SGA outperformed by BBO between the generations 10–30, it has performing the best; the other group including ACO, DE, the stable convergence rate at n fi ding the global minimum Benchmark function value Benchmark function value 12 Journal of Applied Mathematics 350 10 0 5 10 15 20 25 30 35 40 45 50 Number of generations ACO GA BA HS 0 10 20 30 40 50 BBO HSBA Number of generations DE PSO ES ACO GA SGA BA HS BBO HSBA Figure 5: Comparison of the performance of the different methods DE for the F05 Penalty #2 function. PSO ES SGA Figure 3: Comparison of the performance of the different methods for the F03 Griewank function. 10 5 0 5 10 15 20 25 30 35 40 45 50 0 0 10 20 30 40 50 Number of generations Number of generations ACO GA ACO GA BA HS BA HS BBO HSBA BBO HSBA DE PSO DE PSO ES SGA ES SGA Figure 4: Comparison of the performance of the different methods Figure 6: Comparison of the performance of the different methods for the F04 Penalty #1 function. for the F06 Quartic (with noise) function. and signicfi antly outperforms all other approaches in this multimodal benchmark function. divided into three groups: one group including BBO and Figure 10 shows the results for F10 Schwefel 1.2 function. HS/BA performing the best; the other group including ACO, From Figure 10,wecan seethatHS/BA performs farbetter GA, PSO, and SGA performing the second best; another than other algorithms during the optimization process in this group including DE, ES, and HS performing the worst. relative simple unimodal benchmark function. PSO shows Figure 11 shows the results for F11 Schwefel 2.22 function. a faster convergence rate initially than HS/BA; however, Very clearly, BBO shows a faster convergence rate initially it is outperformed by HS/BA aeft r 3 generations. Looking than HS/BA; however, it is outperformed by HS/BA aeft r 40 carefully at Figure 10, akin to F07 Rastrigin function shown in generations. At last, HS/BA reaches the optimal solution sig- Figure 7, all the algorithms except BA can be approximately nicfi antly superior to other algorithms. BBO is only inferior Benchmark function value Benchmark function value Benchmark function value Benchmark function value Journal of Applied Mathematics 13 250 6000 0 10 20 30 40 50 0 10 20 30 40 50 Number of generations Number of generations ACO GA ACO GA BA HS BA HS BBO HSBA BBO HSBA DE PSO DE PSO ES SGA ES SGA Figure 9: Comparison of the performance of the different methods Figure 7: Comparison of the performance of the different methods for the F09 Schwefel 2.26 function. for the F07 Rastrigin function. 0 5 10 15 20 25 30 35 40 45 50 Number of generations ACO GA BA HS BBO HSBA 0 10 20 30 40 50 DE PSO Number of generations ES SGA ACO GA Figure 10: Comparison of the performance of the different methods BA HS for the F10 Schwefel 1.2 function. BBO HSBA DE PSO ES SGA Figure 13 shows the results for F13 Sphere function. From Figure 8: Comparison of the performance of the different methods Table 3 and Figure 13, HS/BA has the fastest convergence for the F08 Rosenbrock function. rate at finding the global minimum and outperforms all other approaches. Looking carefully at Figure 13,for BBOand SGA, we cansee that BBOhas afasterconvergence rate than SGA, to HS/BA and performs the second best in this unimodal but SGA does finally converge to the value of BBO that is function. approaching to HS/BA. Figure 12 showsthe resultsfor F12Schwefel2.21function. Figure 14 showsthe resultsfor F14Stepfunction. Appar- Veryclearly,HS/BAhasthefastestconvergencerateatfinding ently, HS/BA shows the fastest convergence rate at ndin fi g the global minimum and significantly outperforms all other the global minimum and significantly outperforms all other approaches. approaches. Looking carefully at Figure 14,for BBOand SGA, Benchmark function value Benchmark function value Benchmark function value Benchmark function value 14 Journal of Applied Mathematics 0 5 10 15 20 25 30 35 40 45 50 20 Number of generations ACO GA BA HS 0 10 20 30 40 50 BBO HSBA Number of generations DE PSO ES ACO GA SGA BA HS BBO HSBA Figure 11: Comparison of the performance of the different methods DE for the F11 Schwefel 2.22 function. PSO ES SGA Figure 13: Comparison of the performance of the different methods for the F13 Sphere function. 0 5 10 15 20 25 30 35 40 45 50 0 10 20 30 40 50 Number of generations Number of generations ACO GA ACO GA BA HS BA HS BBO HSBA BBO HSBA DE PSO DE PSO ES SGA ES SGA Figure 14: Comparison of the performance of the different methods Figure 12: Comparison of the performance of the different methods for the F14 Step function. for the F12 Schwefel 2.21 function. we can see that BBO has a faster convergence rate than SGA, convergence rate initially, while later it converges slower but SGA does finally converge to the value of BBO. and slower to the true objective function value. At last, From the above-analyses about Figures 1–14,wecan come we must point out that, in [24], Simon compared BBO to the conclusion that our proposed hybrid metaheuristic with seven state-of-the-art EAs over 14 benchmark func- algorithm HS/BA significantly outperforms the other nine tions and a real-world sensor selection problem. eTh results algorithms.Ingeneral,ACO,BBO,and SGAare only inferior demonstrated the good performance of BBO. It is also indi- to the HS/BA, and ACO, BBO, and SGA perform better rectly demonstrated that our proposed hybrid metaheuristic than HS/BA in the benchmarks F04-F05, the benchmark F12, method HS/BA is a more powerful and efficient optimiza- and the benchmark F01, respectively. Further, benchmarks tion algorithm than other population-based optimization F04, F05, F06, F08, and F10 illustrate that PSO has a faster algorithms. Benchmark function value Benchmark function value Benchmark function value Benchmark function value Journal of Applied Mathematics 15 Table 5: Best normalized optimization results in fourteen benchmark functions with different 𝐴 .Thenumbers shownare thebestresults found aeft r 100 Monte Carlo simulations HS/BA algorithm. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.66 1.33 1.55 1.40 1.00 1.33 1.21 1.22 1.16 1.13 1.14 F02 1.25E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 3.41 11.15 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F04 2.73E4 1.00 1.25E4 1.04 1.04 1.04 1.04 1.04 1.04 1.00 1.00 F05 5.03E5 5.30 1.33 1.68E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F06 1.00 2.56 5.47 4.73 17.81 9.90 2.89 6.74 9.90 2.60 5.57 F07 10.14 14.92 4.35 1.10 1.10 15.37 1.01 1.00 1.00 1.00 1.00 F08 38.49 1.08 14.06 1.08 1.08 1.08 99.01 1.01 1.00 1.00 1.00 F09 285.18 404.58 1.38 4.74 1.19 1.19 1.19 367.91 1.01 1.00 1.00 F10 665.22 4.68 1.69E3 15.27 1.18 1.18 1.18 1.18 1.21 1.00 1.00 F11 20.03 1.00 9.96 11.54 39.22 9.96 9.96 9.96 9.96 38.97 8.49 F12 10.32 1.00 13.22 3.23 14.69 2.79 2.79 2.79 2.79 2.79 19.31 F13 1.46 4.15 2.84 4.47 1.15 1.56 1.00 1.00 1.00 1.00 1.00 F14 527.52 13.57 3.95 1.01 1.15 12.91 1.00 1.00 1.00 1.00 1.00 14 2 2 3 3 5 6 7 10 10 4.2. Inu fl ence of Control Parameter. In [28], Dr. Yang con- 4.2.2. Pulse Rate: 𝑟 . Tables 7 and 8 recorded the results cluded that if the parameters in BA can be adjusted properly, performed on the benchmark problem with the pulse rate it can outperform GA, HS (harmony search) [26], and PSO. 𝑟 = 0, 0.1, 0.2, . . . , 0.9, 1.0 and xfi ed loudness 𝐴 = 0.95 , eTh choice of the control parameters is of vital importance harmony memory consideration rate HMCR = 0.95, and for different problems. To investigate the influence of the pitch adjustment rate PAR = 0.1. From Tables 7 and 8,we loudness 𝐴 ,pulse rate 𝑟 , harmony memory consideration rate can evidently conclude that HS/BA performs signicfi antly HMCR, and pitch adjustment rate PAR on the performance better on bigger 𝑟 (>0.5) than on smaller 𝑟 (<0.5), and HS/BA of HS/BA, we carry out this experiment in the benchmark performs the best when r is equal or very close to 0.6. So, we problem with different parameters. All other parameter set 𝑟 = 0.6 in other experiments. settings are kept unchanged (unless noted otherwise in the following paragraph). eTh results are recorded in Tables 5– 4.2.3. Harmony Memory Consideration Rate: 𝑅𝐶 . Tables 12 after 100 Monte Carlo runs. Among them, Tables 5, 7, 9, 9 and 10 recorded the results performed in the bench- and 11 show the best minima found by HS/BA algorithm over mark problem with the harmony memory consideration rate 100 Monte Carlo runs. Tables 6, 8, 10,and 12 show the average HMCR = 0, 0.1, 0.2, . . . , 0.9, 1.0, xfi ed loudness 𝐴 = 0.95 , minima found by HS/BA algorithm, averaged over 100 Monte pulse rate 𝑟 = 0.6 ,and pitchadjustmentratePAR =0.1.From Carlo runs. In other words, Tables 5, 7, 9, 11 and Tables 6, Tables 9 and 10,wecan recognizethatthe function values 8, 10, 12 show the best and average performance of HS/BA evaluated by HS/BA are better/smaller on bigger HMCR algorithm, respectively. In each table, the last row is the total (>0.5) than on smaller r (<0.5), and most benchmarks reach number of functions on which HS/BA performs the best with minimum when HMCR is equal or very close to 1. So, we set some parameters. HMCR = 0.95 in other experiments. 4.2.1. Loudness: 𝐴 . Tables 5 and 6 recorded the results 4.2.4. Pitch Adjustment Rate: 𝑅 . Tables 11 and 12 recorded the results performed on the benchmark problems with the performed in the benchmark problem with the loudness 𝐴= pitch adjustment rate PAR = 0, 0.1, 0.2, . ..,0.9,1.0,fixed 0, 0.1, 0.2, . . . , 0.9, 1.0 and xe fi d pulse rate 𝑟 = 0.6 ,harmony memory consideration rate HMCR = 0.95, and pitch adjust- loudness 𝐴 = 0.95 ,pulse rate 𝑟 = 0.6 ,and harmony memory consideration rate HMCR =0.95.FromTables 11 ment rate PAR = 0.1. From Tables 5 and 6,obviously,itcan be and 12, we can recognize that the function values for HS/BA seen that (i) for the three benchmark functions F02, F03, and F04, HS/BA performs slightly differently, that is to say, these vary little with the increasing PAR, and HS/BA reaches optimum/minimum in most benchmarks when PAR is equal three benchmark functions are insensitive to the parameter 𝐴 .(ii) Forbenchmark functionsF01,F06,F11, andF12, or very close to 0.1. So, we set PAR =0.1 in other experiments. HS/BA performs better on smaller 𝐴 (<0.5). (iii) However, for functions F05, F07–10, F13, and F14, HS/BA performs better 4.3. Discussion. In the HS/BA, the bats yfl in the sky using on bigger 𝐴 (>0.5). In sum, HS/BA performs the best when 𝐴 echolocation to find food/prey (i.e., best solutions). Four is equal or very close to 1.0. So, we set 𝐴 = 0.95 ,which is very other parameters are the loudness (𝐴 ), therateofpulse close to 0.9 and 1.0 in other experiments. emission (𝑟 ), harmony memory consideration rate (HMCR), 𝑃𝐴 𝐻𝑀 16 Journal of Applied Mathematics Table 6: Mean normalized optimization results in fourteen benchmark functions with different A. eTh numbers shown are the minimum objective function values found by HS/BA algorithm, averaged over 100 Monte Carlo simulations. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.01 1.01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F02 1.52 1.00 1.46 1.17 1.08 1.37 1.88 1.65 1.67 1.79 1.48 F03 6.69 10.41 1.06 1.01 1.03 1.03 1.01 1.02 1.05 1.00 1.00 F04 201.31 1.26 4.10 4.21 3.33 2.72 1.00 5.19 2.91 1.00 3.11 F05 591.22 16.82 4.25 5.87 1.01 4.14 24.04 1.00 3.68 7.33 9.00 F06 1.00 4.90E4 637.99 258.71 4.95E4 2.18 2.17 2.17 2.18 2.17 2.17 F07 8.57 2.21E6 6.43 272.34 110.48 2.11E4 1.02 1.01 1.00 1.00 1.00 F08 49.80 1.14E41.72E4 161.60 224.59 91.19 1.74E4 1.03 1.11 1.00 1.00 F09 82.37 1.73E6 1.06 3.14 74.45 103.10 42.26 7.94E3 1.00 1.37 1.29 F10 90.31 1.45E32.45E52.32E3 23.34 22.34 31.38 12.89 2.34E31.40 1.00 F11 4.98 1.00 3.15E4 2.33 14.41 520.23 443.65 616.26 249.91 4.78E42.14 F12 3.69 2.10E4 4.57E4 1.00 2.12E4 266.35 233.13 198.80 276.14 112.01 2.14E4 F13 1.90 8.25 4.48E62.14E6 1.00 6.15 254.51 222.75 189.96 263.86 107.01 F14 66.91 1.95E51.17E41.24E3 21.04 1.86E329.40 23.92 1.00 18.35 24.75 12 1 2 2 1 2 2 4 5 5 Table 7: Best normalized optimization results in fourteen benchmark functions with different 𝑟 .Thenumbers shownare thebestresults found aeft r 100 Monte Carlo simulations of HS/BA algorithm. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.84 1.84 1.84 4.84 1.59 1.71 1.00 1.34 1.09 1.16 1.01 F02 9.44E3 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 3.73 5.46 1.86 1.86 1.86 1.86 1.00 1.72 1.21 1.61 1.01 F04 11.42 1.11 24.29 1.23 1.26 1.00 1.29 1.29 1.29 1.29 1.29 F05 1.77E5 2.30 1.15 1.09E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F06 62.44 29.96 48.58 15.60 26.34 7.80 1.82 1.00 35.20 2.39 1.55 F07 8.31 4.48 2.58 1.29 1.29 3.49 1.00 1.00 1.00 1.00 1.00 F08 9.61 1.18 4.20 1.37 1.37 1.37 6.95 1.01 1.01 1.00 1.00 F09 373.94 444.22 1.68 3.35 1.68 1.68 1.00 291.98 1.01 1.01 1.01 F10 386.93 3.16 13.97 4.63 1.58 1.58 1.00 1.58 309.31 1.58 1.01 F11 42.55 1.00 18.29 21.35 20.18 11.11 19.04 21.35 15.81 18.76 11.55 F12 114.40 1.00 141.84 44.48 125.47 44.48 44.48 44.48 44.48 44.48 69.43 F13 6.39 6.37 6.01 2.96 3.02 1.00 1.52 1.39 1.00 2.57 2.49 F14 120.52 4.04 2.33 1.00 1.16 3.40 1.00 1.00 1.16 1.16 1.16 03 1 2 2 4 8 54 4 4 and pitch adjustment rate (PAR). The appropriate update for searching globally at the same time. It succeeds in doing thepitchadjustmentrate(PAR)andtherateofpulseemission this due to the local search via harmony search algorithm (𝑟 ) balances the exploration and exploitation behavior of each and global search via bat algorithm concurrently. A similar bat, respectively, as the loudness usually decreases once a bat behavior may be performed in the PSO by using multiswarm has found its prey/solution, while the rate of pulse emission from a particle population initially [44]. However, HS/BA’s increases in order to raise the attack accuracy. advantages include performing simply and easily, and only For all of the standard benchmark functions that have having four parameters to regulate. eTh work carried out been considered, the HS/BA has been demonstrated to here demonstrates the HS/BA to be robust over all kinds of performbetterthanorbeequal to thestandardBAand other benchmark functions. acclaimed state-of-the-art population-based algorithms with Benchmark evaluation is a good way for verifying the the HS/BA performing significantly better in some functions. performanceofthe metaheuristicalgorithms, butitalsohas The HS/BA performs excellently and efficiently because of limitations. First, we did not make any special effort to tune its ability to simultaneously carry out a local search, still the optimization algorithms in this section. Different tuning Journal of Applied Mathematics 17 Table 8: Mean normalized optimization results in fourteen benchmark functions with different 𝑟 . eTh numbers shown are the minimum objective function values found by HS/BA algorithm, averaged over 100 Monte Carlo simulations. 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.84 1.99 1.86 1.94 1.59 1.00 1.42 1.34 1.09 1.16 1.71 F02 3.31 5.38 4.57 3.08 3.82 1.14 1.00 2.89 2.79 4.02 3.11 F03 3.73 5.46 5.33 2.29 3.36 2.41 1.00 1.72 1.21 1.61 1.36 F04 11.42 1.11 24.29 1.49E41.13E33.94E32.78E419.66 175.22 1.14 1.00 F05 16.09 128.62 32.69 24.46 64.40 12.77 29.59 76.17 4.56 1.00 4.46 F06 62.44 26.96 48.58 15.60 26.34 7.80 1.82 1.00 35.20 2.39 1.55 F07 3.30 1.78 1.34 1.03 1.23 1.39 1.55 1.00 1.19 1.36 3.49 F08 4.88 3.93 3.94 1.94 2.19 3.83 3.53 4.33 1.00 2.27 3.14 F09 1.39 1.66 1.14 1.00 1.21 1.15 1.15 1.09 1.08 1.04 1.09 F10 7.21 9.94 8.75 3.60 8.46 6.11 4.83 4.14 5.76 1.00 2.71 F11 3.82 4.23 3.73 2.05 1.81 1.00 1.71 2.20 1.42 1.68 1.89 F12 1.64 2.17 2.04 1.47 1.80 1.86 1.00 1.21 1.13 1.34 1.56 F13 6.39 6.37 6.01 2.96 3.02 1.90 1.52 1.39 1.00 2.57 2.49 F14 5.74 15.43 7.42 6.82 6.44 3.17 1.00 4.85 2.21 1.74 2.09 00 0 1 0 2 4 22 2 1 Table 9: Best normalized optimization results in fourteen benchmark functions with different HMCR. The numbers shown are the best results found aer ft 100 Monte Carlo simulations of HS/BA algorithm. HMCR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.64 1.64 1.64 1.64 1.64 1.64 1.64 1.51 1.28 1.00 1.63 F02 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 19.82 26.49 3.71 3.71 3.71 3.71 3.71 3.71 3.71 2.06 1.00 F04 1.26E5 1.00 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F05 6.07E55.34 1.00 1.87E4 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F06 108.92 246.31 365.04 345.40 338.57 234.65 143.49 45.28 23.30 1.00 25.75 F07 9.22 14.28 5.34 1.00 1.00 9.30 1.00 1.00 1.00 1.00 1.00 F08 18.10 1.08 7.72 1.08 1.08 1.08 35.94 1.00 1.00 1.00 1.00 F09 280.04 391.61 1.27 6.81 1.27 1.27 1.27 227.95 1.00 1.00 1.00 F10 551.65 8.76 1.76E3 11.70 1.64 1.64 1.64 1.64 274.48 1.00 1.00 F11 14.67 1.00 6.18 6.18 13.99 6.18 6.18 6.18 4.40 2.69 6.16 F12 7.68 1.00 11.10 2.73 10.21 2.73 2.73 2.73 2.73 2.73 4.73 F13 7.72 26.75 15.90 17.76 6.12 10.40 6.12 5.95 2.55 1.00 2.25 F14 537.16 14.28 5.34 1.00 1.00 7.13 1.00 1.00 1.00 1.00 1.00 04 2 4 5 3 5 6 7 11 9 parameter values in the optimization algorithms might result here are promising for HS/BA and indicate that this novel in significant differences in their performance. Second, real- method might be able to nd fi a niche among the plethora of world optimization problems may not have much of a population-based optimization algorithms. relationship to benchmark functions. Third, benchmark tests We note that CPU time is a bottleneck to the implemen- might result in different conclusions if the grading criteria or tation of many population-based optimization algorithms. problem setup change. In our work, we examined the mean If an algorithm cannot converge fast, it will be impractical, and best results obtained with a certain population size and sinceitwould take toolongtofind an optimalorsuboptimal aer ft a certain number of generations. However, we might solution. HS/BA does not seem to require an unreasonable arrive at different conclusions if (for example) we change the amount of computational eo ff rt; of the ten optimization algo- generation limit, or look at how many generations it takes to rithms compared in this paper, HS/BA was the third fastest. reach a certain function value, or if we change the population Nevertheless, ndin fi g mechanisms to accelerate HS/BA could size. In spite of these caveats, the benchmark results shown be an important area for further research. 18 Journal of Applied Mathematics Table 10: Mean normalized optimization results in fourteen benchmark functions with different HMCR. The numbers shown are the best results found aer ft 100 Monte Carlo simulations of HS/BA algorithm. HMCR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.01 1.00 1.01 F02 747.56 7.87 5.25 3.43 5.09 7.10 3.07 1.87 1.60 1.00 1.01 F03 9.13 3.12E4 1.09 1.07 1.05 1.06 1.05 1.02 1.01 1.01 1.00 F04 2.10𝐸6 1.00𝐸4 3.95𝐸4 9.48𝐸3 5.97𝐸3 2.95𝐸3 1.51𝐸3 1.50𝐸3 46.94 1.00 2.37 F05 3.24𝐸6 3.27𝐸4 2.09𝐸4 4.14𝐸4 7.13𝐸3 1.54𝐸3 8.93𝐸3 1.53𝐸3 629.48 1.00 203.73 F06 1.00 5.94𝐸4 422.04 632.15 6.00𝐸4 1.92 1.91 1.91 1.91 1.91 1.91 F07 10.63 2.07𝐸6 9.00 216.72 324.55 3.07E4 1.04 1.03 1.02 1.01 1.00 F08 59.50 9.88𝐸3 3.04E4 141.80 217.01 324.68 3.07E4 1.07 1.07 1.03 1.00 F09 226.87 4.95𝐸6 3.08 8.91 114.22 173.60 259.41 2.45E41.45 1.00 1.71 F10 257.37 2.83𝐸4 9.35E51.37E4 97.09 66.56 99.25 149.40 1.39E4 1.00 2.73 F11 6.07 1.00 1.83E4 2.02 16.61 390.40 262.99 403.00 603.61 5.72E41.84 F12 3.00 2.79E43.60E4 1.00 2.82E4 270.95 194.14 130.77 200.40 300.16 2.84E4 F13 2.32 9.66 5.61E61.89E6 1.00 8.15 267.78 191.86 129.23 198.05 296.65 F14 184.66 3.84E51.17E41.85E3 1.00 5.71E3 25.11 55.39 39.60 26.57 41.49 1 1 0 1 2 000 0 63 Table 11: Best normalized optimization results in fourteen benchmar k functions with different PAR. eTh numbers shown are the best results found aeft r 100 Monte Carlo simulations of HS/BA algorithm. PAR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F02 3.91E3 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F03 1.00 4.25 1.51 2.20 2.20 2.12 2.20 1.99 2.20 1.28 1.80 F04 1.00 2.55 65.54 2.55 2.55 2.55 2.55 2.55 2.55 2.55 2.55 F05 2.52 1.00 1.71 1.69E3 1.71 1.71 1.71 1.71 1.71 1.71 1.71 F06 1.00 59.87 34.18 22.85 46.52 23.44 43.54 44.84 36.88 18.99 8.12 F07 8.07 1.00 1.48 2.55 2.55 13.06 2.55 2.55 2.55 2.55 2.55 F08 5.43 1.00 1.92 1.00 1.00 1.00 11.45 1.00 1.00 1.00 1.00 F09 76.72 2.52 1.17 1.00 1.71 1.71 1.71 155.56 1.71 1.71 1.71 F10 929.10 1.48 1.00 4.91 2.55 2.55 2.55 2.22 2.35E3 2.55 2.55 F11 3.18E3 1.00 5.82E33.99E33.39E35.82E3 4.91E35.82E35.82E39.58E35.82E3 F12 305.06 1.00 537.28 97.34 187.58 97.34 97.34 97.34 97.34 97.34 398.27 F13 1.92 4.22 5.87 10.07 12.98 4.66 1.48 4.01 3.17 1.00 4.09 F14 88.12 1.00 1.48 2.55 2.55 4.91 2.55 2.55 2.55 2.55 2.55 4 8 3 4 3 323 3 4 3 5. Conclusion and Future Work if it has better tfi ness. This selection scheme is rather greedy, which oeft n overtakes original HS and BA. The HS/BA This paper proposed a hybrid metaheuristic HS/BA method attempts to take merits of the BA and the HS in order to for optimization problem. We improved the BA by combining avoid all bats getting trapped in inferior local optimal regions. original harmony search (HS) algorithm and evaluating the The HS/BA enables the bats to have more diverse exemplars HS/BA on multimodal numerical optimization problems. to learn from, as the bats are updated each iteration and A novel type of BA model has been presented, and an also form new harmonies to search in a larger search space. improvement is applied to mutate between bats using har- This new method can speed up the global convergence rate mony search algorithmduringthe processofbatsupdating. without losing the strong robustness of the basic BA. From Using the original configuration of the bat algorithm, we the analysis of the experimental results, we observe that the generate the new harmonies based on the newly generated proposed HS/BA makes good use of the information in past bat each iteration aeft r bat’s position has been updated. eTh solutions more eeff ctively to generate better quality solutions new harmony vector substitutes the newly generated bat only frequently, when compared to the other population-based Journal of Applied Mathematics 19 Table 12: Mean normalized optimization results in fourteen benchmar k functions with different PAR. eTh numbers shown are the best results found aeft r 100 Monte Carlo simulations of HS/BA algorithm. PAR 0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1.0 F01 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 1.00 F02 4.07 1.00 1.00 4.52 2.77 2.57 3.41 5.36 3.69 4.89 1.03 F03 1.00 1.27E4 1.34 1.35 1.35 1.35 1.34 1.35 1.34 1.34 1.34 F04 1.15 81.27 9.39E3 1.00 1.01 1.01 129 1.00 1.00 1.01 1.01 F05 345.73 50.61 86.19 9.07E3 25.01 1.08 1.01 1.91 3.33 1.00 1.18 F06 1.00 5.42E6 4.27E4 4.74E45.48E6 577.88 577.88 577.88 577.88 577.87 577.87 F07 4.86 1.53 1.00 95.43 105.96 1.22E4 1.34 1.36 1.32 1.33 1.35 F08 12.69 76.00 8.76E3 17.03 69.13 76.72 8.85E3 1.10 1.05 1.04 1.00 F09 63.45 230.47 1.82 1.67 12.78 48.60 53.49 6.09E3 1.11 1.30 1.00 F10 133.55 12.51 1.26 1.97E3 11.48 4.64 16.45 18.93 1.99E3 1.00 1.88 F11 63.02 1.00 6.39E3 79.44 59.24 3.96E31.42E35.81E3 6.45E37.45E579.81 F12 3.90 8.81E3 8.90E3 1.00 8.90E3 44.40 47.78 17.25 70.11 77.84 8.99E3 F13 1.00 21.66 2.07E3 6.69 5.80 4.30 271.67 282.46 105.39 429.26 476.62 F14 46.14 1.00 36.10 55.93 1.22 6.40E3 42.15 32.89 34.81 12.72 51.29 44 33 1 1 1 2 2 3 3 optimization algorithms such as ACO, BA, BBO, DE, ES, References GA, HS, PSO, and SGA. Based on the results of the ten [1] G. Wang, L. Guo, A. H. Gandomi et al., “Lev ´ y-flight krill herd approaches on the test problems, we can conclude that the algorithm,” Mathematical Problems in Engineering,vol.2013, HS/BA significantly improves the performances of the HS Article ID 682073, 14 pages, 2013. and the BA on most multimodal and unimodal problems. [2] A.H.Gandomi,X.S.Yang, S. Talatahari,and A. H. Alavi, In this work, 14 benchmark functions are used to evaluate Metaheuristic Applications in Structures and Infrastructures, the performance of our approach; we will test our approach Elsevier, Waltham, Mass, USA, 2013. on more problems, such as the high-dimensional (𝐷≥ [3] D. H. Ackley, “An empirical study of bit vector function 20) CEC 2010 test suit [45] and the real-world problems. optimization,” in Genetic Algorithms and Simulated Annealing, Moreover, we will compare HS/BA with other metaheuristic L. Davis, Ed., pp. 170–204, Pitman, London, UK, 1987. method, such as DLHS [46, 47], SGHS [48], adaptive DE [4] R. Fletcher and M. J. D. Powell, “A rapidly convergent descent [49], CLPSO [50], and DMS-PSO [51]. In addition, we only method for minimization,” The Computer Journal ,vol.6,no. 2, consider the unconstrained function optimization in this pp.163–168,1963. work. Our future work consists on adding the diversity rules [5] A. O. Griewank, “Generalized descent for global optimization,” into HS/BA for constrained optimization problems, such as JournalofOptimizationTheory andApplications ,vol.34, no.1, constrained real-parameter optimization CEC 2010 test suit pp.11–39,1981. [52]. [6] X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made In the eld fi of optimization, there are many issues worthy faster,” IEEE Transactions on Evolutionary Computation,vol.3, of further study, and efficient optimization method should be no.2,pp. 82–102,1999. developed depending on the analysis of specific engineering [7] K.A.DeJong, Analysis of the Behavior of a Class of Genetic problem. Our future work will focus on the two issues: on Adaptive Systems, University of Michigan, 1975. the one hand, we would apply our proposed approach HS/ [8] L. A. Rastrigin, Extremal Control Systems,Nauka,Moscow, Rus- BA to solve practical engineering optimization problems, sian, 1974, Theoretical Foundations of Engineering Cybernetics and, obviously, HS/BA can become a fascinating method for Series. real-world engineering optimization problems; on the other [9] S.-K. S. Fan and J.-M. Chang, “Dynamic multi-swarm particle hand, we would develop new meta-hybrid approach to solve swarm optimizer using parallel PC cluster systems for global optimization problem. optimization of large-scale multimodal functions,” Engineering Optimization,vol.42, no.5,pp. 431–451, 2010. [10] W. Gong, Z. Cai, and C. X. Ling, “DE/BBO: a hybrid differential Acknowledgments evolution with biogeography-based optimization for global numerical optimization,” Soft Computing ,vol.15, no.4,pp. 645– This work was supported by State Key Laboratory of Laser 665, 2010. Interaction with Material Research Fund under Grant no. [11] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “A modified SKLLIM0902-01 and Key Research Technology of Electric- firefly algorithm for UCAV path planning,” International Jour- discharge Non-chain Pulsed DF Laser under Grant no. LXJJ- nalofHybridInformation Technology,vol.5,no. 3, pp.123–144, 11-Q80. 2012. 20 Journal of Applied Mathematics [12] G.Wang,L.Guo,H.Duan,L.Liu,andH.Wang,“Ahybridmeta- [30] X. S. Yang and A. H. Gandomi, “Bat algorithm: a novel approach heuristic DE/CS algorithm for UCAV path planning,” Journal of for global engineering optimization,” Engineering Computa- Information and Computational Science,vol.9,no. 16,pp. 1–8, tions,vol.29, no.5,pp. 464–483, 2012. [31] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “A Bat algo- [13] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, “Path rithm with mutation for UCAV path planning,” The Scientific planning for uninhabited combat aerial vehicle using hybrid World Journal,vol.2012, ArticleID418946, 15 pages, 2012. meta-heuristic DE/BBO algorithm,” Advanced Science, Engi- [32] R. Parpinelli and H. Lopes, “New inspirations in swarm intelli- neering and Medicine,vol.4,no. 6, pp.550–564,2012. gence: a survey,” International Journal of Bio-Inspired Computa- [14] G. Wang,L.Guo,H.Duan, H. Wang,L.Liu,and M. Shao,“A tion, vol. 3, no. 1, pp. 1–16, 2011. hybrid meta-heuristic DE/CS algorithm for UCAV three- [33] P. W. Tsai,J.S.Pan,B.Y.Liao, M. J. Tsai,and V. Istanda, “Bat dimension path planning,” The Scientific World Journal ,vol. algorithm inspired algorithm for solving numerical optimiza- 2012, Article ID 583973, 11 pages, 2012. tion problems,” AppliedMechanics andMaterials,vol.148,pp. [15] H.Duan, W. Zhao,G.Wang, andX.Feng, “Test-sheetcomposi- 134–137, 2012. tion using analytic hierarchy process and hybrid metaheuristic [34] Wikipedia, “Mathematical optimization,” http://en.wikipedia algorithm TS/BBO,” Mathematical Problems in Engineering,vol. .org/wiki/Numerical optimization. 2012,Article ID 712752,22pages,2012. [35] W. Y. Zhang, S. Xu, and S. J. Li, “Necessary conditions for weak [16] G. Wang, L. Guo, H. Duan, L. Liu, and H. Wang, “Dynamic sharp minima in cone-constrained optimization problems,” deployment of wireless sensor networks by biogeography based Abstract and Applied Analysis, vol. 2011, Article ID 909520, 11 optimization algorithm,” Journal of Sensor and Actuator Net- pages, 2012. works,vol.1,no. 2, pp.86–96,2012. [36] Wikipedia, “Global optimization,” http://en.wikipedia.org/ [17] X. S. Yang,A.H.Gandomi,S.Talatahari, andA.H.Alavi, wiki/Global optimization. Metaheuristics in Water, Geotechnical and Transport Engineer- [37] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, “Optimization ing, Elsevier, Waltham, Mass, USA, 2013. by simulated annealing,” Science,vol.220,no.4598,pp.671–680, [18] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, New York, NY, USA, 1998. [38] M. Dorigo and T. Stutzle, Ant Colony Optimization, eTh MIT [19] R. Storn and K. Price, “Differential evolution—a simple and Press, Cambridge, Mass, USA, 2004. efficient heuristic for global optimization over continuous [39] M. Dorigo, Optimization, Learning and Natural Algorithms, spaces,” JournalofGlobalOptimization,vol.11, no.4,pp. 341– Politecnico di Milano, Italy, 1992. 359, 1997. [40] H. G. Beyer, The Theory of Evolution Strategies , Springer, New [20] S. Das and P. N. Suganthan, “Differential evolution: a survey of York, NY, USA, 2001. the state-of-the-art,” IEEE Transactions on Evolutionary Com- [41] W. Khatib and P. Fleming, “eTh stud GA: a mini revolution?” in putation,vol.15, no.1,pp. 4–31,2011. Parallel Problem Solving From Nature, pp. 683–691, 1998. [21] J. Kennedy and R. Eberhart, “Particle swarm optimization,” [42] J. Derrac, S. Garc´ıa,D.Molina, andF.Herrera,“Apractical in Proceedings of the IEEE International Conference on Neural tutorial on the use of nonparametric statistical tests as a Networks, pp. 1942–1948, Perth, Australia, December 1995. methodology for comparing evolutionary and swarm intelli- [22] A. H. Gandomi, G. J. Yun, X.-S. Yang, and S. Talatahari, “Chaos- gence algorithms,” Swarm and Evolutionary Computation,vol. enhanced accelerated particle swarm optimization,” Communi- 1, no. 1, pp. 3–18, 2011. cations in Nonlinear Science and Numerical Simulation,vol.18, no. 2, pp. 327–340, 2013. [43] M. Friedman, “A comparison of alternative tests of significance for the problem of 𝑚 rankings,” The Annals of Mathematical [23] S. Talatahari, X. S. Yang, R. Sheikholeslami, and A. H. Gandomi, Statistics,vol.11, no.1,pp. 86–92, 1940. “Optimal parameter estimation for muskingum model using hybrid CSS and PSO method,” Journal of Computational and [44] R. Brits, A. P. Engelbrecht, and F. van den Bergh, “Locating Theoretical Nanoscience .Inpress. multiple optima using particle swarm optimization,” Applied [24] D. Simon, “Biogeography-based optimization,” IEEE Transac- Mathematics and Computation,vol.189,no. 2, pp.1859–1883, tions on Evolutionary Computation,vol.12, no.6,pp. 702–713, [45] K. Tang, X. Li, P. N. Suganthan, Z. Yang, and T. Weise, “Bench- [25] G. Wang, L. Guo, H. Duan, L. Liu, H. Wang, and M. Shao, mark functions for the CEC’2010 special session and competi- tion on large scale global optimization,” Nature Inspired Com- “Hybridizing harmony search with biogeography based opti- mization for global numerical optimization,” JournalofCom- putation and Applications Laboratory, USTC, Hefei, China, putational and eor Th etical Nanoscience .Inpress. 2010. [26] Z. W. Geem, J. H. Kim, and G. V. Loganathan, “A new heuristic [46] Q.-K.Pan,P.N.Suganthan,J.J.Liang,and M. F. Tasgetiren,“A optimization algorithm: harmony search,” Simulation,vol.76, local-best harmony search algorithm with dynamic subpopula- no. 2, pp. 60–68, 2001. tions,” Engineering Optimization,vol.42, no.2,pp. 101–117, 2010. [27] G. Wang,L.Guo,H.Duan, H. Wang,L.Liu,and J. Li,“Incor- [47] Q.-K.Pan,P.N.Suganthan,J.J.Liang,and M. F. Tasgetiren, porating mutation scheme into krill herd algorithm for global “A local-best harmony search algorithm with dynamic sub- numerical optimization,” Neural Computing and Applications, harmony memories for lot-streaming flow shop scheduling problem,” Expert Systems with Applications,vol.38, no.4,pp. 3252–3259, 2011. [28] X. S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Frome, UK, 2nd edition, 2010. [48] Q.-K.Pan,P.N.Suganthan,M.F.Tasgetiren, andJ.J.Liang, [29] A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, “Bat “A self-adaptive global best harmony search algorithm for algorithm for constrained optimization tasks,” Neural Comput- continuous optimization problems,” Applied Mathematics and ing & Applications.Inpress. Computation,vol.216,no. 3, pp.830–848,2010. Journal of Applied Mathematics 21 [49] S. M. Islam, S. Das, S. Ghosh, S. Roy, and P. N. Suganthan, “An adaptive differential evolution algorithm with novel mutation and crossover strategies for global numerical optimization,” IEEE Transactions on Systems, Man, and Cybernetics B,vol.42, no. 2, pp. 482–500, 2012. [50] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, “Compre- hensive learning particle swarm optimizer for global optimiza- tion of multimodal functions,” IEEE Transactions on Evolution- ary Computation,vol.10, no.3,pp. 281–295, 2006. [51] S. Z. Zhao, P. N. Suganthan, Q.-K. Pan, and M. Fatih Tasge- tiren, “Dynamic multi-swarm particle swarm optimizer with harmony search,” Expert Systems with Applications,vol.38, no. 4, pp. 3735–3742, 2011. [52] R. Mallipeddi and P. Suganthan, Problem Definitions and Eval- uation Criteria for the CEC 2010 Competition on Constrained Real-Parameter Optimization, Nanyang Technological Univer- sity,Singapore,2010. Advances in Advances in Journal of Journal of Operations Research Decision Sciences Applied Mathematics Algebra Probability and Statistics Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 The Scientific International Journal of World Journal Die ff rential Equations Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 Submit your manuscripts at http://www.hindawi.com International Journal of Advances in Combinatorics Mathematical Physics Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 Journal of Journal of Abstract and Discrete Dynamics in Mathematical Problems Complex Analysis Mathematics in Engineering Applied Analysis Nature and Society Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 International Journal of Journal of Mathematics and Discrete Mathematics Mathematical Sciences Journal of International Journal of Journal of Function Spaces Stochastic Analysis Optimization Hindawi Publishing Corporation Hindawi Publishing Corporation Volume 2014 Hindawi Publishing Corporation Hindawi Publishing Corporation Hindawi Publishing Corporation http://www.hindawi.com Volume 2014 http://www.hindawi.com http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014 http://www.hindawi.com Volume 2014

Journal

Journal of Applied MathematicsUnpaywall

Published: Jan 1, 2013

There are no references for this article.