Get 20M+ Full-Text Papers For Less Than $1.50/day. Subscribe now for You or Your Team.

Learn More →

Chaotic grey wolf optimization algorithm for constrained optimization problems

Chaotic grey wolf optimization algorithm for constrained optimization problems Abstract The Grey Wolf Optimizer (GWO) algorithm is a novel meta-heuristic, inspired from the social hunting behavior of grey wolves. This paper introduces the chaos theory into the GWO algorithm with the aim of accelerating its global convergence speed. Firstly, detailed studies are carried out on thirteen standard constrained benchmark problems with ten different chaotic maps to find out the most efficient one. Then, the chaotic GWO is compared with the traditional GWO and some other popular meta-heuristics viz. Firefly Algorithm, Flower Pollination Algorithm and Particle Swarm Optimization algorithm. The performance of the CGWO algorithm is also validated using five constrained engineering design problems. The results showed that with an appropriate chaotic map, CGWO can clearly outperform standard GWO, with very good performance in comparison with other algorithms and in application to constrained optimization problems. Highlights Chaos has been introduced to the GWO to develop Chaotic GWO for global optimization. Ten chaotic maps have been investigated to tune the key parameter ‘a’, of GWO. Effectiveness of the algorithm is tested on many constrained benchmark functions. Results show CGWO's better performance over other nature-inspired optimization methods. The proposed CGWO is also used for some engineering design applications. Graphical Abstract Open in new tabDownload slide Chaotic grey wolf optimization, Firefly algorithm, Flower pollination algorithm, Particle swarm optimization algorithm 1. Introduction Constraints represent a feasible region which is nonempty and is filled with some restrictions or constraints to be followed by the solutions to solve a specific optimization problem (Karaboga and Akay, 2011). In general terms, constraints can be classified into equality constraints and inequality constraints which are represented in the form of mathematical equality and inequality equations respectively. Both types of constraints need to be satisfied by the problem's decision variables. Earlier, some deterministic methods like feasible direction approach and generalized gradient descent method were developed for solving constraint problems (Herskovits, 1986). However, due to their limited applicability and complexity of constraints, these were not effective for real world applications like structural optimization problems, economical optimization, location problems and engineering design problems like spring design, welded beam design, truss design, speed reducer design which involve many difficult equality and inequality constraints to be satisfied (Cagnina et al., 2008; Coello, 2000; Gandomi et al., 2013; Gao et al., 2010; Lee and Geem, 2004; Parsopoulos and Vrahatis, 2002). More and more meta-heuristic algorithms have been proposed to tackle these tough constrained optimization problems. These algorithms aim for tolerable velocity of convergence, a better precision, robustness, and performance. Some of the recent meta-heuristic algorithms proposed are Firefly Algorithm (FA) which is inspired by the flashing and attraction behavior of fireflies (Arora and Singh, 2013; Wang et al., 2014), Flower Pollination Algorithm (FPA) which is based on the characteristics of flowering of plants (Yang, 2012), Particle Swarm Optimization (PSO) which is inspired by the swarm behavior such as fish and bird schooling in nature (Shi and Eberhart, 1998), Bird Swarm Algorithm (BSA) which is based on the unique social interactions of bird swarms (Meng et al., 2015), ebb-tide-fish-inspired (ETFI) algorithm which is a simulation of fascinating characteristic of fish's perception of flow, sound and vibrations of tides in water (Meng et al., 2016), Jaya algorithm in which the main concept is to move the solution found so far towards the best solution and away from the worst solution (Rao, 2016), Grey Wolf Optimization (GWO) algorithm which is based on the social hunting behavior of grey wolves, Animal Migration Optimization (AMO) algorithm whose optimization process is mainly divided into two process viz. migration process and updating process with respect to animals (Luo et al., 2016), Butterfly Optimization Algorithm (BOA) which is inspired by the food foraging behavior of butterflies (Arora and Singh, 2015), Brain Storm Optimization (BSO) algorithm which is based on the simulation of brain storming process in humans (Shi, 2015), Whale Optimization Algorithm (WOA) which is inspired from the social interaction of humpback whales (Mirjalili and Lewis, 2016), Crow Search Algorithm (CSA) which mimics the clever characteristic of crows (Askarzadeh, 2016). Such meta-heuristics are being used extensively to solve complex problems like optimal wind generator design problem (Gao et al., 2010), formulation of soil classification (Alavi et al., 2010), prediction of ground soil parameters (Alavi and Gandomi, 2011). Some of the prominent metaheuristics of the literature which have already been used to tackle constrained problems are: Deb introduced a method to handle constraints using GA (Deb, 2000), Montes employed Differential Evolution (DE) Algorithm on constraint handling problems (Mezura-Montes and Coello, 2005), PSO was used by Cagnina to solve constrained optimization problems (Cagnina et al., 2008) and Karaboga used Artificial Bee Colony (ABC) algorithm to handle constraint mechanism (Karaboga and Basturk, 2007). GWO algorithm in fact, is a new meta-heuristic algorithm, inspired by the leadership behavior and unique mechanism of hunting of grey wolves. This population based meta-heuristic has the ability to avoid local optima stagnation to some extent (Yang et al., 2012). It also has good convergence ability towards the optima. In general, GWO advances itself strongly to exploitation. However, it cannot always implement global search well. Thus, in some cases, GWO fails to find global optimal solution. The search strategy used in basic GWO is mainly based on random walks. Thus, it cannot always deal with the problem successfully. With the development of the nonlinear dynamics, chaos theory has been widely used in several applications (Pecora and Carroll, 1990). In this context, one of the most famous applications is the introduction of chaos theory into the optimization methods (Yang et al., 2007). Up to now, the chaos theory has been successfully combined with several meta-heuristic optimization methods (Gandomi et al., 2013). Some major efforts in this area includes PSO (Gandomi et al., 2013), FA (Gandomi et al., 2013), BOA (Arora and Singh, 2017), GA (Han and Chang, 2013), hybridizing chaotic sequences with memetic differential evolution algorithm (Jia et al., 2011), imperialist competitive algorithm (Talatahari et al., 2012) and gravitational search algorithm (Han and Chang, 2012), Krill Herd (KH) algorithm (Wang et al., 2014), and Accelerated Particle Swarm Optimization (APSO) (Gandomi et al., 2013). In the present study, chaotic GWO (CGWO) algorithm is presented for the purpose of accelerating the convergence of GWO. Various one-dimensional chaotic maps are employed in place of the critical parameters used in GWO. Moreover, in order to examine the efficiency of the proposed CGWO in the room of constraint handling mechanism, it has been applied on some constrained benchmark functions and various classical engineering design problems viz. spring design problem, gear train design problem, welded beam design problem, pressure vessel design problem and closed coil helical spring design problem. The results of proposed CGWO on all the constrained benchmark functions have been compared with those obtained by GWO (Mirjalili et al., 2014), Firefly Algorithm (FA) (Yang and algorithm, 2010), Flower Pollination Algorithm (FPA) (Yang, 2012) and Particle Swarm Optimization (PSO) (Kennedy, 2011). On the other hand, the simulation results of all the classical engineering design problems have been compared with other state-of-art meta-heuristics discussed in the respective section. Organization of the remaining paper is as follows: Section 2: a brief introduction of GWO algorithm is given. Section 3: Description of the proposed CGWO algorithm is given in detail. Section 4: Validation of CGWO algorithm on thirteen constrained benchmark functions is performed. Section 5: Experimental study and discussion on results is done. Section 6: CGWO on various classical engineering design problems is described. Section 7: Conclusion of work along with its future scope is given. 2. Overview of grey wolf optimization algorithm Initially, Grey Wolf Optimizer (GWO) was introduced by S. Mirjalilli in year 2014 (Mirjalili et al., 2014). This algorithm is a simulation of unique hunting and searching prey characteristics of grey wolves. GWO has assumed the four level social hierarchy of grey wolves involving α at first, β at second, δ at third and ω wolves at last level. α wolves are the leader wolves managing and conducting the whole pack of grey wolves. It is also responsible for controlling the whole hunting process, taking all types of decisions like hunting, maintaining discipline, sleeping and waking time for whole pack. β wolf which is the best candidate to be the α ⁠, takes feedback from other wolves and give it to the α leader. The third level of grey wolves, i.e. δ wolves, dominate the wolves of forth and the last level called the ω wolves which are responsible for maintaining the safety and integrity in the wolf pack (Mirjalili et al., 2014). The distances from α ⁠, β and δ wolves i.e. Dα ⁠, Dβ and Dδ to each of the remaining wolf (X→) are calculated using Eq. (1) using which the effect of α ⁠, β and δ wolves on the prey viz. X1→ ⁠, X2→ and X3→ can be calculated as represented in Eq. (2). Dα→=C1→·Xα→-X→,Dβ→=C2→·Xβ→-X→,Dδ→=C3→·Xδ→-X→(1) X1→=Xα→-A1→·Dα→,X2→=Xβ→-A2→·Dβ→,X3→=Xδ→-A3→·Dδ→(2) A→=2a→·r1→-a→,C→=2·r2→(3) X→(t+1)=X1→+X2→+X3→3(4) The values of controlling parameters of the algorithm which are a ⁠, A and C are calculated using Eq. (3). Here, r1→ and r2→ are the random vectors in the range of [0,1] ⁠. These vectors make wolves able to reach at any point between the prey and the wolf. Vector a→ is involved in controlling activity of the GWO algorithm and used in calculating A→ ⁠. The component values of a→ vector decreases linearly from 2 to 0 over the courses of iterations (Mirjalili et al., 2014). C→ helps in putting some extra weight on the prey to make it difficult for the wolves to find it. Thus at last, all other wolves update their positions X→(t+1) using Eq. (4). In spite of being new comer, GWO is being used in many real world applications such as a modified version of GWO algorithm was proposed and applied successfully for training q-gaussian radial basis functional link nets (Muangkote et al., 2014), A modified GWO algorithm named multi-verse optimizer (MVO) for solving various optimization problems was proposed (Mirjalili et al., 2016), The binary version of GWO algorithm was proposed to be used for feature selection which was one of the important and crucial modification of GWO algorithm (Emary et al., 2016), A multi-objective GWO was modeled to minimize the gases emission level of CO2 by the capacitor in which 30-bus system was used for the evaluation of the proposed method (Mohamed et al., 2015), GWO algorithm was used to optimize the controlling parameters of DC motor (Madadi and Motlagh, 2014), The flowshop scheduling problem of stage 2 was solved along with the optimization of its release time by using GWO algorithm (Komaki and Kayvanfar, 2015). 3. Chaotic grey wolf optimization algorithm In spite of having good convergence rate, GWO still cannot always perform that well in finding global optima which affect the convergence rate of the algorithm. So, to reduce this affect and improve its efficiency, CGWO algorithm is developed by introducing chaos in GWO algorithm itself. In general terms, chaos is a deterministic, random- like method found in non-linear, dynamical system, which is non-period, non-converging and bounded. Mathematically, chaos is randomness of a simple deterministic dynamical system and chaotic system may be considered as sources of randomness. In order to introduce chaos in optimization algorithms, different chaotic maps having different mathematical equations are used. Since last decade, chaotic maps have been widely appreciated in the field of optimization due to their dynamic behavior which help optimization algorithms in exploring the search space more dynamically and globally. At a recent time, in accordance with different human's realm a wide variety of chaotic maps designed by physicians, researchers and mathematicians are available in the optimization field (Table 1. Table 1 Details of chaotic maps applied on CGWO. S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 a Using a = 0.5 and b = 0.2, it generates chaotic sequence in (0, 1), a = control parameter, xk = chaotic number at iteration ‘k’. Open in new tab Table 1 Details of chaotic maps applied on CGWO. S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 a Using a = 0.5 and b = 0.2, it generates chaotic sequence in (0, 1), a = control parameter, xk = chaotic number at iteration ‘k’. Open in new tab In these chaotic maps, any number in the range [0, 1] (or according to the range of chaotic map) can be chosen as the initial value. However, it should be noted that the initial value may have significant impacts on the fluctuation pattern of some of the chaotic maps. This set of chaotic maps has been chosen with different behaviors, while the initial value is 0.7 for all (Fig. 2. Fig. 1. Open in new tabDownload slide Flowchart of optimization procedure of CGWO. Fig. 2. Open in new tabDownload slide Pseudocode of proposed CGWO algorithm. The optimization procedure of the proposed CGWO algorithm is also presented in the form of flow chart given in Fig. 1. In this, first step involves the stochastic initialization of population of grey wolves. Then, a chaotic map is chosen to be mapped with the algorithm along with the initialization of its first chaotic number and a variable (Gandomi and Yang, 2014). Sequentially, the parameters of the CGWO algorithm involved in conducting the exploration - exploitation mechanism viz. a, A and C are initialized which are same as in GWO. Fitness of all grey wolves initialized in the search space are evaluated using various standard benchmark functions and are sorted according to their fitness. The first wolf got after sorting is assumed to be α wolf and accordingly second and third wolves are assumed as β and δ wolf respectively. Sequentially, the fitter wolf will keep updating its position using Eq. (4) and may get the position of α wolf as optimal solution. The parameters' values are also updated along with the course of iterations using Eq. (3). At the end of the last iteration, fitness of α wolf will be considered as the most optimal solution of the problem found by the CGWO algorithm. 4. CGWO for constrained benchmark functions All the constrained problems are formulated in the form of two functions i.e. objective function and constraint violation function (Powell, 1978). Objective function is the function whose main aim is to find the optimal solution say ‘x’ in the specified search space. It can be represented as Eq. (5). minimizef(x),x=(x1,x2,x3,…,xn)∈Rn(5) where n is the number of dimensions contained in a solution. x∈F∈S where F is the feasible region in the search space S which defines a n-dimensional rectangle R (Karaboga and Basturk, 2007). This rectangle R has domains size in the form of lower bound (lb) and upper bound (ub) as represented in Eq. (6). lb(i)⩽x(i)⩽ub(i),1⩽i⩽n(6) and the number of constraints say ‘ m(m>0) ’ are defined in the F space is the form of Eq. (7). gj(x)⩽0,forj=1,…,q,hj(x)=0,forj=q+1,…,m(7) Here, gj(x) and hj(x) are called as inequality and equality constraints respectively. If any solution say ‘x’ satisfies the constraint gk or hk in F space, then gk is considered to be an active constraint at x. 5. Experimental study and discussion 5.1. Parameter settings Among all the complex methods to calculate the penalty of constraints like iterative method, methods based on feasibility of solutions, simple penalty function method is used in all the constrained optimization problems implemented and discussed in this paper (Joines and Houck, 1994). The population size of grey wolves is taken 30 and 100 iterations are performed for the results of all the constrained benchmark functions. 30 Monte Carlo runs are executed on each of the constrained benchmark functions. For effective validation of the proposed CGWO algorithm in case of constrained benchmark functions, it has been compared with some other optimization algorithms which are GWO (Mirjalili et al., 2014), FA (Yang and algorithm, 2010), FPA (Yang, 2012) and PSO (Kennedy, 2011). Additionally, parameter settings for all these algorithms need to be done for impartial comparison which is one of the difficult task to perform during the execution. The parameter settings done in this work is like for PSO, global learning (no local neighborhoods), an inertial constant = 0.3, a cognitive constant = 1 and a social constant for swarm interaction = 1 is used. For FPA, λ=1.5 for Levy distribution function and proximity probability p = 0.8 is used. For FA, randomization parameter α = 0.6, attractiveness β0 = 1 and absorption coefficient γ=1.0 is used. For GWO, two random vectors r1→ and r2→ are taken in the range of (0, 1) and the controlling parameter a→ has linearly decreasing values from 2 to 0 over the course of iterations. For CGWO, values of two vectors r1→ and r2→ are taken randomly in the range of (0, 1) and the controlling parameter a→ has linearly decreasing values from 2 to 0 over the course of iterations, chaotic function variables a = 0.5 and b = 0.2 is used. Additionally, the best map performer for constrained optimization problems as per results, i.e., chebyshev map has been used on CGWO constraint handling mechanism. CGWO is implemented in C++ and compiled using Qt Creator 2.4.1 (MinGW) under Microsoft Windows 7 operating system. All simulations are carried out on a computer with an Intel(R) Core(TM) i5-3210@2.50 Ghz CPU. 5.2. Results and discussion In order to evaluate the capability of proposed CGWO for handling constrained problems, a set of thirteen widely used constrained benchmark functions have been used (Table 1. All the problems consist in the set have various linear, non-linear and quadratic equations in the form of equality and inequality constraints which are represented in Table 2. To choose the best possible map for all the constrained optimization problems, all the selected maps are applied on all the constrained benchmark functions whose results are provided in Table 3. According to the results, chebyshev map showed promising performance by outperforming seven out of thirteen constrained benchmark functions and thus is chosen for further investigation of CGWO on constrained optimization problems. Table 2 Details of constrained benchmark functions. Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Open in new tab Table 2 Details of constrained benchmark functions. Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Open in new tab Table 3 Results of 10 chaotic maps on all constrained benchmark functions on CGWO. Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Open in new tab Table 3 Results of 10 chaotic maps on all constrained benchmark functions on CGWO. Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Open in new tab The results of all the constrained benchmark functions applied on CGWO and other algorithms are given in Table 4. It can be easily seen from the results that CGWO has handle seven out of thirteen constrained benchmark functions very efficiently and thus has outperformed all other algorithms for these seven benchmark functions. PSO handled three constrained functions well among all. GWO performed superior to other for two constrained functions. FA outperformed for only one constrained function. The reason was that the chaotic maps with a unimode centered around the middle of maps, tend to produce better results, and chebyshev map fall into this category and it is indeed very effective. The results have also revealed the significant improvement of the proposed CGWO algorithm with the application of deterministic chaotic signals in place of constant value. Table 4 Comparison results of all constrained benchmark functions. Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Open in new tab Table 4 Comparison results of all constrained benchmark functions. Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Open in new tab 5.3. Graphical analysis For further effective evaluation of performance of all the algorithms, graphical analysis has also been done. The line graphs of convergence of various constrained benchmark functions using CGWO algorithm and other algorithms viz. GWO, FA, FPA and PSO have been shown from Figs. 3–6 which help to analyze the convergence rate of each algorithm more effectively. All these graphs have been taken on 100 iterations to clearly notice and analyze the convergence of all the algorithms. Fig. 3. Open in new tabDownload slide Comparison of five optimization algorithms for the G1 constrained benchmark function in 100 iterations. Fig. 4. Open in new tabDownload slide Comparison of five optimization algorithms for the G2 constrained benchmark function in 100 iterations. Fig. 5. Open in new tabDownload slide Comparison of five optimization algorithms for the G9 constrained benchmark function in 100 iterations. Fig. 6. Open in new tabDownload slide Comparison of five optimization algorithms for the G13 constrained benchmark function in 100 iterations. Fig. 3 shows the line graphs of convergence of all the five optimization algorithms applied on G1 test constrained benchmark function. From the graph, it can be seen that CGWO has the best performance for this benchmark function. It is showing superior performance of CGWO by reaching the optima for this test function within 10 iterations only. Further, it can be concluded from this graph that the GWO and FPA performed well when compared with other algorithms. FA demonstrates poor convergence in most of the optimization process, however it eventually ends the value of PSO. Fig. 4 demonstrates the line graphs of convergence of CGWO along with all other algorithms for G2 constrained benchmark function in which it is easily remarkable that CGWO is fastest of all in context of convergence towards the optima than that of FA, FPA, GWO and PSO. PSO has shown a competitive performance to CGWO for this problem and exhibits significant performance in terms of convergence speed. GWO, FA and FPA show a faster convergence rate initially, however they seem to be trapped into sub-optimal values as the optimization procedure proceeds. Fig. 5 illustrates the convergence rate on G9 test constrained benchmark function in which CGWO is demonstrating the high rate of convergence as compared to FPA, FA, PSO and GWO. The convergence line graph of FPA and PSO is showing slow convergence by giving constant fitness values for many iterations in between the 100 iterations as they seem to be trapped into sub-optimal values as the procedure proceeds, especially FPA. This demonstrates how CGWO is capable of balancing exploration and exploitation to find the global optimum rapidly and effectively. Fig. 6 is presenting the graphical view of convergence of all algorithms on G13 test constrained benchmark function in which it can be clearly seen that CGWO algorithm is nearest to the global optima of this constrained problem among algorithms viz. FPA, FA, GWO and PSO and it also shows fastest convergence of all. GWO and FPA illustrate poorer convergence than the other algorithms in the initial iterations. However, the search process is progressively accelerated during iterations for these algorithms. This indicates that the performance of CGWO can be boosted by the chaotic maps in terms of not only exploration but also exploitation. 5.4. Statistical testing Statistical testing is a process of making quantitative decisions about a problem in which statistical data set is evaluated and taken which is then compared hypothetically (Wilcoxon et al., 1970). The statistical testing of the constrained benchmark functions applied on all algorithms involved in this paper has been done using a widely used non parametric test named Wilcoxon signed rank-test discussed in Section 5.4.1. 5.4.1. Wilcoxon signed rank-test Wilcoxon signed rank-test is a statistical method which is solely based on the order of the sample's observations (Table 5 and their rank summary is provided in Table 6. Results depict that CGWO possessed lowest rank among all other optimization algorithms used in comparison for most of the benchmark functions which proves the superior performance of CGWO among other in comparison. However, PSO and GWO competed with CGWO closely and ranked second and third respectively. The superior performance of CGWO doesn't mean that it is superior than all other optimization algorithms present in the literature which will also lead to the violation of ‘free lunch theorem’ (Ho and Pepyne, 2002). Its performance simply signifies that it is better than other algorithms taken in this work only. Table 5 Pair-wise Wilcoxon signed rank test results. Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Open in new tab Table 5 Pair-wise Wilcoxon signed rank test results. Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Open in new tab Table 6 Rank summary of statistical assessment results. Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Open in new tab Table 6 Rank summary of statistical assessment results. Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Open in new tab 6. CGWO for classical engineering design problems Engineering design is a process of satisfying the needs involved in building a product. It is a decision making process which consists of a complex objective function and a large number of decision variables such as weight, strength and wear (Askarzadeh, 2016). Meta-heuristic methods come into being as an alternative to the traditional optimization methods. With their merits of finding acceptable solutions in an affordable time and being tolerant of non-convex and non-differentiable, meta-heuristic algorithms have attracted great research interest during the recent years. In real design problems the number of design variables can be very large, and their influence on the objective function to be optimized can be very complicated, with a nonlinear character. Therefore, in this paper, design considerations of five classical engineering design problems viz. spring design problem, gear train design problem, welded beam design problem, pressure vessel design problem and closed coil helical spring design problem have been done in Sections 6.1, 6.2, 6.3, 6.4, 6.5. These problems contain various local optima, whereas only global optimum is required. These problems cannot be handled by traditional methods which focus on local optima only. Hence, there is a need for effective and efficient optimization methods for these engineering design problems. In this section, various experiments on these benchmark problems are implemented to verify the performance of the proposed meta-heuristic CGWO method. In order to get an unbiased comparison of CPU times, all the experiments are performed over 30 independent runs for 500 iterations. 6.1. Tension/Compression spring design problem The main goal of this engineering design problem is to minimize the weight of the spring involving three decision variables which are wire diameter (d) ⁠, mean coil diameter (D) and number of active coils (N) ⁠. This problem is subjected to three inequality constraints and an objective function given in Eq. (8). Considerx→=[x1x2x3]=[dDN],Minimizef(x→)=(x3+2)x2x12,Subject tog1(x→)=1-x23x3717,854x14⩽0,g2(x→)=4x22-x1x212,566(x2x13-x14)+15108x12⩽0,g3(x→)=1-140.45x1x22x3⩽0,g4(x→)=x1+x21.5-1⩽0,(8) Variable range0.05⩽x1⩽2.00,0.25⩽x2⩽1.30,2.00⩽x3⩽15.00 Fig. 7. Fig. 7. Open in new tabDownload slide Structure of tension/spring design (Rao et al., 2011). 6.2. Gear train design problem The goal of this engineering design problem is to minimize the cost of gear ratio of the gear train whose schematic diagram is shown in Fig. 8. This problem has no equality or inequality constraints except a boundary constraint. It consists of four decision variables represented as nA(x1) ⁠, nB(x2) ⁠, nD(x3) ⁠, nF(x4) using which the gear ratio can be formulated as nBnD/nFnA ⁠. Mathematical formulation of the objective function of gear train design problem along with its boundary constraint is given in Eq. (9). Min.f(x)=((1/6.931)-(x3x2/x1x4))2S.t.12⩽xi⩽60(9) Fig. 8. Open in new tabDownload slide Structure of gear train design (Rao et al., 2011). Table 9 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 10 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm with conventional GWO algorithm and with those found by other optimization algorithms. In terms of optimal results, CGWO outperforms GWO (Mirjalili et al., 2014), CSA (Askarzadeh, 2016), UPSO (Parsopoulos and Vrahatis, 2005), ABC (Akay and Karaboga, 2012) and MBA (Sadollah et al., 2013). In terms of mean, CGWO gives better value than those obtained by all other algorithms in comparison. Table 7 Optimal solution of spring design problem by CGWO algorithm. Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Open in new tab Table 7 Optimal solution of spring design problem by CGWO algorithm. Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Open in new tab Table 8 Comparison results of spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 N.A. – Not Available. Open in new tab Table 8 Comparison results of spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 N.A. – Not Available. Open in new tab Table 9 Optimal solution of gear train design problem by CGWO algorithm. Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Open in new tab Table 9 Optimal solution of gear train design problem by CGWO algorithm. Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Open in new tab Table 10 Comparison results of gear train design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 N.A. – Not Available. Open in new tab Table 10 Comparison results of gear train design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 N.A. – Not Available. Open in new tab 6.3. Welded beam design problem Welded beam design problem which is a minimization problem has four variables namely weld thickness (h) ⁠, length of bar attached to the weld (l) ⁠, bar's height (t) ⁠, bar's thickness (b) as shown in Fig. 9. The constraints included in this problem are bending stress (⁠ θ ⁠), beam deflection (⁠ δ ⁠), shear stress (⁠ τ ⁠), buckling load (⁠ Pc ⁠) and other side constraints. The mathematical formulas related to this problem are represented in Eq. (10). Considerx→=[x1x2x3x4]=[hltb],Minimizef(x→)=1.10471x12x2+0.04811x3x4(14.0+x2),Subject tog1(x→)=τ(x→)-τmax⩽0,g2(x→)=σ(x→)-σmax⩽0,g3(x→)=δ(x→)-δmax⩽0,g4(x→)=x1-x4⩽0,g5(x→)=P-Pc(x→)⩽0,g6(x→)=0.125-x1⩽0,g7(x→)=0.10471x12+0.04811x3x4(14.0+x2)-5.0⩽0(10) Variable range0.1⩽x1⩽2,0.1⩽x2⩽10,0.1⩽x3⩽10,0.1⩽x4⩽2 where τ(x→)=τ′2+2τ′τ″x22R+τ2″ τ′=P2x1x2 τ″=MRJ M=P(L+x22) R=x224+x1+x322 J=22x1x2x224x1+x322 σ(x→)=6PLx4x32 δ(x→)=4PL3Ex32x+x4 Pc(x→)=4.013Ex32x4636L21-x32LE4G P=6000lb,L=14in.,δmax=0.25in.,E=30×106psi, G=12×106psi,τmax=13,600psi,σmax=30,000psi Fig. 9. Open in new tabDownload slide Structure of welded beam design (Rao et al., 2011). Table 11 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 12 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, conventional GWO algorithm and with those found by other optimization algorithms. In terms of best result, CGWO outperforms GWO (Mirjalili et al., 2014), GA3 (Coello, 2000), GA4 (Coello and Montes, 2002), CPSO (He and Wang, 2007), SC (Ray and Liew, 2003), UPSO (Parsopoulos and Vrahatis, 2005) and CDE (He and Wang, 2007). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 11 Optimal solution of welded beam design problem by CGWO algorithm. Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Open in new tab Table 11 Optimal solution of welded beam design problem by CGWO algorithm. Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Open in new tab Table 12 Comparison results of welded beam design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 N.A. – Not Available. Open in new tab Table 12 Comparison results of welded beam design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 N.A. – Not Available. Open in new tab 6.4. Pressure vessel design problem Pressure vessel design problem is a classical engineering design problem whose main goal is to minimize the welding, manufacturing and material cost of the pressure vessel. There are a total of four decision variables involved in this problem which are thickness of shell (Ts) ⁠, thickness of head (Th) which are discrete decision variables, inner radius (R) and length of cylindrical section of the vessel (L) which are continuous decision variables. The diagrammatical representation of pressure vessel design problem is given in Fig. 10 showing variables of pressure vessel. The mathematical equations of the nonlinear objective function and constraints is represented in Eq. (11). The mentioned problem has four inequality constraints. Considerx→=[x1x2x3x4]=[TsThRL],Minimizef(x→)=0.6224x1x3x4+1.7781x2x32+3.1661x12x4+19.84x12x3,Subject tog1(x→)=-x1+0.0193x3⩽0,g2(x→)=-x2+0.00954x3⩽0,g3(x→)=-πx32x4-43πx33+1,296,000⩽0,g4(x→)=x4-240⩽0Variable range0⩽x1⩽100,0⩽x2⩽100,10⩽x3⩽200,10⩽x4⩽200(11) Fig. 10. Open in new tabDownload slide Structure of pressure vessel design problem (Rao et al., 2011). Table 13 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 14 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, conventional GWO algorithm and with those found by other optimization algorithms. It can be said from results that CGWO outperforms GWO (Mirjalili et al., 2014), (μ+λ)-ES (Mezura-Montes and Coello, 2005), CSA (Askarzadeh, 2016),HPSO (He and Wang, 2007), PSO-DE (Liu et al., 2010), ABC (Akay and Karaboga, 2012), TLBO (Rao et al., 2011), G-QPSO (dos Santos Coelho, 2010), QPSO (dos Santos Coelho, 2010), CDE (He and Wang, 2007), GA4 (Coello and Montes, 2002), CPSO (He and Wang, 2007), UPSO (Parsopoulos and Vrahatis, 2005), GA3 (Coello, 2000) and PSO (dos Santos Coelho, 2010). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 13 Optimal solution of pressure vessel design problem by CGWO algorithm. Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Open in new tab Table 13 Optimal solution of pressure vessel design problem by CGWO algorithm. Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Open in new tab Table 14 Comparison results of pressure vessel design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 N.A. – Not Available. Open in new tab Table 14 Comparison results of pressure vessel design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 N.A. – Not Available. Open in new tab 6.5. Closed coil helical spring design problem The main goal of this mechanical engineering design constrained problem is to minimize the volume of closed coil helical spring. Helical spring is made up of closed coiled wire having the shape of a helix and is intended for tensile and compressive load (Fig. 11, it can be seen that the coils of spring are so closed that the plane is at nearly right angles to the helix's axis and coil is subjected to torsion. This problem has a total of two decision variables namely coil diameter (D) and wire diameter (d) whose range is given in Eq. (13). The volume of the helical spring (U) can be minimized using the objective function given in Eq. (12). U=π24(Nc+2)Dd2(12) where 0.508⩽d⩽1.016,1.270⩽D⩽7.620,15⩽Nc⩽25(13) Fig. 11. Open in new tabDownload slide Structure of closed coil helical spring design problem (Savsani et al., 2010). The constrained problem of helical spring is subjected to eight constraints out of which first is stress constraint represented in Eq. (14) and second is configuration constraint given in Eq. (15). S-8CfFmaxDπd3⩾0,Cf=4C-14C-4+0.615C,C=Dd(14) Here, Fmax which is the maximum load and S, the shear stress allowed on the spring are set to 453.6 kg and 13288.02kgf/cm2 respectively. K=Gd48NcD3(15) where G is set to 808543.6kgf/cm2 and K is the spring constant. Next constraint is the length constraint expressed as given in Eq. (18) in which the maximum length lmax is equal to 35.56 cm. lf is the free length which can be calculated using Eq. (17). The deflection (δl) made in the spring due to maximum work load is also involved in calculating free length as given in Eq. (16). δl=FmaxK(16) lf=δl+1.05(Nc+2)d(17) lmax-lf⩾0(18) The wire diameter should also follow the constraint represented in Eq. (19) where dmin is set to 0.508 cm. d-dmin⩾0(19) The outer diameter of the coil (D) spring should also be less the maximum diameter specified (Dmax) which is 7.62 cm. Mathematically, it is expressed in Eq. (20). Dmax-(D+d)⩾0(20) The mean coil diameter (C) must also be at least three times the diameter of the wire as represented mathematically in Eq. (21). C-3⩾0(21) Next, the deflection occurs under the preload δp must also be less than its specified value δpm which is 15.24 cm as represented in Eq. (22). The preload deflection can be calculated using Eq. (23). δpm-δp⩾0(22) δp=FpK(23) Here, Fp is set to 136.08 kg. The combined deflection constraint is given in Eq. (24) which makes the deflection of the coil consistent with its length. lf-δp=Fmax-FpK-1.05(Nc+2)d⩾0(24) The next and the last constraint is subjected to the preload deflection of the spring which defines that it must be equal to its specified value (δω) which is equal to 3.175 cm. It is expressed in Eq. (25). Fmax-FpK-δω⩽0(25) Table 15 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 16 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, GWO algorithm and with those found by other optimization algorithms. It can be said from results that CGWO outperforms GWO (Mirjalili et al., 2014), DTLBO (Thamaraikannan and Thirunavukkarasu, 2014), TLBO (Rao et al., 2011), Conventional method (Hinze, 2005), PSO (He et al., 2004), ABS (Thamaraikannan and Thirunavukkarasu, 2014) and GA (Das and Pratihar, 2002). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 15 Optimal solution of closed coil helical spring design problem by CGWO algorithm. Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Open in new tab Table 15 Optimal solution of closed coil helical spring design problem by CGWO algorithm. Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Open in new tab Table 16 Comparison results of closed coil helical spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. N.A. – Not Available. Open in new tab Table 16 Comparison results of closed coil helical spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. N.A. – Not Available. Open in new tab 7. Conclusion and future scope The chaos theory and Grey Wolf Optimizer (GWO) are hybridized in order to design an improved meta-heuristic Chaotic Grey Wolf Optimization (CGWO) algorithm for constrained optimization problems. Various chaotic maps are used to regulate the key parameter, a, of GWO. The proposed CGWO is validated on thirteen constrained benchmark functions and five constrained engineering design problems. The chebyshev map is selected as its a through comparing various chaotic GWO variants to form the best CGWO. The simulations showed that the usage of deterministic chaotic signals instead of linearly decreasing values is an important modification of the GWO algorithm. Statistical results and success rates of the CGWO suggest that the tuned GWO clearly improves the reliability of the global optimality and they also enhanced the quality of the results. In comparison with other algorithms viz. FA, FPA, GWO and PSO, it seems the CGWO performed significantly well. The results of CGWO on constrained engineering problems showed its applicability for the complex real-world problems. The main reason of the superior performance of CGWO lies behind the chaos induced by the chaotic maps in the search space. This chaos helps the controlling parameter to find the optimal solution more quickly and thus refine the convergence rate of the algorithm. So, it can be easily concluded here that proposed CGWO can handle constrained problems effectively and efficiently. Further investigation on convergence analysis may prove fruitful. In addition, further topics of studies can also focus on the extension of the CGWO to solve mixed-type problems and discrete optimization problems. References Akay , B. , & Karaboga , D. ( 2012 ). Artificial bee colony algorithm for large-scale problems and engineering design optimization . Journal of Intelligent Manufacturing , 23 ( 4 ), 1001 – 1014 . Google Scholar Crossref Search ADS WorldCat Alavi , A. H. , & Gandomi , A. H. ( 2011 ). Prediction of principal ground-motion parameters using a hybrid method coupling artificial neural networks and simulated annealing . Computers & Structures , 89 ( 23 ), 2176 – 2194 . Google Scholar Crossref Search ADS WorldCat Alavi , A. H. , Gandomi , A. H., Sahab , M. G., & Gandomi , M. ( 2010 ). Multi expression programming: A new approach to formulation of soil classification . Engineering with Computers , 26 ( 2 ), 111 – 118 . Google Scholar Crossref Search ADS WorldCat Arora , S. , & Singh , S. ( 2013 ). A conceptual comparison of firefly algorithm, bat algorithm and cuckoo search. In International conference on control computing communication & materials (ICCCCM), 2013 (pp. 1 – 4 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Arora , S. , & Singh , S. ( 2015 ). Butterfly algorithm with levy flights for global optimization. In 2015 International conference on signal processing, computing and control (ISPCC) (pp. 220 – 224 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Arora , S. , & Singh , S. ( 2017 ). An improved butterfly optimization algorithm with chaos . Journal of Intelligent & Fuzzy Systems , 32 ( 1 ), 1079 – 1088 . Google Scholar Crossref Search ADS WorldCat Askarzadeh , A. ( 2016 ). A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm . Computers & Structures , 169 , 1 – 12 . Google Scholar Crossref Search ADS WorldCat Cagnina , L. C. , Esquivel , S. C., & Coello , C. A. C. ( 2008 ). Solving engineering optimization problems with the simple constrained particle swarm optimizer . Informatica , 32 ( 3 ). Coello , C. A. C. ( 2000 ). Use of a self-adaptive penalty approach for engineering optimization problems . Computers in Industry , 41 ( 2 ), 113 – 127 . Google Scholar Crossref Search ADS WorldCat Coello , C. A. C. , & Montes , E. M. ( 2002 ). Constraint-handling in genetic algorithms through the use of dominance-based tournament selection . Advanced Engineering Informatics , 16 ( 3 ), 193 – 203 . Google Scholar Crossref Search ADS WorldCat Das , A. , & Pratihar , D. ( 2002 ). Optimal design of machine elements using a genetic algorithm . Journal of the Institution of Engineers (India), Part MC, Mechanical Engineering Division , 83 ( 3 ), 97 – 104 . Google Scholar OpenURL Placeholder Text WorldCat Deb , K. ( 2000 ). An efficient constraint handling method for genetic algorithms . Computer Methods in Applied Mechanics and Engineering , 186 ( 2 ), 311 – 338 . Google Scholar Crossref Search ADS WorldCat dos Santos Coelho , L. ( 2010 ). Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems . Expert Systems with Applications , 37 ( 2 ), 1676 – 1683 . Google Scholar Crossref Search ADS WorldCat Emary , E. , Zawbaa , H. M., & Hassanien , A. E. ( 2016 ). Binary grey wolf optimization approaches for feature selection . Neurocomputing , 172 , 371 – 381 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , & Yang , X. -S. ( 2014 ). Chaotic bat algorithm . Journal of Computational Science , 5 ( 2 ), 224 – 232 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. , Yang , X. -S., Talatahari , S., & Alavi , A. ( 2013 ). Firefly algorithm with chaos . Communications in Nonlinear Science and Numerical Simulation , 18 ( 1 ), 89 – 98 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , Yun , G. J., Yang , X. -S., & Talatahari , S. ( 2013 ). Chaos-enhanced accelerated particle swarm optimization . Communications in Nonlinear Science and Numerical Simulation , 18 ( 2 ), 327 – 340 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , Yang , X. -S., Alavi , A. H., & Talatahari , S. ( 2013 ). Bat algorithm for constrained optimization tasks . Neural Computing and Applications , 22 ( 6 ), 1239 – 1255 . Google Scholar Crossref Search ADS WorldCat Gao , X. -Z. , Jokinen , T., Wang , X., Ovaska , S. J., & Arkkio , A. ( 2010 ). A new harmony search method in optimal wind generator design. In 2010 XIX international conference on electrical machines (ICEM) (pp. 1 – 6 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Gao , X. -Z. , Wang , X., Ovaska , S. J., & Xu , H. ( 2010 ). A modified harmony search method in constrained optimization . International Journal of Innovative Computing, Information and Control , 6 ( 9 ), 4235 – 4247 . Google Scholar OpenURL Placeholder Text WorldCat Han , X. , & Chang , X. ( 2012 ). A chaotic digital secure communication based on a modified gravitational search algorithm filter . Information Sciences , 208 , 14 – 27 . Google Scholar Crossref Search ADS WorldCat Han , X. , & Chang , X. ( 2013 ). An intelligent noise reduction method for chaotic signals based on genetic algorithms and lifting wavelet transforms . Information Sciences , 218 , 103 – 118 . Google Scholar Crossref Search ADS WorldCat He , Q. , & Wang , L. ( 2007 ). An effective co-evolutionary particle swarm optimization for constrained engineering design problems . Engineering Applications of Artificial Intelligence , 20 ( 1 ), 89 – 99 . Google Scholar Crossref Search ADS WorldCat He , Q. , & Wang , L. ( 2007 ). A hybrid particle swarm optimization with a feasibility-based rule for constrained optimization . Applied Mathematics and Computation , 186 ( 2 ), 1407 – 1422 . Google Scholar Crossref Search ADS WorldCat He , D. , He , C., Jiang , L. -G., Zhu , H. -W., & Hu , G. -R. ( 2001 ). Chaotic characteristics of a one-dimensional iterative map with infinite collapses . IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications , 48 ( 7 ), 900 – 906 . Google Scholar Crossref Search ADS WorldCat He , S. , Prempain , E., & Wu , Q. ( 2004 ). An improved particle swarm optimizer for mechanical design optimization problems . Engineering Optimization , 36 ( 5 ), 585 – 605 . Google Scholar Crossref Search ADS WorldCat Herskovits , J. ( 1986 ). A two-stage feasible directions algorithm for nonlinear constrained optimization . Mathematical Programming , 36 ( 1 ), 19 – 38 . Google Scholar Crossref Search ADS WorldCat Hinze , M. ( 2005 ). A variational discretization concept in control constrained optimization: The linear-quadratic case . Computational Optimization and Applications , 30 ( 1 ), 45 – 61 . Google Scholar Crossref Search ADS WorldCat Ho , Y. -C. , & Pepyne , D. L. ( 2002 ). Simple explanation of the no free lunch theorem of optimization . Cybernetics and Systems Analysis , 38 ( 2 ), 292 – 298 . Google Scholar Crossref Search ADS WorldCat Homaifar , A. , Qi , C. X., & Lai , S. H. ( 1994 ). Constrained optimization via genetic algorithms . Simulation , 62 ( 4 ), 242 – 253 . Google Scholar Crossref Search ADS WorldCat Jia , D. , Zheng , G., & Khan , M. K. ( 2011 ). An effective memetic differential evolution algorithm based on chaotic local search . Information Sciences , 181 ( 15 ), 3175 – 3187 . Google Scholar Crossref Search ADS WorldCat Joines , J. A. , & Houck , C. R. ( 1994 ). On the use of non-stationary penalty functions to solve nonlinear constrained optimization problems with GA's. In Proceedings of the first IEEE conference on evolutionary computation, 1994. IEEE world congress on computational intelligence . (pp. 579 – 584 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Karaboga , D. , & Akay , B. ( 2011 ). A modified artificial bee colony (abc) algorithm for constrained optimization problems . Applied Soft Computing , 11 ( 3 ), 3021 – 3031 . Google Scholar Crossref Search ADS WorldCat Karaboga , D. , & Basturk , B. ( 2007 ). Artificial bee colony (abc) optimization algorithm for solving constrained optimization problems. In Foundations of fuzzy logic and soft computing (pp. 789 – 798 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Kennedy , J. ( 2011 ). Particle swarm optimization. In Encyclopedia of machine learning (pp. 760 – 766 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Komaki , G. M. , & Kayvanfar , V. ( 2015 ). Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time . Journal of Computational Science , 8 , 109 – 120 . Google Scholar Crossref Search ADS WorldCat Lee , K. S. , & Geem , Z. W. ( 2004 ). A new structural optimization method based on the harmony search algorithm . Computers & Structures , 82 ( 9 ), 781 – 798 . Google Scholar Crossref Search ADS WorldCat Liu , H. , Cai , Z., & Wang , Y. ( 2010 ). Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization . Applied Soft Computing , 10 ( 2 ), 629 – 640 . Google Scholar Crossref Search ADS WorldCat Luo , Q. , Ma , M., & Zhou , Y. ( 2016 ). A novel animal migration algorithm for global numerical optimization . Computer Science and Information Systems , 13 ( 1 ), 259 – 285 . Google Scholar Crossref Search ADS WorldCat Madadi , A. , & Motlagh , M. M. ( 2014 ). Optimal control of dc motor using grey wolf optimizer algorithm . TJEAS Journal , 4 ( 4 ), 373 – 379 . Google Scholar OpenURL Placeholder Text WorldCat Meng , X. -B. , Gao , X., Lu , L., Liu , Y., & Zhang , H. ( 2015 ). A new bio-inspired optimisation algorithm: Bird swarm algorithm . Journal of Experimental & Theoretical Artificial Intelligence , 1 – 15 . Meng , Z. , Pan , J. -S., & Alelaiwi , A. ( 2016 ). A new meta-heuristic ebb-tide-fish-inspired algorithm for traffic navigation . Telecommunication Systems , 62 ( 2 ), 403 – 415 . Google Scholar Crossref Search ADS WorldCat Mezura-Montes , E. , & Coello , C. A. C. ( 2005 ). A simple multimembered evolution strategy to solve constrained optimization problems . IEEE Transactions on Evolutionary Computation , 9 ( 1 ), 1 – 17 . Google Scholar Crossref Search ADS WorldCat Mezura-Montes , E. , & Coello , C. A. C. ( 2005 ). Useful infeasible solutions in engineering optimization with evolutionary algorithms. In MICAI 2005: Advances in artificial intelligence (pp. 652 – 662 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Mirjalili , S. , & Lewis , A. ( 2016 ). The whale optimization algorithm . Advances in Engineering Software , 95 , 51 – 67 . Google Scholar Crossref Search ADS WorldCat Mirjalili , S. , Mirjalili , S. M., & Lewis , A. ( 2014 ). Grey wolf optimizer . Advances in Engineering Software , 69 , 46 – 61 . Google Scholar Crossref Search ADS WorldCat Mirjalili , S. , Mirjalili , S. M., & Hatamlou , A. ( 2016 ). Multi-verse optimizer: A nature-inspired algorithm for global optimization . Neural Computing and Applications , 27 ( 2 ), 495 – 513 . Google Scholar Crossref Search ADS WorldCat Mohamed , A. -A. A. , El-Gaafary , A. A., Mohamed , Y. S., & Hemeida , A. M. ( 2015 ). Energy management with capacitor placement for economics low carbon emissions using modified multi-objective grey wolf optimizer . Muangkote , N. , Sunat , K., & Chiewchanwattana , S. ( 2014 ). An improved grey wolf optimizer for training q-Gaussian radial basis functional-link nets. In 2014 International computer science and engineering conference (ICSEC) (pp. 209 – 214 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Parsopoulos , K. E. , & Vrahatis , M. N. ( 2002 ). Particle swarm optimization method for constrained optimization problems . Intelligent Technologies–Theory and Application: New Trends in Intelligent Technologies , 76 , 214 – 220 . Google Scholar OpenURL Placeholder Text WorldCat Parsopoulos , K. E. , & Vrahatis , M. N. ( 2005 ). Unified particle swarm optimization for solving constrained engineering optimization problems. In Advances in natural computation (pp. 582 – 591 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Pecora , L. M. , & Carroll , T. L. ( 1990 ). Synchronization in chaotic systems . Physical Review Letters , 64 ( 8 ), 821 . Google Scholar Crossref Search ADS PubMed WorldCat Powell , M. J. ( 1978 ). A fast algorithm for nonlinearly constrained optimization calculations. In Numerical analysis (pp. 144 – 157 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Rao , R. ( 2016 ). Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems . International Journal of Industrial Engineering Computations , 7 ( 1 ), 19 – 34 . Google Scholar OpenURL Placeholder Text WorldCat Rao , R. V. , Savsani , V. J., & Vakharia , D. ( 2011 ). Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems . Computer-Aided Design , 43 ( 3 ), 303 – 315 . Google Scholar Crossref Search ADS WorldCat Ray , T. , & Liew , K. M. ( 2003 ). Society and civilization: An optimization algorithm based on the simulation of social behavior . IEEE Transactions on Evolutionary Computation , 7 ( 4 ), 386 – 396 . Google Scholar Crossref Search ADS WorldCat Sadollah , A. , Bahreininejad , A., Eskandar , H., & Hamdi , M. ( 2013 ). Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems . Applied Soft Computing , 13 ( 5 ), 2592 – 2612 . Google Scholar Crossref Search ADS WorldCat Saremi , S. , Mirjalili , S., & Lewis , A. ( 2014 ). Biogeography-based optimisation with chaos . Neural Computing and Applications , 25 ( 5 ), 1077 – 1097 . Google Scholar Crossref Search ADS WorldCat Savsani , V. , Rao , R., & Vakharia , D. ( 2010 ). Optimal weight design of a gear train using particle swarm optimization and simulated annealing algorithms . Mechanism and Machine Theory , 45 ( 3 ), 531 – 541 . Google Scholar Crossref Search ADS WorldCat Shi , Y. ( 2015 ). An optimization algorithm based on brainstorming process . Emerging Research on Swarm Intelligence and Algorithm Optimization , 1 – 35 . Shi , Y. , & Eberhart , R. ( 1998 ). A modified particle swarm optimizer. In The 1998 IEEE international conference on evolutionary computation proceedings, 1998. IEEE world congress on computational intelligence (pp. 69 – 73 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Talatahari , S. , Azar , B. F., Sheikholeslami , R., & Gandomi , A. ( 2012 ). Imperialist competitive algorithm combined with chaos for global optimization . Communications in Nonlinear Science and Numerical Simulation , 17 ( 3 ), 1312 – 1319 . Google Scholar Crossref Search ADS WorldCat Thamaraikannan , B. , & Thirunavukkarasu , V. ( 2014 ). Design optimization of mechanical components using an enhanced teaching-learning based optimization algorithm with differential operator . Mathematical Problems in Engineering , 2014. 10 pages. Wang , G. -G. , Guo , L., Gandomi , A. H., Hao , G. -S., & Wang , H. ( 2014 ). Chaotic krill herd algorithm . Information Sciences , 274 , 17 – 34 . Google Scholar Crossref Search ADS WorldCat Wang , G. -G. , Guo , L., Duan , H., & Wang , H. ( 2014 ). A new improved firefly algorithm for global numerical optimization . Journal of Computational and Theoretical Nanoscience , 11 ( 2 ), 477 – 485 . Google Scholar Crossref Search ADS WorldCat Wilcoxon , F. , Katti , S., & Wilcox , R. A. ( 1970 ). Critical values and probability levels for the Wilcoxon rank sum test and the Wilcoxon signed rank test . Selected Tables in Mathematical Statistics , 1 , 171 – 259 . Google Scholar OpenURL Placeholder Text WorldCat Yang , X. -S. ( 2012 ). Flower pollination algorithm for global optimization. In Unconventional computation and natural computation (pp. 240 – 249 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Yang , X. -S. , & algorithm , Firefly ( 2010 ). Levy flights and global optimization. In Research and development in intelligent systems XXVI (pp. 209 – 218 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Yang , D. , Li , G., & Cheng , G. ( 2007 ). On the efficiency of chaos optimization algorithms for global optimization . Chaos, Solitons & Fractals , 34 ( 4 ), 1366 – 1375 . Google Scholar Crossref Search ADS WorldCat Yang , X. -S. , Gandomi , A. H., Talatahari , S., & Alavi , A. H. ( 2012 ). Metaheuristics in water, geotechnical and transport engineering . Newnes . Footnotes Peer review under responsibility of Society for Computational Design and Engineering. Society for Computational Design and Engineering This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Society for Computational Design and Engineering http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Journal of Computational Design and Engineering Oxford University Press

Chaotic grey wolf optimization algorithm for constrained optimization problems

Chaotic grey wolf optimization algorithm for constrained optimization problems

Journal of Computational Design and Engineering , Volume 5 (4) – Oct 1, 2018

Abstract

Abstract The Grey Wolf Optimizer (GWO) algorithm is a novel meta-heuristic, inspired from the social hunting behavior of grey wolves. This paper introduces the chaos theory into the GWO algorithm with the aim of accelerating its global convergence speed. Firstly, detailed studies are carried out on thirteen standard constrained benchmark problems with ten different chaotic maps to find out the most efficient one. Then, the chaotic GWO is compared with the traditional GWO and some other popular meta-heuristics viz. Firefly Algorithm, Flower Pollination Algorithm and Particle Swarm Optimization algorithm. The performance of the CGWO algorithm is also validated using five constrained engineering design problems. The results showed that with an appropriate chaotic map, CGWO can clearly outperform standard GWO, with very good performance in comparison with other algorithms and in application to constrained optimization problems. Highlights Chaos has been introduced to the GWO to develop Chaotic GWO for global optimization. Ten chaotic maps have been investigated to tune the key parameter ‘a’, of GWO. Effectiveness of the algorithm is tested on many constrained benchmark functions. Results show CGWO's better performance over other nature-inspired optimization methods. The proposed CGWO is also used for some engineering design applications. Graphical Abstract Open in new tabDownload slide Chaotic grey wolf optimization, Firefly algorithm, Flower pollination algorithm, Particle swarm optimization algorithm 1. Introduction Constraints represent a feasible region which is nonempty and is filled with some restrictions or constraints to be followed by the solutions to solve a specific optimization problem (Karaboga and Akay, 2011). In general terms, constraints can be classified into equality constraints and inequality constraints which are represented in the form of mathematical equality and inequality equations respectively. Both types of constraints need to be satisfied by the problem's decision variables. Earlier, some deterministic methods like feasible direction approach and generalized gradient descent method were developed for solving constraint problems (Herskovits, 1986). However, due to their limited applicability and complexity of constraints, these were not effective for real world applications like structural optimization problems, economical optimization, location problems and engineering design problems like spring design, welded beam design, truss design, speed reducer design which involve many difficult equality and inequality constraints to be satisfied (Cagnina et al., 2008; Coello, 2000; Gandomi et al., 2013; Gao et al., 2010; Lee and Geem, 2004; Parsopoulos and Vrahatis, 2002). More and more meta-heuristic algorithms have been proposed to tackle these tough constrained optimization problems. These algorithms aim for tolerable velocity of convergence, a better precision, robustness, and performance. Some of the recent meta-heuristic algorithms proposed are Firefly Algorithm (FA) which is inspired by the flashing and attraction behavior of fireflies (Arora and Singh, 2013; Wang et al., 2014), Flower Pollination Algorithm (FPA) which is based on the characteristics of flowering of plants (Yang, 2012), Particle Swarm Optimization (PSO) which is inspired by the swarm behavior such as fish and bird schooling in nature (Shi and Eberhart, 1998), Bird Swarm Algorithm (BSA) which is based on the unique social interactions of bird swarms (Meng et al., 2015), ebb-tide-fish-inspired (ETFI) algorithm which is a simulation of fascinating characteristic of fish's perception of flow, sound and vibrations of tides in water (Meng et al., 2016), Jaya algorithm in which the main concept is to move the solution found so far towards the best solution and away from the worst solution (Rao, 2016), Grey Wolf Optimization (GWO) algorithm which is based on the social hunting behavior of grey wolves, Animal Migration Optimization (AMO) algorithm whose optimization process is mainly divided into two process viz. migration process and updating process with respect to animals (Luo et al., 2016), Butterfly Optimization Algorithm (BOA) which is inspired by the food foraging behavior of butterflies (Arora and Singh, 2015), Brain Storm Optimization (BSO) algorithm which is based on the simulation of brain storming process in humans (Shi, 2015), Whale Optimization Algorithm (WOA) which is inspired from the social interaction of humpback whales (Mirjalili and Lewis, 2016), Crow Search Algorithm (CSA) which mimics the clever characteristic of crows (Askarzadeh, 2016). Such meta-heuristics are being used extensively to solve complex problems like optimal wind generator design problem (Gao et al., 2010), formulation of soil classification (Alavi et al., 2010), prediction of ground soil parameters (Alavi and Gandomi, 2011). Some of the prominent metaheuristics of the literature which have already been used to tackle constrained problems are: Deb introduced a method to handle constraints using GA (Deb, 2000), Montes employed Differential Evolution (DE) Algorithm on constraint handling problems (Mezura-Montes and Coello, 2005), PSO was used by Cagnina to solve constrained optimization problems (Cagnina et al., 2008) and Karaboga used Artificial Bee Colony (ABC) algorithm to handle constraint mechanism (Karaboga and Basturk, 2007). GWO algorithm in fact, is a new meta-heuristic algorithm, inspired by the leadership behavior and unique mechanism of hunting of grey wolves. This population based meta-heuristic has the ability to avoid local optima stagnation to some extent (Yang et al., 2012). It also has good convergence ability towards the optima. In general, GWO advances itself strongly to exploitation. However, it cannot always implement global search well. Thus, in some cases, GWO fails to find global optimal solution. The search strategy used in basic GWO is mainly based on random walks. Thus, it cannot always deal with the problem successfully. With the development of the nonlinear dynamics, chaos theory has been widely used in several applications (Pecora and Carroll, 1990). In this context, one of the most famous applications is the introduction of chaos theory into the optimization methods (Yang et al., 2007). Up to now, the chaos theory has been successfully combined with several meta-heuristic optimization methods (Gandomi et al., 2013). Some major efforts in this area includes PSO (Gandomi et al., 2013), FA (Gandomi et al., 2013), BOA (Arora and Singh, 2017), GA (Han and Chang, 2013), hybridizing chaotic sequences with memetic differential evolution algorithm (Jia et al., 2011), imperialist competitive algorithm (Talatahari et al., 2012) and gravitational search algorithm (Han and Chang, 2012), Krill Herd (KH) algorithm (Wang et al., 2014), and Accelerated Particle Swarm Optimization (APSO) (Gandomi et al., 2013). In the present study, chaotic GWO (CGWO) algorithm is presented for the purpose of accelerating the convergence of GWO. Various one-dimensional chaotic maps are employed in place of the critical parameters used in GWO. Moreover, in order to examine the efficiency of the proposed CGWO in the room of constraint handling mechanism, it has been applied on some constrained benchmark functions and various classical engineering design problems viz. spring design problem, gear train design problem, welded beam design problem, pressure vessel design problem and closed coil helical spring design problem. The results of proposed CGWO on all the constrained benchmark functions have been compared with those obtained by GWO (Mirjalili et al., 2014), Firefly Algorithm (FA) (Yang and algorithm, 2010), Flower Pollination Algorithm (FPA) (Yang, 2012) and Particle Swarm Optimization (PSO) (Kennedy, 2011). On the other hand, the simulation results of all the classical engineering design problems have been compared with other state-of-art meta-heuristics discussed in the respective section. Organization of the remaining paper is as follows: Section 2: a brief introduction of GWO algorithm is given. Section 3: Description of the proposed CGWO algorithm is given in detail. Section 4: Validation of CGWO algorithm on thirteen constrained benchmark functions is performed. Section 5: Experimental study and discussion on results is done. Section 6: CGWO on various classical engineering design problems is described. Section 7: Conclusion of work along with its future scope is given. 2. Overview of grey wolf optimization algorithm Initially, Grey Wolf Optimizer (GWO) was introduced by S. Mirjalilli in year 2014 (Mirjalili et al., 2014). This algorithm is a simulation of unique hunting and searching prey characteristics of grey wolves. GWO has assumed the four level social hierarchy of grey wolves involving α at first, β at second, δ at third and ω wolves at last level. α wolves are the leader wolves managing and conducting the whole pack of grey wolves. It is also responsible for controlling the whole hunting process, taking all types of decisions like hunting, maintaining discipline, sleeping and waking time for whole pack. β wolf which is the best candidate to be the α ⁠, takes feedback from other wolves and give it to the α leader. The third level of grey wolves, i.e. δ wolves, dominate the wolves of forth and the last level called the ω wolves which are responsible for maintaining the safety and integrity in the wolf pack (Mirjalili et al., 2014). The distances from α ⁠, β and δ wolves i.e. Dα ⁠, Dβ and Dδ to each of the remaining wolf (X→) are calculated using Eq. (1) using which the effect of α ⁠, β and δ wolves on the prey viz. X1→ ⁠, X2→ and X3→ can be calculated as represented in Eq. (2). Dα→=C1→·Xα→-X→,Dβ→=C2→·Xβ→-X→,Dδ→=C3→·Xδ→-X→(1) X1→=Xα→-A1→·Dα→,X2→=Xβ→-A2→·Dβ→,X3→=Xδ→-A3→·Dδ→(2) A→=2a→·r1→-a→,C→=2·r2→(3) X→(t+1)=X1→+X2→+X3→3(4) The values of controlling parameters of the algorithm which are a ⁠, A and C are calculated using Eq. (3). Here, r1→ and r2→ are the random vectors in the range of [0,1] ⁠. These vectors make wolves able to reach at any point between the prey and the wolf. Vector a→ is involved in controlling activity of the GWO algorithm and used in calculating A→ ⁠. The component values of a→ vector decreases linearly from 2 to 0 over the courses of iterations (Mirjalili et al., 2014). C→ helps in putting some extra weight on the prey to make it difficult for the wolves to find it. Thus at last, all other wolves update their positions X→(t+1) using Eq. (4). In spite of being new comer, GWO is being used in many real world applications such as a modified version of GWO algorithm was proposed and applied successfully for training q-gaussian radial basis functional link nets (Muangkote et al., 2014), A modified GWO algorithm named multi-verse optimizer (MVO) for solving various optimization problems was proposed (Mirjalili et al., 2016), The binary version of GWO algorithm was proposed to be used for feature selection which was one of the important and crucial modification of GWO algorithm (Emary et al., 2016), A multi-objective GWO was modeled to minimize the gases emission level of CO2 by the capacitor in which 30-bus system was used for the evaluation of the proposed method (Mohamed et al., 2015), GWO algorithm was used to optimize the controlling parameters of DC motor (Madadi and Motlagh, 2014), The flowshop scheduling problem of stage 2 was solved along with the optimization of its release time by using GWO algorithm (Komaki and Kayvanfar, 2015). 3. Chaotic grey wolf optimization algorithm In spite of having good convergence rate, GWO still cannot always perform that well in finding global optima which affect the convergence rate of the algorithm. So, to reduce this affect and improve its efficiency, CGWO algorithm is developed by introducing chaos in GWO algorithm itself. In general terms, chaos is a deterministic, random- like method found in non-linear, dynamical system, which is non-period, non-converging and bounded. Mathematically, chaos is randomness of a simple deterministic dynamical system and chaotic system may be considered as sources of randomness. In order to introduce chaos in optimization algorithms, different chaotic maps having different mathematical equations are used. Since last decade, chaotic maps have been widely appreciated in the field of optimization due to their dynamic behavior which help optimization algorithms in exploring the search space more dynamically and globally. At a recent time, in accordance with different human's realm a wide variety of chaotic maps designed by physicians, researchers and mathematicians are available in the optimization field (Table 1. Table 1 Details of chaotic maps applied on CGWO. S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 a Using a = 0.5 and b = 0.2, it generates chaotic sequence in (0, 1), a = control parameter, xk = chaotic number at iteration ‘k’. Open in new tab Table 1 Details of chaotic maps applied on CGWO. S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 a Using a = 0.5 and b = 0.2, it generates chaotic sequence in (0, 1), a = control parameter, xk = chaotic number at iteration ‘k’. Open in new tab In these chaotic maps, any number in the range [0, 1] (or according to the range of chaotic map) can be chosen as the initial value. However, it should be noted that the initial value may have significant impacts on the fluctuation pattern of some of the chaotic maps. This set of chaotic maps has been chosen with different behaviors, while the initial value is 0.7 for all (Fig. 2. Fig. 1. Open in new tabDownload slide Flowchart of optimization procedure of CGWO. Fig. 2. Open in new tabDownload slide Pseudocode of proposed CGWO algorithm. The optimization procedure of the proposed CGWO algorithm is also presented in the form of flow chart given in Fig. 1. In this, first step involves the stochastic initialization of population of grey wolves. Then, a chaotic map is chosen to be mapped with the algorithm along with the initialization of its first chaotic number and a variable (Gandomi and Yang, 2014). Sequentially, the parameters of the CGWO algorithm involved in conducting the exploration - exploitation mechanism viz. a, A and C are initialized which are same as in GWO. Fitness of all grey wolves initialized in the search space are evaluated using various standard benchmark functions and are sorted according to their fitness. The first wolf got after sorting is assumed to be α wolf and accordingly second and third wolves are assumed as β and δ wolf respectively. Sequentially, the fitter wolf will keep updating its position using Eq. (4) and may get the position of α wolf as optimal solution. The parameters' values are also updated along with the course of iterations using Eq. (3). At the end of the last iteration, fitness of α wolf will be considered as the most optimal solution of the problem found by the CGWO algorithm. 4. CGWO for constrained benchmark functions All the constrained problems are formulated in the form of two functions i.e. objective function and constraint violation function (Powell, 1978). Objective function is the function whose main aim is to find the optimal solution say ‘x’ in the specified search space. It can be represented as Eq. (5). minimizef(x),x=(x1,x2,x3,…,xn)∈Rn(5) where n is the number of dimensions contained in a solution. x∈F∈S where F is the feasible region in the search space S which defines a n-dimensional rectangle R (Karaboga and Basturk, 2007). This rectangle R has domains size in the form of lower bound (lb) and upper bound (ub) as represented in Eq. (6). lb(i)⩽x(i)⩽ub(i),1⩽i⩽n(6) and the number of constraints say ‘ m(m>0) ’ are defined in the F space is the form of Eq. (7). gj(x)⩽0,forj=1,…,q,hj(x)=0,forj=q+1,…,m(7) Here, gj(x) and hj(x) are called as inequality and equality constraints respectively. If any solution say ‘x’ satisfies the constraint gk or hk in F space, then gk is considered to be an active constraint at x. 5. Experimental study and discussion 5.1. Parameter settings Among all the complex methods to calculate the penalty of constraints like iterative method, methods based on feasibility of solutions, simple penalty function method is used in all the constrained optimization problems implemented and discussed in this paper (Joines and Houck, 1994). The population size of grey wolves is taken 30 and 100 iterations are performed for the results of all the constrained benchmark functions. 30 Monte Carlo runs are executed on each of the constrained benchmark functions. For effective validation of the proposed CGWO algorithm in case of constrained benchmark functions, it has been compared with some other optimization algorithms which are GWO (Mirjalili et al., 2014), FA (Yang and algorithm, 2010), FPA (Yang, 2012) and PSO (Kennedy, 2011). Additionally, parameter settings for all these algorithms need to be done for impartial comparison which is one of the difficult task to perform during the execution. The parameter settings done in this work is like for PSO, global learning (no local neighborhoods), an inertial constant = 0.3, a cognitive constant = 1 and a social constant for swarm interaction = 1 is used. For FPA, λ=1.5 for Levy distribution function and proximity probability p = 0.8 is used. For FA, randomization parameter α = 0.6, attractiveness β0 = 1 and absorption coefficient γ=1.0 is used. For GWO, two random vectors r1→ and r2→ are taken in the range of (0, 1) and the controlling parameter a→ has linearly decreasing values from 2 to 0 over the course of iterations. For CGWO, values of two vectors r1→ and r2→ are taken randomly in the range of (0, 1) and the controlling parameter a→ has linearly decreasing values from 2 to 0 over the course of iterations, chaotic function variables a = 0.5 and b = 0.2 is used. Additionally, the best map performer for constrained optimization problems as per results, i.e., chebyshev map has been used on CGWO constraint handling mechanism. CGWO is implemented in C++ and compiled using Qt Creator 2.4.1 (MinGW) under Microsoft Windows 7 operating system. All simulations are carried out on a computer with an Intel(R) Core(TM) i5-3210@2.50 Ghz CPU. 5.2. Results and discussion In order to evaluate the capability of proposed CGWO for handling constrained problems, a set of thirteen widely used constrained benchmark functions have been used (Table 1. All the problems consist in the set have various linear, non-linear and quadratic equations in the form of equality and inequality constraints which are represented in Table 2. To choose the best possible map for all the constrained optimization problems, all the selected maps are applied on all the constrained benchmark functions whose results are provided in Table 3. According to the results, chebyshev map showed promising performance by outperforming seven out of thirteen constrained benchmark functions and thus is chosen for further investigation of CGWO on constrained optimization problems. Table 2 Details of constrained benchmark functions. Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Open in new tab Table 2 Details of constrained benchmark functions. Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Open in new tab Table 3 Results of 10 chaotic maps on all constrained benchmark functions on CGWO. Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Open in new tab Table 3 Results of 10 chaotic maps on all constrained benchmark functions on CGWO. Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Open in new tab The results of all the constrained benchmark functions applied on CGWO and other algorithms are given in Table 4. It can be easily seen from the results that CGWO has handle seven out of thirteen constrained benchmark functions very efficiently and thus has outperformed all other algorithms for these seven benchmark functions. PSO handled three constrained functions well among all. GWO performed superior to other for two constrained functions. FA outperformed for only one constrained function. The reason was that the chaotic maps with a unimode centered around the middle of maps, tend to produce better results, and chebyshev map fall into this category and it is indeed very effective. The results have also revealed the significant improvement of the proposed CGWO algorithm with the application of deterministic chaotic signals in place of constant value. Table 4 Comparison results of all constrained benchmark functions. Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Open in new tab Table 4 Comparison results of all constrained benchmark functions. Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Open in new tab 5.3. Graphical analysis For further effective evaluation of performance of all the algorithms, graphical analysis has also been done. The line graphs of convergence of various constrained benchmark functions using CGWO algorithm and other algorithms viz. GWO, FA, FPA and PSO have been shown from Figs. 3–6 which help to analyze the convergence rate of each algorithm more effectively. All these graphs have been taken on 100 iterations to clearly notice and analyze the convergence of all the algorithms. Fig. 3. Open in new tabDownload slide Comparison of five optimization algorithms for the G1 constrained benchmark function in 100 iterations. Fig. 4. Open in new tabDownload slide Comparison of five optimization algorithms for the G2 constrained benchmark function in 100 iterations. Fig. 5. Open in new tabDownload slide Comparison of five optimization algorithms for the G9 constrained benchmark function in 100 iterations. Fig. 6. Open in new tabDownload slide Comparison of five optimization algorithms for the G13 constrained benchmark function in 100 iterations. Fig. 3 shows the line graphs of convergence of all the five optimization algorithms applied on G1 test constrained benchmark function. From the graph, it can be seen that CGWO has the best performance for this benchmark function. It is showing superior performance of CGWO by reaching the optima for this test function within 10 iterations only. Further, it can be concluded from this graph that the GWO and FPA performed well when compared with other algorithms. FA demonstrates poor convergence in most of the optimization process, however it eventually ends the value of PSO. Fig. 4 demonstrates the line graphs of convergence of CGWO along with all other algorithms for G2 constrained benchmark function in which it is easily remarkable that CGWO is fastest of all in context of convergence towards the optima than that of FA, FPA, GWO and PSO. PSO has shown a competitive performance to CGWO for this problem and exhibits significant performance in terms of convergence speed. GWO, FA and FPA show a faster convergence rate initially, however they seem to be trapped into sub-optimal values as the optimization procedure proceeds. Fig. 5 illustrates the convergence rate on G9 test constrained benchmark function in which CGWO is demonstrating the high rate of convergence as compared to FPA, FA, PSO and GWO. The convergence line graph of FPA and PSO is showing slow convergence by giving constant fitness values for many iterations in between the 100 iterations as they seem to be trapped into sub-optimal values as the procedure proceeds, especially FPA. This demonstrates how CGWO is capable of balancing exploration and exploitation to find the global optimum rapidly and effectively. Fig. 6 is presenting the graphical view of convergence of all algorithms on G13 test constrained benchmark function in which it can be clearly seen that CGWO algorithm is nearest to the global optima of this constrained problem among algorithms viz. FPA, FA, GWO and PSO and it also shows fastest convergence of all. GWO and FPA illustrate poorer convergence than the other algorithms in the initial iterations. However, the search process is progressively accelerated during iterations for these algorithms. This indicates that the performance of CGWO can be boosted by the chaotic maps in terms of not only exploration but also exploitation. 5.4. Statistical testing Statistical testing is a process of making quantitative decisions about a problem in which statistical data set is evaluated and taken which is then compared hypothetically (Wilcoxon et al., 1970). The statistical testing of the constrained benchmark functions applied on all algorithms involved in this paper has been done using a widely used non parametric test named Wilcoxon signed rank-test discussed in Section 5.4.1. 5.4.1. Wilcoxon signed rank-test Wilcoxon signed rank-test is a statistical method which is solely based on the order of the sample's observations (Table 5 and their rank summary is provided in Table 6. Results depict that CGWO possessed lowest rank among all other optimization algorithms used in comparison for most of the benchmark functions which proves the superior performance of CGWO among other in comparison. However, PSO and GWO competed with CGWO closely and ranked second and third respectively. The superior performance of CGWO doesn't mean that it is superior than all other optimization algorithms present in the literature which will also lead to the violation of ‘free lunch theorem’ (Ho and Pepyne, 2002). Its performance simply signifies that it is better than other algorithms taken in this work only. Table 5 Pair-wise Wilcoxon signed rank test results. Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Open in new tab Table 5 Pair-wise Wilcoxon signed rank test results. Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Open in new tab Table 6 Rank summary of statistical assessment results. Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Open in new tab Table 6 Rank summary of statistical assessment results. Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Open in new tab 6. CGWO for classical engineering design problems Engineering design is a process of satisfying the needs involved in building a product. It is a decision making process which consists of a complex objective function and a large number of decision variables such as weight, strength and wear (Askarzadeh, 2016). Meta-heuristic methods come into being as an alternative to the traditional optimization methods. With their merits of finding acceptable solutions in an affordable time and being tolerant of non-convex and non-differentiable, meta-heuristic algorithms have attracted great research interest during the recent years. In real design problems the number of design variables can be very large, and their influence on the objective function to be optimized can be very complicated, with a nonlinear character. Therefore, in this paper, design considerations of five classical engineering design problems viz. spring design problem, gear train design problem, welded beam design problem, pressure vessel design problem and closed coil helical spring design problem have been done in Sections 6.1, 6.2, 6.3, 6.4, 6.5. These problems contain various local optima, whereas only global optimum is required. These problems cannot be handled by traditional methods which focus on local optima only. Hence, there is a need for effective and efficient optimization methods for these engineering design problems. In this section, various experiments on these benchmark problems are implemented to verify the performance of the proposed meta-heuristic CGWO method. In order to get an unbiased comparison of CPU times, all the experiments are performed over 30 independent runs for 500 iterations. 6.1. Tension/Compression spring design problem The main goal of this engineering design problem is to minimize the weight of the spring involving three decision variables which are wire diameter (d) ⁠, mean coil diameter (D) and number of active coils (N) ⁠. This problem is subjected to three inequality constraints and an objective function given in Eq. (8). Considerx→=[x1x2x3]=[dDN],Minimizef(x→)=(x3+2)x2x12,Subject tog1(x→)=1-x23x3717,854x14⩽0,g2(x→)=4x22-x1x212,566(x2x13-x14)+15108x12⩽0,g3(x→)=1-140.45x1x22x3⩽0,g4(x→)=x1+x21.5-1⩽0,(8) Variable range0.05⩽x1⩽2.00,0.25⩽x2⩽1.30,2.00⩽x3⩽15.00 Fig. 7. Fig. 7. Open in new tabDownload slide Structure of tension/spring design (Rao et al., 2011). 6.2. Gear train design problem The goal of this engineering design problem is to minimize the cost of gear ratio of the gear train whose schematic diagram is shown in Fig. 8. This problem has no equality or inequality constraints except a boundary constraint. It consists of four decision variables represented as nA(x1) ⁠, nB(x2) ⁠, nD(x3) ⁠, nF(x4) using which the gear ratio can be formulated as nBnD/nFnA ⁠. Mathematical formulation of the objective function of gear train design problem along with its boundary constraint is given in Eq. (9). Min.f(x)=((1/6.931)-(x3x2/x1x4))2S.t.12⩽xi⩽60(9) Fig. 8. Open in new tabDownload slide Structure of gear train design (Rao et al., 2011). Table 9 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 10 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm with conventional GWO algorithm and with those found by other optimization algorithms. In terms of optimal results, CGWO outperforms GWO (Mirjalili et al., 2014), CSA (Askarzadeh, 2016), UPSO (Parsopoulos and Vrahatis, 2005), ABC (Akay and Karaboga, 2012) and MBA (Sadollah et al., 2013). In terms of mean, CGWO gives better value than those obtained by all other algorithms in comparison. Table 7 Optimal solution of spring design problem by CGWO algorithm. Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Open in new tab Table 7 Optimal solution of spring design problem by CGWO algorithm. Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Open in new tab Table 8 Comparison results of spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 N.A. – Not Available. Open in new tab Table 8 Comparison results of spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 N.A. – Not Available. Open in new tab Table 9 Optimal solution of gear train design problem by CGWO algorithm. Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Open in new tab Table 9 Optimal solution of gear train design problem by CGWO algorithm. Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Open in new tab Table 10 Comparison results of gear train design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 N.A. – Not Available. Open in new tab Table 10 Comparison results of gear train design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 N.A. – Not Available. Open in new tab 6.3. Welded beam design problem Welded beam design problem which is a minimization problem has four variables namely weld thickness (h) ⁠, length of bar attached to the weld (l) ⁠, bar's height (t) ⁠, bar's thickness (b) as shown in Fig. 9. The constraints included in this problem are bending stress (⁠ θ ⁠), beam deflection (⁠ δ ⁠), shear stress (⁠ τ ⁠), buckling load (⁠ Pc ⁠) and other side constraints. The mathematical formulas related to this problem are represented in Eq. (10). Considerx→=[x1x2x3x4]=[hltb],Minimizef(x→)=1.10471x12x2+0.04811x3x4(14.0+x2),Subject tog1(x→)=τ(x→)-τmax⩽0,g2(x→)=σ(x→)-σmax⩽0,g3(x→)=δ(x→)-δmax⩽0,g4(x→)=x1-x4⩽0,g5(x→)=P-Pc(x→)⩽0,g6(x→)=0.125-x1⩽0,g7(x→)=0.10471x12+0.04811x3x4(14.0+x2)-5.0⩽0(10) Variable range0.1⩽x1⩽2,0.1⩽x2⩽10,0.1⩽x3⩽10,0.1⩽x4⩽2 where τ(x→)=τ′2+2τ′τ″x22R+τ2″ τ′=P2x1x2 τ″=MRJ M=P(L+x22) R=x224+x1+x322 J=22x1x2x224x1+x322 σ(x→)=6PLx4x32 δ(x→)=4PL3Ex32x+x4 Pc(x→)=4.013Ex32x4636L21-x32LE4G P=6000lb,L=14in.,δmax=0.25in.,E=30×106psi, G=12×106psi,τmax=13,600psi,σmax=30,000psi Fig. 9. Open in new tabDownload slide Structure of welded beam design (Rao et al., 2011). Table 11 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 12 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, conventional GWO algorithm and with those found by other optimization algorithms. In terms of best result, CGWO outperforms GWO (Mirjalili et al., 2014), GA3 (Coello, 2000), GA4 (Coello and Montes, 2002), CPSO (He and Wang, 2007), SC (Ray and Liew, 2003), UPSO (Parsopoulos and Vrahatis, 2005) and CDE (He and Wang, 2007). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 11 Optimal solution of welded beam design problem by CGWO algorithm. Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Open in new tab Table 11 Optimal solution of welded beam design problem by CGWO algorithm. Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Open in new tab Table 12 Comparison results of welded beam design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 N.A. – Not Available. Open in new tab Table 12 Comparison results of welded beam design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 N.A. – Not Available. Open in new tab 6.4. Pressure vessel design problem Pressure vessel design problem is a classical engineering design problem whose main goal is to minimize the welding, manufacturing and material cost of the pressure vessel. There are a total of four decision variables involved in this problem which are thickness of shell (Ts) ⁠, thickness of head (Th) which are discrete decision variables, inner radius (R) and length of cylindrical section of the vessel (L) which are continuous decision variables. The diagrammatical representation of pressure vessel design problem is given in Fig. 10 showing variables of pressure vessel. The mathematical equations of the nonlinear objective function and constraints is represented in Eq. (11). The mentioned problem has four inequality constraints. Considerx→=[x1x2x3x4]=[TsThRL],Minimizef(x→)=0.6224x1x3x4+1.7781x2x32+3.1661x12x4+19.84x12x3,Subject tog1(x→)=-x1+0.0193x3⩽0,g2(x→)=-x2+0.00954x3⩽0,g3(x→)=-πx32x4-43πx33+1,296,000⩽0,g4(x→)=x4-240⩽0Variable range0⩽x1⩽100,0⩽x2⩽100,10⩽x3⩽200,10⩽x4⩽200(11) Fig. 10. Open in new tabDownload slide Structure of pressure vessel design problem (Rao et al., 2011). Table 13 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 14 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, conventional GWO algorithm and with those found by other optimization algorithms. It can be said from results that CGWO outperforms GWO (Mirjalili et al., 2014), (μ+λ)-ES (Mezura-Montes and Coello, 2005), CSA (Askarzadeh, 2016),HPSO (He and Wang, 2007), PSO-DE (Liu et al., 2010), ABC (Akay and Karaboga, 2012), TLBO (Rao et al., 2011), G-QPSO (dos Santos Coelho, 2010), QPSO (dos Santos Coelho, 2010), CDE (He and Wang, 2007), GA4 (Coello and Montes, 2002), CPSO (He and Wang, 2007), UPSO (Parsopoulos and Vrahatis, 2005), GA3 (Coello, 2000) and PSO (dos Santos Coelho, 2010). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 13 Optimal solution of pressure vessel design problem by CGWO algorithm. Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Open in new tab Table 13 Optimal solution of pressure vessel design problem by CGWO algorithm. Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Open in new tab Table 14 Comparison results of pressure vessel design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 N.A. – Not Available. Open in new tab Table 14 Comparison results of pressure vessel design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 N.A. – Not Available. Open in new tab 6.5. Closed coil helical spring design problem The main goal of this mechanical engineering design constrained problem is to minimize the volume of closed coil helical spring. Helical spring is made up of closed coiled wire having the shape of a helix and is intended for tensile and compressive load (Fig. 11, it can be seen that the coils of spring are so closed that the plane is at nearly right angles to the helix's axis and coil is subjected to torsion. This problem has a total of two decision variables namely coil diameter (D) and wire diameter (d) whose range is given in Eq. (13). The volume of the helical spring (U) can be minimized using the objective function given in Eq. (12). U=π24(Nc+2)Dd2(12) where 0.508⩽d⩽1.016,1.270⩽D⩽7.620,15⩽Nc⩽25(13) Fig. 11. Open in new tabDownload slide Structure of closed coil helical spring design problem (Savsani et al., 2010). The constrained problem of helical spring is subjected to eight constraints out of which first is stress constraint represented in Eq. (14) and second is configuration constraint given in Eq. (15). S-8CfFmaxDπd3⩾0,Cf=4C-14C-4+0.615C,C=Dd(14) Here, Fmax which is the maximum load and S, the shear stress allowed on the spring are set to 453.6 kg and 13288.02kgf/cm2 respectively. K=Gd48NcD3(15) where G is set to 808543.6kgf/cm2 and K is the spring constant. Next constraint is the length constraint expressed as given in Eq. (18) in which the maximum length lmax is equal to 35.56 cm. lf is the free length which can be calculated using Eq. (17). The deflection (δl) made in the spring due to maximum work load is also involved in calculating free length as given in Eq. (16). δl=FmaxK(16) lf=δl+1.05(Nc+2)d(17) lmax-lf⩾0(18) The wire diameter should also follow the constraint represented in Eq. (19) where dmin is set to 0.508 cm. d-dmin⩾0(19) The outer diameter of the coil (D) spring should also be less the maximum diameter specified (Dmax) which is 7.62 cm. Mathematically, it is expressed in Eq. (20). Dmax-(D+d)⩾0(20) The mean coil diameter (C) must also be at least three times the diameter of the wire as represented mathematically in Eq. (21). C-3⩾0(21) Next, the deflection occurs under the preload δp must also be less than its specified value δpm which is 15.24 cm as represented in Eq. (22). The preload deflection can be calculated using Eq. (23). δpm-δp⩾0(22) δp=FpK(23) Here, Fp is set to 136.08 kg. The combined deflection constraint is given in Eq. (24) which makes the deflection of the coil consistent with its length. lf-δp=Fmax-FpK-1.05(Nc+2)d⩾0(24) The next and the last constraint is subjected to the preload deflection of the spring which defines that it must be equal to its specified value (δω) which is equal to 3.175 cm. It is expressed in Eq. (25). Fmax-FpK-δω⩽0(25) Table 15 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 16 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, GWO algorithm and with those found by other optimization algorithms. It can be said from results that CGWO outperforms GWO (Mirjalili et al., 2014), DTLBO (Thamaraikannan and Thirunavukkarasu, 2014), TLBO (Rao et al., 2011), Conventional method (Hinze, 2005), PSO (He et al., 2004), ABS (Thamaraikannan and Thirunavukkarasu, 2014) and GA (Das and Pratihar, 2002). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 15 Optimal solution of closed coil helical spring design problem by CGWO algorithm. Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Open in new tab Table 15 Optimal solution of closed coil helical spring design problem by CGWO algorithm. Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Open in new tab Table 16 Comparison results of closed coil helical spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. N.A. – Not Available. Open in new tab Table 16 Comparison results of closed coil helical spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. N.A. – Not Available. Open in new tab 7. Conclusion and future scope The chaos theory and Grey Wolf Optimizer (GWO) are hybridized in order to design an improved meta-heuristic Chaotic Grey Wolf Optimization (CGWO) algorithm for constrained optimization problems. Various chaotic maps are used to regulate the key parameter, a, of GWO. The proposed CGWO is validated on thirteen constrained benchmark functions and five constrained engineering design problems. The chebyshev map is selected as its a through comparing various chaotic GWO variants to form the best CGWO. The simulations showed that the usage of deterministic chaotic signals instead of linearly decreasing values is an important modification of the GWO algorithm. Statistical results and success rates of the CGWO suggest that the tuned GWO clearly improves the reliability of the global optimality and they also enhanced the quality of the results. In comparison with other algorithms viz. FA, FPA, GWO and PSO, it seems the CGWO performed significantly well. The results of CGWO on constrained engineering problems showed its applicability for the complex real-world problems. The main reason of the superior performance of CGWO lies behind the chaos induced by the chaotic maps in the search space. This chaos helps the controlling parameter to find the optimal solution more quickly and thus refine the convergence rate of the algorithm. So, it can be easily concluded here that proposed CGWO can handle constrained problems effectively and efficiently. Further investigation on convergence analysis may prove fruitful. In addition, further topics of studies can also focus on the extension of the CGWO to solve mixed-type problems and discrete optimization problems. References Akay , B. , & Karaboga , D. ( 2012 ). Artificial bee colony algorithm for large-scale problems and engineering design optimization . Journal of Intelligent Manufacturing , 23 ( 4 ), 1001 – 1014 . Google Scholar Crossref Search ADS WorldCat Alavi , A. H. , & Gandomi , A. H. ( 2011 ). Prediction of principal ground-motion parameters using a hybrid method coupling artificial neural networks and simulated annealing . Computers & Structures , 89 ( 23 ), 2176 – 2194 . Google Scholar Crossref Search ADS WorldCat Alavi , A. H. , Gandomi , A. H., Sahab , M. G., & Gandomi , M. ( 2010 ). Multi expression programming: A new approach to formulation of soil classification . Engineering with Computers , 26 ( 2 ), 111 – 118 . Google Scholar Crossref Search ADS WorldCat Arora , S. , & Singh , S. ( 2013 ). A conceptual comparison of firefly algorithm, bat algorithm and cuckoo search. In International conference on control computing communication & materials (ICCCCM), 2013 (pp. 1 – 4 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Arora , S. , & Singh , S. ( 2015 ). Butterfly algorithm with levy flights for global optimization. In 2015 International conference on signal processing, computing and control (ISPCC) (pp. 220 – 224 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Arora , S. , & Singh , S. ( 2017 ). An improved butterfly optimization algorithm with chaos . Journal of Intelligent & Fuzzy Systems , 32 ( 1 ), 1079 – 1088 . Google Scholar Crossref Search ADS WorldCat Askarzadeh , A. ( 2016 ). A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm . Computers & Structures , 169 , 1 – 12 . Google Scholar Crossref Search ADS WorldCat Cagnina , L. C. , Esquivel , S. C., & Coello , C. A. C. ( 2008 ). Solving engineering optimization problems with the simple constrained particle swarm optimizer . Informatica , 32 ( 3 ). Coello , C. A. C. ( 2000 ). Use of a self-adaptive penalty approach for engineering optimization problems . Computers in Industry , 41 ( 2 ), 113 – 127 . Google Scholar Crossref Search ADS WorldCat Coello , C. A. C. , & Montes , E. M. ( 2002 ). Constraint-handling in genetic algorithms through the use of dominance-based tournament selection . Advanced Engineering Informatics , 16 ( 3 ), 193 – 203 . Google Scholar Crossref Search ADS WorldCat Das , A. , & Pratihar , D. ( 2002 ). Optimal design of machine elements using a genetic algorithm . Journal of the Institution of Engineers (India), Part MC, Mechanical Engineering Division , 83 ( 3 ), 97 – 104 . Google Scholar OpenURL Placeholder Text WorldCat Deb , K. ( 2000 ). An efficient constraint handling method for genetic algorithms . Computer Methods in Applied Mechanics and Engineering , 186 ( 2 ), 311 – 338 . Google Scholar Crossref Search ADS WorldCat dos Santos Coelho , L. ( 2010 ). Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems . Expert Systems with Applications , 37 ( 2 ), 1676 – 1683 . Google Scholar Crossref Search ADS WorldCat Emary , E. , Zawbaa , H. M., & Hassanien , A. E. ( 2016 ). Binary grey wolf optimization approaches for feature selection . Neurocomputing , 172 , 371 – 381 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , & Yang , X. -S. ( 2014 ). Chaotic bat algorithm . Journal of Computational Science , 5 ( 2 ), 224 – 232 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. , Yang , X. -S., Talatahari , S., & Alavi , A. ( 2013 ). Firefly algorithm with chaos . Communications in Nonlinear Science and Numerical Simulation , 18 ( 1 ), 89 – 98 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , Yun , G. J., Yang , X. -S., & Talatahari , S. ( 2013 ). Chaos-enhanced accelerated particle swarm optimization . Communications in Nonlinear Science and Numerical Simulation , 18 ( 2 ), 327 – 340 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , Yang , X. -S., Alavi , A. H., & Talatahari , S. ( 2013 ). Bat algorithm for constrained optimization tasks . Neural Computing and Applications , 22 ( 6 ), 1239 – 1255 . Google Scholar Crossref Search ADS WorldCat Gao , X. -Z. , Jokinen , T., Wang , X., Ovaska , S. J., & Arkkio , A. ( 2010 ). A new harmony search method in optimal wind generator design. In 2010 XIX international conference on electrical machines (ICEM) (pp. 1 – 6 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Gao , X. -Z. , Wang , X., Ovaska , S. J., & Xu , H. ( 2010 ). A modified harmony search method in constrained optimization . International Journal of Innovative Computing, Information and Control , 6 ( 9 ), 4235 – 4247 . Google Scholar OpenURL Placeholder Text WorldCat Han , X. , & Chang , X. ( 2012 ). A chaotic digital secure communication based on a modified gravitational search algorithm filter . Information Sciences , 208 , 14 – 27 . Google Scholar Crossref Search ADS WorldCat Han , X. , & Chang , X. ( 2013 ). An intelligent noise reduction method for chaotic signals based on genetic algorithms and lifting wavelet transforms . Information Sciences , 218 , 103 – 118 . Google Scholar Crossref Search ADS WorldCat He , Q. , & Wang , L. ( 2007 ). An effective co-evolutionary particle swarm optimization for constrained engineering design problems . Engineering Applications of Artificial Intelligence , 20 ( 1 ), 89 – 99 . Google Scholar Crossref Search ADS WorldCat He , Q. , & Wang , L. ( 2007 ). A hybrid particle swarm optimization with a feasibility-based rule for constrained optimization . Applied Mathematics and Computation , 186 ( 2 ), 1407 – 1422 . Google Scholar Crossref Search ADS WorldCat He , D. , He , C., Jiang , L. -G., Zhu , H. -W., & Hu , G. -R. ( 2001 ). Chaotic characteristics of a one-dimensional iterative map with infinite collapses . IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications , 48 ( 7 ), 900 – 906 . Google Scholar Crossref Search ADS WorldCat He , S. , Prempain , E., & Wu , Q. ( 2004 ). An improved particle swarm optimizer for mechanical design optimization problems . Engineering Optimization , 36 ( 5 ), 585 – 605 . Google Scholar Crossref Search ADS WorldCat Herskovits , J. ( 1986 ). A two-stage feasible directions algorithm for nonlinear constrained optimization . Mathematical Programming , 36 ( 1 ), 19 – 38 . Google Scholar Crossref Search ADS WorldCat Hinze , M. ( 2005 ). A variational discretization concept in control constrained optimization: The linear-quadratic case . Computational Optimization and Applications , 30 ( 1 ), 45 – 61 . Google Scholar Crossref Search ADS WorldCat Ho , Y. -C. , & Pepyne , D. L. ( 2002 ). Simple explanation of the no free lunch theorem of optimization . Cybernetics and Systems Analysis , 38 ( 2 ), 292 – 298 . Google Scholar Crossref Search ADS WorldCat Homaifar , A. , Qi , C. X., & Lai , S. H. ( 1994 ). Constrained optimization via genetic algorithms . Simulation , 62 ( 4 ), 242 – 253 . Google Scholar Crossref Search ADS WorldCat Jia , D. , Zheng , G., & Khan , M. K. ( 2011 ). An effective memetic differential evolution algorithm based on chaotic local search . Information Sciences , 181 ( 15 ), 3175 – 3187 . Google Scholar Crossref Search ADS WorldCat Joines , J. A. , & Houck , C. R. ( 1994 ). On the use of non-stationary penalty functions to solve nonlinear constrained optimization problems with GA's. In Proceedings of the first IEEE conference on evolutionary computation, 1994. IEEE world congress on computational intelligence . (pp. 579 – 584 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Karaboga , D. , & Akay , B. ( 2011 ). A modified artificial bee colony (abc) algorithm for constrained optimization problems . Applied Soft Computing , 11 ( 3 ), 3021 – 3031 . Google Scholar Crossref Search ADS WorldCat Karaboga , D. , & Basturk , B. ( 2007 ). Artificial bee colony (abc) optimization algorithm for solving constrained optimization problems. In Foundations of fuzzy logic and soft computing (pp. 789 – 798 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Kennedy , J. ( 2011 ). Particle swarm optimization. In Encyclopedia of machine learning (pp. 760 – 766 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Komaki , G. M. , & Kayvanfar , V. ( 2015 ). Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time . Journal of Computational Science , 8 , 109 – 120 . Google Scholar Crossref Search ADS WorldCat Lee , K. S. , & Geem , Z. W. ( 2004 ). A new structural optimization method based on the harmony search algorithm . Computers & Structures , 82 ( 9 ), 781 – 798 . Google Scholar Crossref Search ADS WorldCat Liu , H. , Cai , Z., & Wang , Y. ( 2010 ). Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization . Applied Soft Computing , 10 ( 2 ), 629 – 640 . Google Scholar Crossref Search ADS WorldCat Luo , Q. , Ma , M., & Zhou , Y. ( 2016 ). A novel animal migration algorithm for global numerical optimization . Computer Science and Information Systems , 13 ( 1 ), 259 – 285 . Google Scholar Crossref Search ADS WorldCat Madadi , A. , & Motlagh , M. M. ( 2014 ). Optimal control of dc motor using grey wolf optimizer algorithm . TJEAS Journal , 4 ( 4 ), 373 – 379 . Google Scholar OpenURL Placeholder Text WorldCat Meng , X. -B. , Gao , X., Lu , L., Liu , Y., & Zhang , H. ( 2015 ). A new bio-inspired optimisation algorithm: Bird swarm algorithm . Journal of Experimental & Theoretical Artificial Intelligence , 1 – 15 . Meng , Z. , Pan , J. -S., & Alelaiwi , A. ( 2016 ). A new meta-heuristic ebb-tide-fish-inspired algorithm for traffic navigation . Telecommunication Systems , 62 ( 2 ), 403 – 415 . Google Scholar Crossref Search ADS WorldCat Mezura-Montes , E. , & Coello , C. A. C. ( 2005 ). A simple multimembered evolution strategy to solve constrained optimization problems . IEEE Transactions on Evolutionary Computation , 9 ( 1 ), 1 – 17 . Google Scholar Crossref Search ADS WorldCat Mezura-Montes , E. , & Coello , C. A. C. ( 2005 ). Useful infeasible solutions in engineering optimization with evolutionary algorithms. In MICAI 2005: Advances in artificial intelligence (pp. 652 – 662 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Mirjalili , S. , & Lewis , A. ( 2016 ). The whale optimization algorithm . Advances in Engineering Software , 95 , 51 – 67 . Google Scholar Crossref Search ADS WorldCat Mirjalili , S. , Mirjalili , S. M., & Lewis , A. ( 2014 ). Grey wolf optimizer . Advances in Engineering Software , 69 , 46 – 61 . Google Scholar Crossref Search ADS WorldCat Mirjalili , S. , Mirjalili , S. M., & Hatamlou , A. ( 2016 ). Multi-verse optimizer: A nature-inspired algorithm for global optimization . Neural Computing and Applications , 27 ( 2 ), 495 – 513 . Google Scholar Crossref Search ADS WorldCat Mohamed , A. -A. A. , El-Gaafary , A. A., Mohamed , Y. S., & Hemeida , A. M. ( 2015 ). Energy management with capacitor placement for economics low carbon emissions using modified multi-objective grey wolf optimizer . Muangkote , N. , Sunat , K., & Chiewchanwattana , S. ( 2014 ). An improved grey wolf optimizer for training q-Gaussian radial basis functional-link nets. In 2014 International computer science and engineering conference (ICSEC) (pp. 209 – 214 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Parsopoulos , K. E. , & Vrahatis , M. N. ( 2002 ). Particle swarm optimization method for constrained optimization problems . Intelligent Technologies–Theory and Application: New Trends in Intelligent Technologies , 76 , 214 – 220 . Google Scholar OpenURL Placeholder Text WorldCat Parsopoulos , K. E. , & Vrahatis , M. N. ( 2005 ). Unified particle swarm optimization for solving constrained engineering optimization problems. In Advances in natural computation (pp. 582 – 591 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Pecora , L. M. , & Carroll , T. L. ( 1990 ). Synchronization in chaotic systems . Physical Review Letters , 64 ( 8 ), 821 . Google Scholar Crossref Search ADS PubMed WorldCat Powell , M. J. ( 1978 ). A fast algorithm for nonlinearly constrained optimization calculations. In Numerical analysis (pp. 144 – 157 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Rao , R. ( 2016 ). Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems . International Journal of Industrial Engineering Computations , 7 ( 1 ), 19 – 34 . Google Scholar OpenURL Placeholder Text WorldCat Rao , R. V. , Savsani , V. J., & Vakharia , D. ( 2011 ). Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems . Computer-Aided Design , 43 ( 3 ), 303 – 315 . Google Scholar Crossref Search ADS WorldCat Ray , T. , & Liew , K. M. ( 2003 ). Society and civilization: An optimization algorithm based on the simulation of social behavior . IEEE Transactions on Evolutionary Computation , 7 ( 4 ), 386 – 396 . Google Scholar Crossref Search ADS WorldCat Sadollah , A. , Bahreininejad , A., Eskandar , H., & Hamdi , M. ( 2013 ). Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems . Applied Soft Computing , 13 ( 5 ), 2592 – 2612 . Google Scholar Crossref Search ADS WorldCat Saremi , S. , Mirjalili , S., & Lewis , A. ( 2014 ). Biogeography-based optimisation with chaos . Neural Computing and Applications , 25 ( 5 ), 1077 – 1097 . Google Scholar Crossref Search ADS WorldCat Savsani , V. , Rao , R., & Vakharia , D. ( 2010 ). Optimal weight design of a gear train using particle swarm optimization and simulated annealing algorithms . Mechanism and Machine Theory , 45 ( 3 ), 531 – 541 . Google Scholar Crossref Search ADS WorldCat Shi , Y. ( 2015 ). An optimization algorithm based on brainstorming process . Emerging Research on Swarm Intelligence and Algorithm Optimization , 1 – 35 . Shi , Y. , & Eberhart , R. ( 1998 ). A modified particle swarm optimizer. In The 1998 IEEE international conference on evolutionary computation proceedings, 1998. IEEE world congress on computational intelligence (pp. 69 – 73 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Talatahari , S. , Azar , B. F., Sheikholeslami , R., & Gandomi , A. ( 2012 ). Imperialist competitive algorithm combined with chaos for global optimization . Communications in Nonlinear Science and Numerical Simulation , 17 ( 3 ), 1312 – 1319 . Google Scholar Crossref Search ADS WorldCat Thamaraikannan , B. , & Thirunavukkarasu , V. ( 2014 ). Design optimization of mechanical components using an enhanced teaching-learning based optimization algorithm with differential operator . Mathematical Problems in Engineering , 2014. 10 pages. Wang , G. -G. , Guo , L., Gandomi , A. H., Hao , G. -S., & Wang , H. ( 2014 ). Chaotic krill herd algorithm . Information Sciences , 274 , 17 – 34 . Google Scholar Crossref Search ADS WorldCat Wang , G. -G. , Guo , L., Duan , H., & Wang , H. ( 2014 ). A new improved firefly algorithm for global numerical optimization . Journal of Computational and Theoretical Nanoscience , 11 ( 2 ), 477 – 485 . Google Scholar Crossref Search ADS WorldCat Wilcoxon , F. , Katti , S., & Wilcox , R. A. ( 1970 ). Critical values and probability levels for the Wilcoxon rank sum test and the Wilcoxon signed rank test . Selected Tables in Mathematical Statistics , 1 , 171 – 259 . Google Scholar OpenURL Placeholder Text WorldCat Yang , X. -S. ( 2012 ). Flower pollination algorithm for global optimization. In Unconventional computation and natural computation (pp. 240 – 249 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Yang , X. -S. , & algorithm , Firefly ( 2010 ). Levy flights and global optimization. In Research and development in intelligent systems XXVI (pp. 209 – 218 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Yang , D. , Li , G., & Cheng , G. ( 2007 ). On the efficiency of chaos optimization algorithms for global optimization . Chaos, Solitons & Fractals , 34 ( 4 ), 1366 – 1375 . Google Scholar Crossref Search ADS WorldCat Yang , X. -S. , Gandomi , A. H., Talatahari , S., & Alavi , A. H. ( 2012 ). Metaheuristics in water, geotechnical and transport engineering . Newnes . Footnotes Peer review under responsibility of Society for Computational Design and Engineering. Society for Computational Design and Engineering This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Society for Computational Design and Engineering

Loading next page...
 
/lp/oxford-university-press/chaotic-grey-wolf-optimization-algorithm-for-constrained-optimization-VWntMkM0LI

References (75)

Publisher
Oxford University Press
Copyright
Copyright © 2022 Society for Computational Design and Engineering
ISSN
2288-4300
eISSN
2288-5048
DOI
10.1016/j.jcde.2017.02.005
Publisher site
See Article on Publisher Site

Abstract

Abstract The Grey Wolf Optimizer (GWO) algorithm is a novel meta-heuristic, inspired from the social hunting behavior of grey wolves. This paper introduces the chaos theory into the GWO algorithm with the aim of accelerating its global convergence speed. Firstly, detailed studies are carried out on thirteen standard constrained benchmark problems with ten different chaotic maps to find out the most efficient one. Then, the chaotic GWO is compared with the traditional GWO and some other popular meta-heuristics viz. Firefly Algorithm, Flower Pollination Algorithm and Particle Swarm Optimization algorithm. The performance of the CGWO algorithm is also validated using five constrained engineering design problems. The results showed that with an appropriate chaotic map, CGWO can clearly outperform standard GWO, with very good performance in comparison with other algorithms and in application to constrained optimization problems. Highlights Chaos has been introduced to the GWO to develop Chaotic GWO for global optimization. Ten chaotic maps have been investigated to tune the key parameter ‘a’, of GWO. Effectiveness of the algorithm is tested on many constrained benchmark functions. Results show CGWO's better performance over other nature-inspired optimization methods. The proposed CGWO is also used for some engineering design applications. Graphical Abstract Open in new tabDownload slide Chaotic grey wolf optimization, Firefly algorithm, Flower pollination algorithm, Particle swarm optimization algorithm 1. Introduction Constraints represent a feasible region which is nonempty and is filled with some restrictions or constraints to be followed by the solutions to solve a specific optimization problem (Karaboga and Akay, 2011). In general terms, constraints can be classified into equality constraints and inequality constraints which are represented in the form of mathematical equality and inequality equations respectively. Both types of constraints need to be satisfied by the problem's decision variables. Earlier, some deterministic methods like feasible direction approach and generalized gradient descent method were developed for solving constraint problems (Herskovits, 1986). However, due to their limited applicability and complexity of constraints, these were not effective for real world applications like structural optimization problems, economical optimization, location problems and engineering design problems like spring design, welded beam design, truss design, speed reducer design which involve many difficult equality and inequality constraints to be satisfied (Cagnina et al., 2008; Coello, 2000; Gandomi et al., 2013; Gao et al., 2010; Lee and Geem, 2004; Parsopoulos and Vrahatis, 2002). More and more meta-heuristic algorithms have been proposed to tackle these tough constrained optimization problems. These algorithms aim for tolerable velocity of convergence, a better precision, robustness, and performance. Some of the recent meta-heuristic algorithms proposed are Firefly Algorithm (FA) which is inspired by the flashing and attraction behavior of fireflies (Arora and Singh, 2013; Wang et al., 2014), Flower Pollination Algorithm (FPA) which is based on the characteristics of flowering of plants (Yang, 2012), Particle Swarm Optimization (PSO) which is inspired by the swarm behavior such as fish and bird schooling in nature (Shi and Eberhart, 1998), Bird Swarm Algorithm (BSA) which is based on the unique social interactions of bird swarms (Meng et al., 2015), ebb-tide-fish-inspired (ETFI) algorithm which is a simulation of fascinating characteristic of fish's perception of flow, sound and vibrations of tides in water (Meng et al., 2016), Jaya algorithm in which the main concept is to move the solution found so far towards the best solution and away from the worst solution (Rao, 2016), Grey Wolf Optimization (GWO) algorithm which is based on the social hunting behavior of grey wolves, Animal Migration Optimization (AMO) algorithm whose optimization process is mainly divided into two process viz. migration process and updating process with respect to animals (Luo et al., 2016), Butterfly Optimization Algorithm (BOA) which is inspired by the food foraging behavior of butterflies (Arora and Singh, 2015), Brain Storm Optimization (BSO) algorithm which is based on the simulation of brain storming process in humans (Shi, 2015), Whale Optimization Algorithm (WOA) which is inspired from the social interaction of humpback whales (Mirjalili and Lewis, 2016), Crow Search Algorithm (CSA) which mimics the clever characteristic of crows (Askarzadeh, 2016). Such meta-heuristics are being used extensively to solve complex problems like optimal wind generator design problem (Gao et al., 2010), formulation of soil classification (Alavi et al., 2010), prediction of ground soil parameters (Alavi and Gandomi, 2011). Some of the prominent metaheuristics of the literature which have already been used to tackle constrained problems are: Deb introduced a method to handle constraints using GA (Deb, 2000), Montes employed Differential Evolution (DE) Algorithm on constraint handling problems (Mezura-Montes and Coello, 2005), PSO was used by Cagnina to solve constrained optimization problems (Cagnina et al., 2008) and Karaboga used Artificial Bee Colony (ABC) algorithm to handle constraint mechanism (Karaboga and Basturk, 2007). GWO algorithm in fact, is a new meta-heuristic algorithm, inspired by the leadership behavior and unique mechanism of hunting of grey wolves. This population based meta-heuristic has the ability to avoid local optima stagnation to some extent (Yang et al., 2012). It also has good convergence ability towards the optima. In general, GWO advances itself strongly to exploitation. However, it cannot always implement global search well. Thus, in some cases, GWO fails to find global optimal solution. The search strategy used in basic GWO is mainly based on random walks. Thus, it cannot always deal with the problem successfully. With the development of the nonlinear dynamics, chaos theory has been widely used in several applications (Pecora and Carroll, 1990). In this context, one of the most famous applications is the introduction of chaos theory into the optimization methods (Yang et al., 2007). Up to now, the chaos theory has been successfully combined with several meta-heuristic optimization methods (Gandomi et al., 2013). Some major efforts in this area includes PSO (Gandomi et al., 2013), FA (Gandomi et al., 2013), BOA (Arora and Singh, 2017), GA (Han and Chang, 2013), hybridizing chaotic sequences with memetic differential evolution algorithm (Jia et al., 2011), imperialist competitive algorithm (Talatahari et al., 2012) and gravitational search algorithm (Han and Chang, 2012), Krill Herd (KH) algorithm (Wang et al., 2014), and Accelerated Particle Swarm Optimization (APSO) (Gandomi et al., 2013). In the present study, chaotic GWO (CGWO) algorithm is presented for the purpose of accelerating the convergence of GWO. Various one-dimensional chaotic maps are employed in place of the critical parameters used in GWO. Moreover, in order to examine the efficiency of the proposed CGWO in the room of constraint handling mechanism, it has been applied on some constrained benchmark functions and various classical engineering design problems viz. spring design problem, gear train design problem, welded beam design problem, pressure vessel design problem and closed coil helical spring design problem. The results of proposed CGWO on all the constrained benchmark functions have been compared with those obtained by GWO (Mirjalili et al., 2014), Firefly Algorithm (FA) (Yang and algorithm, 2010), Flower Pollination Algorithm (FPA) (Yang, 2012) and Particle Swarm Optimization (PSO) (Kennedy, 2011). On the other hand, the simulation results of all the classical engineering design problems have been compared with other state-of-art meta-heuristics discussed in the respective section. Organization of the remaining paper is as follows: Section 2: a brief introduction of GWO algorithm is given. Section 3: Description of the proposed CGWO algorithm is given in detail. Section 4: Validation of CGWO algorithm on thirteen constrained benchmark functions is performed. Section 5: Experimental study and discussion on results is done. Section 6: CGWO on various classical engineering design problems is described. Section 7: Conclusion of work along with its future scope is given. 2. Overview of grey wolf optimization algorithm Initially, Grey Wolf Optimizer (GWO) was introduced by S. Mirjalilli in year 2014 (Mirjalili et al., 2014). This algorithm is a simulation of unique hunting and searching prey characteristics of grey wolves. GWO has assumed the four level social hierarchy of grey wolves involving α at first, β at second, δ at third and ω wolves at last level. α wolves are the leader wolves managing and conducting the whole pack of grey wolves. It is also responsible for controlling the whole hunting process, taking all types of decisions like hunting, maintaining discipline, sleeping and waking time for whole pack. β wolf which is the best candidate to be the α ⁠, takes feedback from other wolves and give it to the α leader. The third level of grey wolves, i.e. δ wolves, dominate the wolves of forth and the last level called the ω wolves which are responsible for maintaining the safety and integrity in the wolf pack (Mirjalili et al., 2014). The distances from α ⁠, β and δ wolves i.e. Dα ⁠, Dβ and Dδ to each of the remaining wolf (X→) are calculated using Eq. (1) using which the effect of α ⁠, β and δ wolves on the prey viz. X1→ ⁠, X2→ and X3→ can be calculated as represented in Eq. (2). Dα→=C1→·Xα→-X→,Dβ→=C2→·Xβ→-X→,Dδ→=C3→·Xδ→-X→(1) X1→=Xα→-A1→·Dα→,X2→=Xβ→-A2→·Dβ→,X3→=Xδ→-A3→·Dδ→(2) A→=2a→·r1→-a→,C→=2·r2→(3) X→(t+1)=X1→+X2→+X3→3(4) The values of controlling parameters of the algorithm which are a ⁠, A and C are calculated using Eq. (3). Here, r1→ and r2→ are the random vectors in the range of [0,1] ⁠. These vectors make wolves able to reach at any point between the prey and the wolf. Vector a→ is involved in controlling activity of the GWO algorithm and used in calculating A→ ⁠. The component values of a→ vector decreases linearly from 2 to 0 over the courses of iterations (Mirjalili et al., 2014). C→ helps in putting some extra weight on the prey to make it difficult for the wolves to find it. Thus at last, all other wolves update their positions X→(t+1) using Eq. (4). In spite of being new comer, GWO is being used in many real world applications such as a modified version of GWO algorithm was proposed and applied successfully for training q-gaussian radial basis functional link nets (Muangkote et al., 2014), A modified GWO algorithm named multi-verse optimizer (MVO) for solving various optimization problems was proposed (Mirjalili et al., 2016), The binary version of GWO algorithm was proposed to be used for feature selection which was one of the important and crucial modification of GWO algorithm (Emary et al., 2016), A multi-objective GWO was modeled to minimize the gases emission level of CO2 by the capacitor in which 30-bus system was used for the evaluation of the proposed method (Mohamed et al., 2015), GWO algorithm was used to optimize the controlling parameters of DC motor (Madadi and Motlagh, 2014), The flowshop scheduling problem of stage 2 was solved along with the optimization of its release time by using GWO algorithm (Komaki and Kayvanfar, 2015). 3. Chaotic grey wolf optimization algorithm In spite of having good convergence rate, GWO still cannot always perform that well in finding global optima which affect the convergence rate of the algorithm. So, to reduce this affect and improve its efficiency, CGWO algorithm is developed by introducing chaos in GWO algorithm itself. In general terms, chaos is a deterministic, random- like method found in non-linear, dynamical system, which is non-period, non-converging and bounded. Mathematically, chaos is randomness of a simple deterministic dynamical system and chaotic system may be considered as sources of randomness. In order to introduce chaos in optimization algorithms, different chaotic maps having different mathematical equations are used. Since last decade, chaotic maps have been widely appreciated in the field of optimization due to their dynamic behavior which help optimization algorithms in exploring the search space more dynamically and globally. At a recent time, in accordance with different human's realm a wide variety of chaotic maps designed by physicians, researchers and mathematicians are available in the optimization field (Table 1. Table 1 Details of chaotic maps applied on CGWO. S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 a Using a = 0.5 and b = 0.2, it generates chaotic sequence in (0, 1), a = control parameter, xk = chaotic number at iteration ‘k’. Open in new tab Table 1 Details of chaotic maps applied on CGWO. S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 S. no. . Map name . Map equation . 1 Bernoulli map xk+1=xk1-a0⩽xk⩽axk-(1-a)a(1-a)⩽xk⩽1 2 Logistic map xk+1=a.xk(1-xk) 3 Chebyshev map xk+1=cos(a·cos-1xk) 4 Circle mapa xk+1=xk+b-a2πsin2πxkmod(1) 5 Cubic map xk+1=ρ1-xk2,xk∈(0,1) 6 Iterative chaotic map with infinite collapses (ICMIC) map xk+1=abssinaxk,a∈(0,1) 7 Piecewise map xk+1=xka0⩽xk⩽axk-a0.5-aa⩽xk⩽0.51-a-xk0.5-a0.5⩽xk⩽1-a1-xka1-a⩽xk⩽1 8 Singer map xk+1=a(7.86xk-23.31xk2+28.75xk3-13.302875xk4) 9 Sinusoidal map xk+1=a.xk2sin(πxk) 10 Tent map xk+1=xk/0.7xk<0.710/3(1-xk)xk⩾0.7 a Using a = 0.5 and b = 0.2, it generates chaotic sequence in (0, 1), a = control parameter, xk = chaotic number at iteration ‘k’. Open in new tab In these chaotic maps, any number in the range [0, 1] (or according to the range of chaotic map) can be chosen as the initial value. However, it should be noted that the initial value may have significant impacts on the fluctuation pattern of some of the chaotic maps. This set of chaotic maps has been chosen with different behaviors, while the initial value is 0.7 for all (Fig. 2. Fig. 1. Open in new tabDownload slide Flowchart of optimization procedure of CGWO. Fig. 2. Open in new tabDownload slide Pseudocode of proposed CGWO algorithm. The optimization procedure of the proposed CGWO algorithm is also presented in the form of flow chart given in Fig. 1. In this, first step involves the stochastic initialization of population of grey wolves. Then, a chaotic map is chosen to be mapped with the algorithm along with the initialization of its first chaotic number and a variable (Gandomi and Yang, 2014). Sequentially, the parameters of the CGWO algorithm involved in conducting the exploration - exploitation mechanism viz. a, A and C are initialized which are same as in GWO. Fitness of all grey wolves initialized in the search space are evaluated using various standard benchmark functions and are sorted according to their fitness. The first wolf got after sorting is assumed to be α wolf and accordingly second and third wolves are assumed as β and δ wolf respectively. Sequentially, the fitter wolf will keep updating its position using Eq. (4) and may get the position of α wolf as optimal solution. The parameters' values are also updated along with the course of iterations using Eq. (3). At the end of the last iteration, fitness of α wolf will be considered as the most optimal solution of the problem found by the CGWO algorithm. 4. CGWO for constrained benchmark functions All the constrained problems are formulated in the form of two functions i.e. objective function and constraint violation function (Powell, 1978). Objective function is the function whose main aim is to find the optimal solution say ‘x’ in the specified search space. It can be represented as Eq. (5). minimizef(x),x=(x1,x2,x3,…,xn)∈Rn(5) where n is the number of dimensions contained in a solution. x∈F∈S where F is the feasible region in the search space S which defines a n-dimensional rectangle R (Karaboga and Basturk, 2007). This rectangle R has domains size in the form of lower bound (lb) and upper bound (ub) as represented in Eq. (6). lb(i)⩽x(i)⩽ub(i),1⩽i⩽n(6) and the number of constraints say ‘ m(m>0) ’ are defined in the F space is the form of Eq. (7). gj(x)⩽0,forj=1,…,q,hj(x)=0,forj=q+1,…,m(7) Here, gj(x) and hj(x) are called as inequality and equality constraints respectively. If any solution say ‘x’ satisfies the constraint gk or hk in F space, then gk is considered to be an active constraint at x. 5. Experimental study and discussion 5.1. Parameter settings Among all the complex methods to calculate the penalty of constraints like iterative method, methods based on feasibility of solutions, simple penalty function method is used in all the constrained optimization problems implemented and discussed in this paper (Joines and Houck, 1994). The population size of grey wolves is taken 30 and 100 iterations are performed for the results of all the constrained benchmark functions. 30 Monte Carlo runs are executed on each of the constrained benchmark functions. For effective validation of the proposed CGWO algorithm in case of constrained benchmark functions, it has been compared with some other optimization algorithms which are GWO (Mirjalili et al., 2014), FA (Yang and algorithm, 2010), FPA (Yang, 2012) and PSO (Kennedy, 2011). Additionally, parameter settings for all these algorithms need to be done for impartial comparison which is one of the difficult task to perform during the execution. The parameter settings done in this work is like for PSO, global learning (no local neighborhoods), an inertial constant = 0.3, a cognitive constant = 1 and a social constant for swarm interaction = 1 is used. For FPA, λ=1.5 for Levy distribution function and proximity probability p = 0.8 is used. For FA, randomization parameter α = 0.6, attractiveness β0 = 1 and absorption coefficient γ=1.0 is used. For GWO, two random vectors r1→ and r2→ are taken in the range of (0, 1) and the controlling parameter a→ has linearly decreasing values from 2 to 0 over the course of iterations. For CGWO, values of two vectors r1→ and r2→ are taken randomly in the range of (0, 1) and the controlling parameter a→ has linearly decreasing values from 2 to 0 over the course of iterations, chaotic function variables a = 0.5 and b = 0.2 is used. Additionally, the best map performer for constrained optimization problems as per results, i.e., chebyshev map has been used on CGWO constraint handling mechanism. CGWO is implemented in C++ and compiled using Qt Creator 2.4.1 (MinGW) under Microsoft Windows 7 operating system. All simulations are carried out on a computer with an Intel(R) Core(TM) i5-3210@2.50 Ghz CPU. 5.2. Results and discussion In order to evaluate the capability of proposed CGWO for handling constrained problems, a set of thirteen widely used constrained benchmark functions have been used (Table 1. All the problems consist in the set have various linear, non-linear and quadratic equations in the form of equality and inequality constraints which are represented in Table 2. To choose the best possible map for all the constrained optimization problems, all the selected maps are applied on all the constrained benchmark functions whose results are provided in Table 3. According to the results, chebyshev map showed promising performance by outperforming seven out of thirteen constrained benchmark functions and thus is chosen for further investigation of CGWO on constrained optimization problems. Table 2 Details of constrained benchmark functions. Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Open in new tab Table 2 Details of constrained benchmark functions. Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Problem . Type . Objective function . Constraints . Bounds . Optima . No. of variables . G1 Min fx=5∑i=14xi-5∑i=14xi2-∑i=513xi g1x=2x1+2x2+x10+x11-10⩽0 ⁠, L=(0,0,…,0) ⁠, U=(1,1,1,1,1,1,1,1,1,100,100,100,1) −15 13 g2x=2x1+2x3+x10+x12-10⩽0 ⁠, g3x=2x2+2x3+x11+x12-10⩽0 ⁠, g4x=-8x1+x10⩽0 ⁠, g5x=-8x2+x11⩽0 ⁠, g6x=-8x3+x12⩽0 ⁠, g7x=-2x4-x5+x10⩽0 ⁠, g8x=-2x6-x7+x11⩽0 ⁠, g9x=-2x8-x9+x12⩽0 G2 Min fx=∑i=1ncos4xi-2∏i=1ncos2(xi)∑i=1nixi2 g1x=-∏i=1nxi+0.75⩽0 ⁠, L=0 ⁠, U=10 −0.803619 20 g2x=∑i=1nxi-7.5n⩽0 G3 Max fx=nn∏i=1nxi g1x=∑i=1nxi2-1 U = 1 −1 20 G4 Min fx=5.3578547x32+0.8356891x1x5+37.293239x1-40792.141 g1x=ux-92⩽0 ⁠, L=(78,33,27,27,27),U=(102,45,45,45,45) −30665.539 5 g2x=ux⩽0 ⁠, g3x=vx-110⩽0 ⁠, g4x=-vx+90⩽0 ⁠, g5x=wx-25⩽0 ⁠, g6x=-wx+20⩽0 where ux=85.334407+0.0056858x2x5+0.0006262x1x5+0.0022053x3x5 ⁠, vx=80.51249+0.0071317x2x5+0.0029955x1x2+0.002181x32 ⁠, wx=9.300961+0.0047026x3x5+0.0012547x1x3+0.0019085x3x4 G5 Min fx=3x1+10-6x13+2x2+23×10-6x23 g1x=x3-x4-0.55⩽0 ⁠, L=(0,0,-0.55,-0.55),U=(1200,1200,0.55,0.55) 5126.4981 4 g2x=x4-x3-0.55⩽0 ⁠, h1x=1000sin-x3-0.25+sin-x4-0.25+894.8-x1=0 ⁠, h2x=1000sinx3-0.25+sinx3-x4-0.25+894.8-x2=0 h3x=1000sinx4-0.25+sinx4-x3-0.25+1294.8=0 G6 Min fx=(x1-10)3+(x2-20)3 g1x=(x1-5)2+(x2-5)2+100⩽0 ⁠, L=(13,0) −6961.81388 2 g2x=(x1-5)2+(x2-5)2-82.81⩽0 U=(100,100) G7 Min fx=x12+x22+x1x2-14x1-16x2+x3-102+4x4-52+x5-32+2x6-12+5x72+7x8-112+2x9-102+x10-72+45 g1x=4x1+5x2-3x7+9x8-105⩽0 ⁠, L=(-10,…,-10),U=(10,…,10) 24.306209 10 g2x=10x1-8x2-17x7+2x8⩽0 ⁠, g315x=-8x1+2x2+5x9-2x10-12⩽0 ⁠, g4x=3x1-22+4x2-32+2x32-7x4-120⩽0 ⁠, g5x=5x12+8x2+(x3-6)2-2x4-40⩽0 ⁠. g6x=0.5(x1-8)2+2(x2-4)2+3x52-x6-30⩽0 ⁠, g7x=x12+2(x2-2)2-2x1x2+14x5-6x6⩽0 ⁠, g8x=-3x1+6x2+12x9-82-7x10⩽0 ⁠. G8 Max fx=sin3(2πx1)sin(2πx2)x13(x1+x2) g1x=x12-x2+1⩽0 ⁠, L=(0,0) −0.095825 2 g2x=1-x1+(x2-4)2⩽0 U=(10,10) G9 Min fx=(x1-10)2+5(x2-12)2+x34+3(x4-11)2+10x56+7x62+x74-4x6x7-10x6-8x7 g1x=2x12+3x24+x3+4x42+5x5-127⩽0 ⁠, 680.63005 7 g2x=7x1+3x2+10x32+x4-x5-282⩽0 ⁠, L=(-10,…,-10) g3x=23x1+x22+6x62-8x7-196⩽0 ⁠, U=(10,…,10) g4x=4x12+x22-3x1x2+2x32+5x6-11x7⩽0 G10 Min fx=x1+x2+x3 g1x=-1+0.0025(x4+x6)⩽0 ⁠, L=(100,1000,1000,10,10,10,10,10)U=(10,000,10,000,10,000,1000,1000,1000,1000,1000) 7049.3307 8 g2x=-1+0.0025(-x4+x5+x7)⩽0 ⁠, g3x=-1+0.01(-x5+x8)⩽0 ⁠, g4x=100x1-x1x6+833.33252x4-83333.333⩽0 ⁠, g5x=x2x4-x2x7-1250x4+1250x5⩽0 ⁠, g6x=x3x5-x3x8-2500x5+1,250,000⩽0 G11 Min fx=x12+(x2-1)2 h1x=x2-x12=0 L=(-1,-1)U=(1,1) 0.75 2 G12 Min fx=1-0.01[x1-52+x2-52+(x3-5)2] gi,j,kx=(x1-i)2+(x2-j)2+(x3-k)2-0.0625⩽i,j,k=1,2,…9 L=(0,0,0)U=(10,10,10) −1 3 G13 Min fx=ex1x2x3x4x5 h1x=x12+x22+x32+x42+x52-10=0 ⁠, L=-2.3U=2.3 0.0539498 5 h2x=x2x3-5x4x5=0 ⁠, h3x=x13+x23+1=0 Open in new tab Table 3 Results of 10 chaotic maps on all constrained benchmark functions on CGWO. Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Open in new tab Table 3 Results of 10 chaotic maps on all constrained benchmark functions on CGWO. Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Problem . Bernoulli . Logistic . Chebyshev . Circle . Cubic . Icmic . Peicewise . Singer . Sinusoidal . Tent . G1 −13.1952 −12.9301 −14.8008 −10.1684 −15.9102 −14.4705 −12.1389 −13.2742 −14.2749 −14.3466 G2 −0.42058 −0.567739 −0.79434 −0.24189 −0.51969 −0.475268 −0.460454 −0.33466 −0.38927 −0.46595 G3 −0.83865 −0.508866 −0.9681 −0.18356 −0.57743 −0.114344 −0.78943 −0.89999 −0.89258 −0.79427 G4 −33250.4 −32906.7 −32675.2 −32212.6 −31462.7 −33479.2 −32375.1 −30902.2 −31044.9 −31691.2 G5 53772.1 40974.8 54914.1 197252 39197.2 23919.7 58004.5 65223.1 45457.1 26868.2 G6 −6289.29 −6349.81 −6493.18 −6301.84 −6582.48 −6447.12 −6349.86 −6349.35 −6229.18 −6379.28 G7 130.1639 629.2649 60.2278 60.3222 36.1793 38.367 689.2759 228.2740 649.2649 928.1649 G8 −0.04561 −0.075206 −503.093 −0.05272 −0.07348 −0.09212 −0.09485 −0.06385 −0.02462 −0.09566 G9 602.173 612.460 676.670 612.370 628.40 665.643 607.135 614.274 629.153 657.12 G10 6994.23 6045.14 7046.13 6027.24 7034.43 7060.12 7024.24 7029.26 7013.17 7010.43 G11 0.6260 0.6250 0.6610 0.6400 0.6390 0.6680 0.6420 0.6250 0.6280 0.6390 G12 −73.2839 −70.1368 −48.2570 −82.3629 −193.363 −163.368 −273.368 −468.478 −2698.378 −2738.36 G13 0.4832 0.923643 0.5759 0.8933 0.45489 0.8935 1.3638 0.93538 1.353829 0.4678 Open in new tab The results of all the constrained benchmark functions applied on CGWO and other algorithms are given in Table 4. It can be easily seen from the results that CGWO has handle seven out of thirteen constrained benchmark functions very efficiently and thus has outperformed all other algorithms for these seven benchmark functions. PSO handled three constrained functions well among all. GWO performed superior to other for two constrained functions. FA outperformed for only one constrained function. The reason was that the chaotic maps with a unimode centered around the middle of maps, tend to produce better results, and chebyshev map fall into this category and it is indeed very effective. The results have also revealed the significant improvement of the proposed CGWO algorithm with the application of deterministic chaotic signals in place of constant value. Table 4 Comparison results of all constrained benchmark functions. Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Open in new tab Table 4 Comparison results of all constrained benchmark functions. Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Problem . CGWO . GWO . FPA . FA . PSO . G1 −14.8008 −14.3159 −12.4265 −67.6314 −14.0273 G2 −0.79434 −0.31375 −0.30612 −0.517728 −0.65436 G3 −0.9681 −0.83910 −0.82839 −1.99369 −0.78568 G4 −32675.2 −33141.1 −33350.1 −30446.7 −32212.1 G5 54914.1 43924.93 58282.2 97119.4 79388.5 G6 −6493.18 −6265.65 −6346.38 −6349.86 −6248.57 G7 60.2278 42.1324 39.0470 27.6540 24.1480 G8 −503.093 −672.078 −10.7468 −11.0266 −0.03440 G9 676.670 603.816 813.734 680.438 680.617 G10 7046.13 6653.97 2821.31 6091.50 4691.59 G11 0.66100 0.693021 0.62507 0.62500 0.62508 G12 −48.2570 −47.3590 −72.9248 −53.2563 −1378.90 G13 0.57590 1.09478 0.67968 0.856731 0.82005 Open in new tab 5.3. Graphical analysis For further effective evaluation of performance of all the algorithms, graphical analysis has also been done. The line graphs of convergence of various constrained benchmark functions using CGWO algorithm and other algorithms viz. GWO, FA, FPA and PSO have been shown from Figs. 3–6 which help to analyze the convergence rate of each algorithm more effectively. All these graphs have been taken on 100 iterations to clearly notice and analyze the convergence of all the algorithms. Fig. 3. Open in new tabDownload slide Comparison of five optimization algorithms for the G1 constrained benchmark function in 100 iterations. Fig. 4. Open in new tabDownload slide Comparison of five optimization algorithms for the G2 constrained benchmark function in 100 iterations. Fig. 5. Open in new tabDownload slide Comparison of five optimization algorithms for the G9 constrained benchmark function in 100 iterations. Fig. 6. Open in new tabDownload slide Comparison of five optimization algorithms for the G13 constrained benchmark function in 100 iterations. Fig. 3 shows the line graphs of convergence of all the five optimization algorithms applied on G1 test constrained benchmark function. From the graph, it can be seen that CGWO has the best performance for this benchmark function. It is showing superior performance of CGWO by reaching the optima for this test function within 10 iterations only. Further, it can be concluded from this graph that the GWO and FPA performed well when compared with other algorithms. FA demonstrates poor convergence in most of the optimization process, however it eventually ends the value of PSO. Fig. 4 demonstrates the line graphs of convergence of CGWO along with all other algorithms for G2 constrained benchmark function in which it is easily remarkable that CGWO is fastest of all in context of convergence towards the optima than that of FA, FPA, GWO and PSO. PSO has shown a competitive performance to CGWO for this problem and exhibits significant performance in terms of convergence speed. GWO, FA and FPA show a faster convergence rate initially, however they seem to be trapped into sub-optimal values as the optimization procedure proceeds. Fig. 5 illustrates the convergence rate on G9 test constrained benchmark function in which CGWO is demonstrating the high rate of convergence as compared to FPA, FA, PSO and GWO. The convergence line graph of FPA and PSO is showing slow convergence by giving constant fitness values for many iterations in between the 100 iterations as they seem to be trapped into sub-optimal values as the procedure proceeds, especially FPA. This demonstrates how CGWO is capable of balancing exploration and exploitation to find the global optimum rapidly and effectively. Fig. 6 is presenting the graphical view of convergence of all algorithms on G13 test constrained benchmark function in which it can be clearly seen that CGWO algorithm is nearest to the global optima of this constrained problem among algorithms viz. FPA, FA, GWO and PSO and it also shows fastest convergence of all. GWO and FPA illustrate poorer convergence than the other algorithms in the initial iterations. However, the search process is progressively accelerated during iterations for these algorithms. This indicates that the performance of CGWO can be boosted by the chaotic maps in terms of not only exploration but also exploitation. 5.4. Statistical testing Statistical testing is a process of making quantitative decisions about a problem in which statistical data set is evaluated and taken which is then compared hypothetically (Wilcoxon et al., 1970). The statistical testing of the constrained benchmark functions applied on all algorithms involved in this paper has been done using a widely used non parametric test named Wilcoxon signed rank-test discussed in Section 5.4.1. 5.4.1. Wilcoxon signed rank-test Wilcoxon signed rank-test is a statistical method which is solely based on the order of the sample's observations (Table 5 and their rank summary is provided in Table 6. Results depict that CGWO possessed lowest rank among all other optimization algorithms used in comparison for most of the benchmark functions which proves the superior performance of CGWO among other in comparison. However, PSO and GWO competed with CGWO closely and ranked second and third respectively. The superior performance of CGWO doesn't mean that it is superior than all other optimization algorithms present in the literature which will also lead to the violation of ‘free lunch theorem’ (Ho and Pepyne, 2002). Its performance simply signifies that it is better than other algorithms taken in this work only. Table 5 Pair-wise Wilcoxon signed rank test results. Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Open in new tab Table 5 Pair-wise Wilcoxon signed rank test results. Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Function . Wilcoxon signed rank test order . G1 CGWO < GWO < PSO < FPA < FA G2 CGWO < PSO < FA < GWO < FPA G3 CGWO < GWO < FPA < PSO < FA G4 FA < PSO < CGWO < GWO < FPA G5 GWO < CGWO < FPA < PSO < FA G6 CGWO < FA < FPA < GWO < PSO G7 PSO < FA < FPA < GWO < CGWO G8 PSO < FPA < FA < CGWO < GWO G9 PSO < FA < CGWO < GWO < FPA G10 CGWO < GWO < FA < PSO < FPA G11 GWO < CGWO < PSO < FPA < FA G12 CGWO < GWO < FA < FPA < PSO G13 CGWO < FPA < PSO < FA < GWO Open in new tab Table 6 Rank summary of statistical assessment results. Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Open in new tab Table 6 Rank summary of statistical assessment results. Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Function . CGWO . GWO . FPA . FA . PSO . G1 1 2 4 5 3 G2 1 4 5 3 2 G3 1 2 3 5 4 G4 3 4 5 1 2 G5 2 1 3 5 4 G6 1 4 3 2 5 G7 5 4 3 2 1 G8 4 5 2 3 1 G9 3 4 5 2 1 G10 1 2 5 3 4 G11 2 1 4 5 3 G12 1 2 4 3 5 G13 1 5 2 4 3 Total 26 40 48 43 38 Open in new tab 6. CGWO for classical engineering design problems Engineering design is a process of satisfying the needs involved in building a product. It is a decision making process which consists of a complex objective function and a large number of decision variables such as weight, strength and wear (Askarzadeh, 2016). Meta-heuristic methods come into being as an alternative to the traditional optimization methods. With their merits of finding acceptable solutions in an affordable time and being tolerant of non-convex and non-differentiable, meta-heuristic algorithms have attracted great research interest during the recent years. In real design problems the number of design variables can be very large, and their influence on the objective function to be optimized can be very complicated, with a nonlinear character. Therefore, in this paper, design considerations of five classical engineering design problems viz. spring design problem, gear train design problem, welded beam design problem, pressure vessel design problem and closed coil helical spring design problem have been done in Sections 6.1, 6.2, 6.3, 6.4, 6.5. These problems contain various local optima, whereas only global optimum is required. These problems cannot be handled by traditional methods which focus on local optima only. Hence, there is a need for effective and efficient optimization methods for these engineering design problems. In this section, various experiments on these benchmark problems are implemented to verify the performance of the proposed meta-heuristic CGWO method. In order to get an unbiased comparison of CPU times, all the experiments are performed over 30 independent runs for 500 iterations. 6.1. Tension/Compression spring design problem The main goal of this engineering design problem is to minimize the weight of the spring involving three decision variables which are wire diameter (d) ⁠, mean coil diameter (D) and number of active coils (N) ⁠. This problem is subjected to three inequality constraints and an objective function given in Eq. (8). Considerx→=[x1x2x3]=[dDN],Minimizef(x→)=(x3+2)x2x12,Subject tog1(x→)=1-x23x3717,854x14⩽0,g2(x→)=4x22-x1x212,566(x2x13-x14)+15108x12⩽0,g3(x→)=1-140.45x1x22x3⩽0,g4(x→)=x1+x21.5-1⩽0,(8) Variable range0.05⩽x1⩽2.00,0.25⩽x2⩽1.30,2.00⩽x3⩽15.00 Fig. 7. Fig. 7. Open in new tabDownload slide Structure of tension/spring design (Rao et al., 2011). 6.2. Gear train design problem The goal of this engineering design problem is to minimize the cost of gear ratio of the gear train whose schematic diagram is shown in Fig. 8. This problem has no equality or inequality constraints except a boundary constraint. It consists of four decision variables represented as nA(x1) ⁠, nB(x2) ⁠, nD(x3) ⁠, nF(x4) using which the gear ratio can be formulated as nBnD/nFnA ⁠. Mathematical formulation of the objective function of gear train design problem along with its boundary constraint is given in Eq. (9). Min.f(x)=((1/6.931)-(x3x2/x1x4))2S.t.12⩽xi⩽60(9) Fig. 8. Open in new tabDownload slide Structure of gear train design (Rao et al., 2011). Table 9 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 10 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm with conventional GWO algorithm and with those found by other optimization algorithms. In terms of optimal results, CGWO outperforms GWO (Mirjalili et al., 2014), CSA (Askarzadeh, 2016), UPSO (Parsopoulos and Vrahatis, 2005), ABC (Akay and Karaboga, 2012) and MBA (Sadollah et al., 2013). In terms of mean, CGWO gives better value than those obtained by all other algorithms in comparison. Table 7 Optimal solution of spring design problem by CGWO algorithm. Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Open in new tab Table 7 Optimal solution of spring design problem by CGWO algorithm. Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Tension/Compression spring design problem . Parameter x1 x2 x3 f(x) Value 0.052796 0.804380 2.000000 0.0119598 Open in new tab Table 8 Comparison results of spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 N.A. – Not Available. Open in new tab Table 8 Comparison results of spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 Algorithm . Worst . Mean . Best . Std. . CGWO 0.0121791 0.0121749 0.0119598 1.039E−05 GWO 0.0122515 0.0121836 0.0126660 1.085E−05 CSA 0.0126701 0.0127690 0.0126652 1.357E−06 GA3 0.0128220 0.0127690 0.0127048 3.940E−05 GA4 0.0129730 0.0127420 0.0126810 5.90E−05 CPSO 0.0129240 0.0127330 0.0126747 5.20E−04 HPSO 0.0127190 0.0127072 0.0126652 1.58E−05 G-QPSO 0.0177590 0.0135240 0.0126650 0.001268 QPSO 0.0181270 0.0138540 0.0126690 0.001341 PSO 0.0718020 0.0195550 0.0128570 0.011662 DSS-MDE 0.0127382 0.0126693 0.0126652 1.25E−05 PSO-DE 0.0126653 0.0126652 0.0126652 1.2E−08 SC 0.0167172 0.0129226 0.0126692 5.9E−04 UPSO N.A. 0.0229400 0.0131200 7.2E−03 (μ+λ)-ES N.A. 0.0131650 0.0126890 3.9E−04 ABC N.A. 0.0127090 0.0126650 0.01281 TLBO N.A. 0.0126657 0.0126650 N.A. MBA 0.0129000 0.0127130 0.0126650 6.3E−05 N.A. – Not Available. Open in new tab Table 9 Optimal solution of gear train design problem by CGWO algorithm. Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Open in new tab Table 9 Optimal solution of gear train design problem by CGWO algorithm. Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Gear train design problem . Parameter x1 x2 x3 x4 f(x) Value 45.1903 21.2025 14.6466 50.2213 2.833970E−13 Open in new tab Table 10 Comparison results of gear train design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 N.A. – Not Available. Open in new tab Table 10 Comparison results of gear train design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 Algorithm . Worst . Mean . Best . Std. . CGWO 2.71358E−10 7.09107E−11 2.833970E−13 1.02462E−10 GWO 5.03136E−09 1.62918E−09 1.568642E−11 1.76011E−09 CSA 3.18473E−08 2.05932E−09 2.700857E−12 5.059779E−9 UPSO N.A. 3.80562E−08 2.700857E−12 1.09000E−09 ABC N.A. 3.64133E−10 2.700857E−12 5.52000E−09 MBA 2.06290E−08 2.47163E−09 2.700857E−12 3.94000E−09 N.A. – Not Available. Open in new tab 6.3. Welded beam design problem Welded beam design problem which is a minimization problem has four variables namely weld thickness (h) ⁠, length of bar attached to the weld (l) ⁠, bar's height (t) ⁠, bar's thickness (b) as shown in Fig. 9. The constraints included in this problem are bending stress (⁠ θ ⁠), beam deflection (⁠ δ ⁠), shear stress (⁠ τ ⁠), buckling load (⁠ Pc ⁠) and other side constraints. The mathematical formulas related to this problem are represented in Eq. (10). Considerx→=[x1x2x3x4]=[hltb],Minimizef(x→)=1.10471x12x2+0.04811x3x4(14.0+x2),Subject tog1(x→)=τ(x→)-τmax⩽0,g2(x→)=σ(x→)-σmax⩽0,g3(x→)=δ(x→)-δmax⩽0,g4(x→)=x1-x4⩽0,g5(x→)=P-Pc(x→)⩽0,g6(x→)=0.125-x1⩽0,g7(x→)=0.10471x12+0.04811x3x4(14.0+x2)-5.0⩽0(10) Variable range0.1⩽x1⩽2,0.1⩽x2⩽10,0.1⩽x3⩽10,0.1⩽x4⩽2 where τ(x→)=τ′2+2τ′τ″x22R+τ2″ τ′=P2x1x2 τ″=MRJ M=P(L+x22) R=x224+x1+x322 J=22x1x2x224x1+x322 σ(x→)=6PLx4x32 δ(x→)=4PL3Ex32x+x4 Pc(x→)=4.013Ex32x4636L21-x32LE4G P=6000lb,L=14in.,δmax=0.25in.,E=30×106psi, G=12×106psi,τmax=13,600psi,σmax=30,000psi Fig. 9. Open in new tabDownload slide Structure of welded beam design (Rao et al., 2011). Table 11 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 12 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, conventional GWO algorithm and with those found by other optimization algorithms. In terms of best result, CGWO outperforms GWO (Mirjalili et al., 2014), GA3 (Coello, 2000), GA4 (Coello and Montes, 2002), CPSO (He and Wang, 2007), SC (Ray and Liew, 2003), UPSO (Parsopoulos and Vrahatis, 2005) and CDE (He and Wang, 2007). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 11 Optimal solution of welded beam design problem by CGWO algorithm. Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Open in new tab Table 11 Optimal solution of welded beam design problem by CGWO algorithm. Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Welded beam design problem . Parameter x1 x2 x3 x4 f(x) Value 0.343891 1.883570 9.03133 0.212121 1.72545 Open in new tab Table 12 Comparison results of welded beam design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 N.A. – Not Available. Open in new tab Table 12 Comparison results of welded beam design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 Algorithm . Worst . Mean . Best . Std. . CGWO 2.435700 2.428900 1.725450 1.35780 CPSO 1.782143 1.748831 1.728024 1.29E−2 GA4 1.993408 1.792654 1.728226 7.47E−2 GA3 1.785835 1.771973 1.748309 1.12E−2 CDE N.A. 1.768150 1.733460 N.A. UPSO N.A. 2.837210 1.921990 0.68300 GWO 2.913600 2.859400 1.942100 2.69080 SC 6.399678 3.002588 2.385434 9.60E−1 N.A. – Not Available. Open in new tab 6.4. Pressure vessel design problem Pressure vessel design problem is a classical engineering design problem whose main goal is to minimize the welding, manufacturing and material cost of the pressure vessel. There are a total of four decision variables involved in this problem which are thickness of shell (Ts) ⁠, thickness of head (Th) which are discrete decision variables, inner radius (R) and length of cylindrical section of the vessel (L) which are continuous decision variables. The diagrammatical representation of pressure vessel design problem is given in Fig. 10 showing variables of pressure vessel. The mathematical equations of the nonlinear objective function and constraints is represented in Eq. (11). The mentioned problem has four inequality constraints. Considerx→=[x1x2x3x4]=[TsThRL],Minimizef(x→)=0.6224x1x3x4+1.7781x2x32+3.1661x12x4+19.84x12x3,Subject tog1(x→)=-x1+0.0193x3⩽0,g2(x→)=-x2+0.00954x3⩽0,g3(x→)=-πx32x4-43πx33+1,296,000⩽0,g4(x→)=x4-240⩽0Variable range0⩽x1⩽100,0⩽x2⩽100,10⩽x3⩽200,10⩽x4⩽200(11) Fig. 10. Open in new tabDownload slide Structure of pressure vessel design problem (Rao et al., 2011). Table 13 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 14 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, conventional GWO algorithm and with those found by other optimization algorithms. It can be said from results that CGWO outperforms GWO (Mirjalili et al., 2014), (μ+λ)-ES (Mezura-Montes and Coello, 2005), CSA (Askarzadeh, 2016),HPSO (He and Wang, 2007), PSO-DE (Liu et al., 2010), ABC (Akay and Karaboga, 2012), TLBO (Rao et al., 2011), G-QPSO (dos Santos Coelho, 2010), QPSO (dos Santos Coelho, 2010), CDE (He and Wang, 2007), GA4 (Coello and Montes, 2002), CPSO (He and Wang, 2007), UPSO (Parsopoulos and Vrahatis, 2005), GA3 (Coello, 2000) and PSO (dos Santos Coelho, 2010). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 13 Optimal solution of pressure vessel design problem by CGWO algorithm. Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Open in new tab Table 13 Optimal solution of pressure vessel design problem by CGWO algorithm. Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Pressure vessel design problem . Parameter x1 x2 x3 x4 f(x) Value 1.187150 0.600000 69.707500 7.7984400 5034.1800 Open in new tab Table 14 Comparison results of pressure vessel design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 N.A. – Not Available. Open in new tab Table 14 Comparison results of pressure vessel design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 Algorithm . Worst . Mean . Best . Std. . CGWO 6188.110 5783.582 5034.180 254.505 GWO 6395.360 6159.320 6051.563 379.674 (⁠ μ+λ ⁠) − ES N.A. 6379.938 6059.701 210.000 CSA 7332.841 6342.499 6059.714 384.945 HPSO 6288.677 6099.932 6059.714 86.200 PSO-DE N.A. 6059.714 6059.714 N.A. ABC N.A. 6245.308 6059.714 205.000 TLBO N.A. 6059.714 6059.714 N.A. G-QPSO 7544.492 6440.378 6059.720 448.471 QPSO 8017.281 6440.378 6059.720 479.267 CDE 6371.045 6085.230 6059.734 43.0130 GA4 6469.322 6177.253 6059.946 130.929 CPSO 6363.804 6147.133 6061.077 86.4500 UPSO 9387.770 8016.370 6154.700 745.869 GA3 6308.497 6293.843 6288.744 7.41330 PSO 14076.324 8756.680 6693.721 1492.56 N.A. – Not Available. Open in new tab 6.5. Closed coil helical spring design problem The main goal of this mechanical engineering design constrained problem is to minimize the volume of closed coil helical spring. Helical spring is made up of closed coiled wire having the shape of a helix and is intended for tensile and compressive load (Fig. 11, it can be seen that the coils of spring are so closed that the plane is at nearly right angles to the helix's axis and coil is subjected to torsion. This problem has a total of two decision variables namely coil diameter (D) and wire diameter (d) whose range is given in Eq. (13). The volume of the helical spring (U) can be minimized using the objective function given in Eq. (12). U=π24(Nc+2)Dd2(12) where 0.508⩽d⩽1.016,1.270⩽D⩽7.620,15⩽Nc⩽25(13) Fig. 11. Open in new tabDownload slide Structure of closed coil helical spring design problem (Savsani et al., 2010). The constrained problem of helical spring is subjected to eight constraints out of which first is stress constraint represented in Eq. (14) and second is configuration constraint given in Eq. (15). S-8CfFmaxDπd3⩾0,Cf=4C-14C-4+0.615C,C=Dd(14) Here, Fmax which is the maximum load and S, the shear stress allowed on the spring are set to 453.6 kg and 13288.02kgf/cm2 respectively. K=Gd48NcD3(15) where G is set to 808543.6kgf/cm2 and K is the spring constant. Next constraint is the length constraint expressed as given in Eq. (18) in which the maximum length lmax is equal to 35.56 cm. lf is the free length which can be calculated using Eq. (17). The deflection (δl) made in the spring due to maximum work load is also involved in calculating free length as given in Eq. (16). δl=FmaxK(16) lf=δl+1.05(Nc+2)d(17) lmax-lf⩾0(18) The wire diameter should also follow the constraint represented in Eq. (19) where dmin is set to 0.508 cm. d-dmin⩾0(19) The outer diameter of the coil (D) spring should also be less the maximum diameter specified (Dmax) which is 7.62 cm. Mathematically, it is expressed in Eq. (20). Dmax-(D+d)⩾0(20) The mean coil diameter (C) must also be at least three times the diameter of the wire as represented mathematically in Eq. (21). C-3⩾0(21) Next, the deflection occurs under the preload δp must also be less than its specified value δpm which is 15.24 cm as represented in Eq. (22). The preload deflection can be calculated using Eq. (23). δpm-δp⩾0(22) δp=FpK(23) Here, Fp is set to 136.08 kg. The combined deflection constraint is given in Eq. (24) which makes the deflection of the coil consistent with its length. lf-δp=Fmax-FpK-1.05(Nc+2)d⩾0(24) The next and the last constraint is subjected to the preload deflection of the spring which defines that it must be equal to its specified value (δω) which is equal to 3.175 cm. It is expressed in Eq. (25). Fmax-FpK-δω⩽0(25) Table 15 is showing the most optimal solution and the optimal values of decision variables found by CGWO algorithm. Table 16 is showing the comparison of all the simulation results for this problem applied on CGWO algorithm, GWO algorithm and with those found by other optimization algorithms. It can be said from results that CGWO outperforms GWO (Mirjalili et al., 2014), DTLBO (Thamaraikannan and Thirunavukkarasu, 2014), TLBO (Rao et al., 2011), Conventional method (Hinze, 2005), PSO (He et al., 2004), ABS (Thamaraikannan and Thirunavukkarasu, 2014) and GA (Das and Pratihar, 2002). Also, the mean obtained by CGWO for this problem is better than those obtained by all other algorithms in comparison. Table 15 Optimal solution of closed coil helical spring design problem by CGWO algorithm. Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Open in new tab Table 15 Optimal solution of closed coil helical spring design problem by CGWO algorithm. Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Closed coil helical spring design problem . Parameter d D f(x) Value 0.599394 1.92367 42.0990 Open in new tab Table 16 Comparison results of closed coil helical spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. N.A. – Not Available. Open in new tab Table 16 Comparison results of closed coil helical spring design problem. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. Algorithm . Worst . Mean . Best . Std. . CGWO 42.9625 41.9815 42.0990 2.7502 GWO 44.5842 43.6468 43.6524 1.7684 DTLBO 46.4322 46.3192 46.3012 N.A. TLBO 46.5214 46.4998 46.3221 N.A. Conventional N.A. N.A. 46.4392 N.A. PSO 46.6752 46.6254 46.5212 N.A. ABS 46.6241 46.6033 46.5115 N.A. GA 46.3932 46.6821 46.6653 N.A. N.A. – Not Available. Open in new tab 7. Conclusion and future scope The chaos theory and Grey Wolf Optimizer (GWO) are hybridized in order to design an improved meta-heuristic Chaotic Grey Wolf Optimization (CGWO) algorithm for constrained optimization problems. Various chaotic maps are used to regulate the key parameter, a, of GWO. The proposed CGWO is validated on thirteen constrained benchmark functions and five constrained engineering design problems. The chebyshev map is selected as its a through comparing various chaotic GWO variants to form the best CGWO. The simulations showed that the usage of deterministic chaotic signals instead of linearly decreasing values is an important modification of the GWO algorithm. Statistical results and success rates of the CGWO suggest that the tuned GWO clearly improves the reliability of the global optimality and they also enhanced the quality of the results. In comparison with other algorithms viz. FA, FPA, GWO and PSO, it seems the CGWO performed significantly well. The results of CGWO on constrained engineering problems showed its applicability for the complex real-world problems. The main reason of the superior performance of CGWO lies behind the chaos induced by the chaotic maps in the search space. This chaos helps the controlling parameter to find the optimal solution more quickly and thus refine the convergence rate of the algorithm. So, it can be easily concluded here that proposed CGWO can handle constrained problems effectively and efficiently. Further investigation on convergence analysis may prove fruitful. In addition, further topics of studies can also focus on the extension of the CGWO to solve mixed-type problems and discrete optimization problems. References Akay , B. , & Karaboga , D. ( 2012 ). Artificial bee colony algorithm for large-scale problems and engineering design optimization . Journal of Intelligent Manufacturing , 23 ( 4 ), 1001 – 1014 . Google Scholar Crossref Search ADS WorldCat Alavi , A. H. , & Gandomi , A. H. ( 2011 ). Prediction of principal ground-motion parameters using a hybrid method coupling artificial neural networks and simulated annealing . Computers & Structures , 89 ( 23 ), 2176 – 2194 . Google Scholar Crossref Search ADS WorldCat Alavi , A. H. , Gandomi , A. H., Sahab , M. G., & Gandomi , M. ( 2010 ). Multi expression programming: A new approach to formulation of soil classification . Engineering with Computers , 26 ( 2 ), 111 – 118 . Google Scholar Crossref Search ADS WorldCat Arora , S. , & Singh , S. ( 2013 ). A conceptual comparison of firefly algorithm, bat algorithm and cuckoo search. In International conference on control computing communication & materials (ICCCCM), 2013 (pp. 1 – 4 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Arora , S. , & Singh , S. ( 2015 ). Butterfly algorithm with levy flights for global optimization. In 2015 International conference on signal processing, computing and control (ISPCC) (pp. 220 – 224 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Arora , S. , & Singh , S. ( 2017 ). An improved butterfly optimization algorithm with chaos . Journal of Intelligent & Fuzzy Systems , 32 ( 1 ), 1079 – 1088 . Google Scholar Crossref Search ADS WorldCat Askarzadeh , A. ( 2016 ). A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm . Computers & Structures , 169 , 1 – 12 . Google Scholar Crossref Search ADS WorldCat Cagnina , L. C. , Esquivel , S. C., & Coello , C. A. C. ( 2008 ). Solving engineering optimization problems with the simple constrained particle swarm optimizer . Informatica , 32 ( 3 ). Coello , C. A. C. ( 2000 ). Use of a self-adaptive penalty approach for engineering optimization problems . Computers in Industry , 41 ( 2 ), 113 – 127 . Google Scholar Crossref Search ADS WorldCat Coello , C. A. C. , & Montes , E. M. ( 2002 ). Constraint-handling in genetic algorithms through the use of dominance-based tournament selection . Advanced Engineering Informatics , 16 ( 3 ), 193 – 203 . Google Scholar Crossref Search ADS WorldCat Das , A. , & Pratihar , D. ( 2002 ). Optimal design of machine elements using a genetic algorithm . Journal of the Institution of Engineers (India), Part MC, Mechanical Engineering Division , 83 ( 3 ), 97 – 104 . Google Scholar OpenURL Placeholder Text WorldCat Deb , K. ( 2000 ). An efficient constraint handling method for genetic algorithms . Computer Methods in Applied Mechanics and Engineering , 186 ( 2 ), 311 – 338 . Google Scholar Crossref Search ADS WorldCat dos Santos Coelho , L. ( 2010 ). Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems . Expert Systems with Applications , 37 ( 2 ), 1676 – 1683 . Google Scholar Crossref Search ADS WorldCat Emary , E. , Zawbaa , H. M., & Hassanien , A. E. ( 2016 ). Binary grey wolf optimization approaches for feature selection . Neurocomputing , 172 , 371 – 381 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , & Yang , X. -S. ( 2014 ). Chaotic bat algorithm . Journal of Computational Science , 5 ( 2 ), 224 – 232 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. , Yang , X. -S., Talatahari , S., & Alavi , A. ( 2013 ). Firefly algorithm with chaos . Communications in Nonlinear Science and Numerical Simulation , 18 ( 1 ), 89 – 98 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , Yun , G. J., Yang , X. -S., & Talatahari , S. ( 2013 ). Chaos-enhanced accelerated particle swarm optimization . Communications in Nonlinear Science and Numerical Simulation , 18 ( 2 ), 327 – 340 . Google Scholar Crossref Search ADS WorldCat Gandomi , A. H. , Yang , X. -S., Alavi , A. H., & Talatahari , S. ( 2013 ). Bat algorithm for constrained optimization tasks . Neural Computing and Applications , 22 ( 6 ), 1239 – 1255 . Google Scholar Crossref Search ADS WorldCat Gao , X. -Z. , Jokinen , T., Wang , X., Ovaska , S. J., & Arkkio , A. ( 2010 ). A new harmony search method in optimal wind generator design. In 2010 XIX international conference on electrical machines (ICEM) (pp. 1 – 6 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Gao , X. -Z. , Wang , X., Ovaska , S. J., & Xu , H. ( 2010 ). A modified harmony search method in constrained optimization . International Journal of Innovative Computing, Information and Control , 6 ( 9 ), 4235 – 4247 . Google Scholar OpenURL Placeholder Text WorldCat Han , X. , & Chang , X. ( 2012 ). A chaotic digital secure communication based on a modified gravitational search algorithm filter . Information Sciences , 208 , 14 – 27 . Google Scholar Crossref Search ADS WorldCat Han , X. , & Chang , X. ( 2013 ). An intelligent noise reduction method for chaotic signals based on genetic algorithms and lifting wavelet transforms . Information Sciences , 218 , 103 – 118 . Google Scholar Crossref Search ADS WorldCat He , Q. , & Wang , L. ( 2007 ). An effective co-evolutionary particle swarm optimization for constrained engineering design problems . Engineering Applications of Artificial Intelligence , 20 ( 1 ), 89 – 99 . Google Scholar Crossref Search ADS WorldCat He , Q. , & Wang , L. ( 2007 ). A hybrid particle swarm optimization with a feasibility-based rule for constrained optimization . Applied Mathematics and Computation , 186 ( 2 ), 1407 – 1422 . Google Scholar Crossref Search ADS WorldCat He , D. , He , C., Jiang , L. -G., Zhu , H. -W., & Hu , G. -R. ( 2001 ). Chaotic characteristics of a one-dimensional iterative map with infinite collapses . IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications , 48 ( 7 ), 900 – 906 . Google Scholar Crossref Search ADS WorldCat He , S. , Prempain , E., & Wu , Q. ( 2004 ). An improved particle swarm optimizer for mechanical design optimization problems . Engineering Optimization , 36 ( 5 ), 585 – 605 . Google Scholar Crossref Search ADS WorldCat Herskovits , J. ( 1986 ). A two-stage feasible directions algorithm for nonlinear constrained optimization . Mathematical Programming , 36 ( 1 ), 19 – 38 . Google Scholar Crossref Search ADS WorldCat Hinze , M. ( 2005 ). A variational discretization concept in control constrained optimization: The linear-quadratic case . Computational Optimization and Applications , 30 ( 1 ), 45 – 61 . Google Scholar Crossref Search ADS WorldCat Ho , Y. -C. , & Pepyne , D. L. ( 2002 ). Simple explanation of the no free lunch theorem of optimization . Cybernetics and Systems Analysis , 38 ( 2 ), 292 – 298 . Google Scholar Crossref Search ADS WorldCat Homaifar , A. , Qi , C. X., & Lai , S. H. ( 1994 ). Constrained optimization via genetic algorithms . Simulation , 62 ( 4 ), 242 – 253 . Google Scholar Crossref Search ADS WorldCat Jia , D. , Zheng , G., & Khan , M. K. ( 2011 ). An effective memetic differential evolution algorithm based on chaotic local search . Information Sciences , 181 ( 15 ), 3175 – 3187 . Google Scholar Crossref Search ADS WorldCat Joines , J. A. , & Houck , C. R. ( 1994 ). On the use of non-stationary penalty functions to solve nonlinear constrained optimization problems with GA's. In Proceedings of the first IEEE conference on evolutionary computation, 1994. IEEE world congress on computational intelligence . (pp. 579 – 584 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Karaboga , D. , & Akay , B. ( 2011 ). A modified artificial bee colony (abc) algorithm for constrained optimization problems . Applied Soft Computing , 11 ( 3 ), 3021 – 3031 . Google Scholar Crossref Search ADS WorldCat Karaboga , D. , & Basturk , B. ( 2007 ). Artificial bee colony (abc) optimization algorithm for solving constrained optimization problems. In Foundations of fuzzy logic and soft computing (pp. 789 – 798 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Kennedy , J. ( 2011 ). Particle swarm optimization. In Encyclopedia of machine learning (pp. 760 – 766 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Komaki , G. M. , & Kayvanfar , V. ( 2015 ). Grey wolf optimizer algorithm for the two-stage assembly flow shop scheduling problem with release time . Journal of Computational Science , 8 , 109 – 120 . Google Scholar Crossref Search ADS WorldCat Lee , K. S. , & Geem , Z. W. ( 2004 ). A new structural optimization method based on the harmony search algorithm . Computers & Structures , 82 ( 9 ), 781 – 798 . Google Scholar Crossref Search ADS WorldCat Liu , H. , Cai , Z., & Wang , Y. ( 2010 ). Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization . Applied Soft Computing , 10 ( 2 ), 629 – 640 . Google Scholar Crossref Search ADS WorldCat Luo , Q. , Ma , M., & Zhou , Y. ( 2016 ). A novel animal migration algorithm for global numerical optimization . Computer Science and Information Systems , 13 ( 1 ), 259 – 285 . Google Scholar Crossref Search ADS WorldCat Madadi , A. , & Motlagh , M. M. ( 2014 ). Optimal control of dc motor using grey wolf optimizer algorithm . TJEAS Journal , 4 ( 4 ), 373 – 379 . Google Scholar OpenURL Placeholder Text WorldCat Meng , X. -B. , Gao , X., Lu , L., Liu , Y., & Zhang , H. ( 2015 ). A new bio-inspired optimisation algorithm: Bird swarm algorithm . Journal of Experimental & Theoretical Artificial Intelligence , 1 – 15 . Meng , Z. , Pan , J. -S., & Alelaiwi , A. ( 2016 ). A new meta-heuristic ebb-tide-fish-inspired algorithm for traffic navigation . Telecommunication Systems , 62 ( 2 ), 403 – 415 . Google Scholar Crossref Search ADS WorldCat Mezura-Montes , E. , & Coello , C. A. C. ( 2005 ). A simple multimembered evolution strategy to solve constrained optimization problems . IEEE Transactions on Evolutionary Computation , 9 ( 1 ), 1 – 17 . Google Scholar Crossref Search ADS WorldCat Mezura-Montes , E. , & Coello , C. A. C. ( 2005 ). Useful infeasible solutions in engineering optimization with evolutionary algorithms. In MICAI 2005: Advances in artificial intelligence (pp. 652 – 662 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Mirjalili , S. , & Lewis , A. ( 2016 ). The whale optimization algorithm . Advances in Engineering Software , 95 , 51 – 67 . Google Scholar Crossref Search ADS WorldCat Mirjalili , S. , Mirjalili , S. M., & Lewis , A. ( 2014 ). Grey wolf optimizer . Advances in Engineering Software , 69 , 46 – 61 . Google Scholar Crossref Search ADS WorldCat Mirjalili , S. , Mirjalili , S. M., & Hatamlou , A. ( 2016 ). Multi-verse optimizer: A nature-inspired algorithm for global optimization . Neural Computing and Applications , 27 ( 2 ), 495 – 513 . Google Scholar Crossref Search ADS WorldCat Mohamed , A. -A. A. , El-Gaafary , A. A., Mohamed , Y. S., & Hemeida , A. M. ( 2015 ). Energy management with capacitor placement for economics low carbon emissions using modified multi-objective grey wolf optimizer . Muangkote , N. , Sunat , K., & Chiewchanwattana , S. ( 2014 ). An improved grey wolf optimizer for training q-Gaussian radial basis functional-link nets. In 2014 International computer science and engineering conference (ICSEC) (pp. 209 – 214 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Parsopoulos , K. E. , & Vrahatis , M. N. ( 2002 ). Particle swarm optimization method for constrained optimization problems . Intelligent Technologies–Theory and Application: New Trends in Intelligent Technologies , 76 , 214 – 220 . Google Scholar OpenURL Placeholder Text WorldCat Parsopoulos , K. E. , & Vrahatis , M. N. ( 2005 ). Unified particle swarm optimization for solving constrained engineering optimization problems. In Advances in natural computation (pp. 582 – 591 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Pecora , L. M. , & Carroll , T. L. ( 1990 ). Synchronization in chaotic systems . Physical Review Letters , 64 ( 8 ), 821 . Google Scholar Crossref Search ADS PubMed WorldCat Powell , M. J. ( 1978 ). A fast algorithm for nonlinearly constrained optimization calculations. In Numerical analysis (pp. 144 – 157 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Rao , R. ( 2016 ). Jaya: A simple and new optimization algorithm for solving constrained and unconstrained optimization problems . International Journal of Industrial Engineering Computations , 7 ( 1 ), 19 – 34 . Google Scholar OpenURL Placeholder Text WorldCat Rao , R. V. , Savsani , V. J., & Vakharia , D. ( 2011 ). Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems . Computer-Aided Design , 43 ( 3 ), 303 – 315 . Google Scholar Crossref Search ADS WorldCat Ray , T. , & Liew , K. M. ( 2003 ). Society and civilization: An optimization algorithm based on the simulation of social behavior . IEEE Transactions on Evolutionary Computation , 7 ( 4 ), 386 – 396 . Google Scholar Crossref Search ADS WorldCat Sadollah , A. , Bahreininejad , A., Eskandar , H., & Hamdi , M. ( 2013 ). Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems . Applied Soft Computing , 13 ( 5 ), 2592 – 2612 . Google Scholar Crossref Search ADS WorldCat Saremi , S. , Mirjalili , S., & Lewis , A. ( 2014 ). Biogeography-based optimisation with chaos . Neural Computing and Applications , 25 ( 5 ), 1077 – 1097 . Google Scholar Crossref Search ADS WorldCat Savsani , V. , Rao , R., & Vakharia , D. ( 2010 ). Optimal weight design of a gear train using particle swarm optimization and simulated annealing algorithms . Mechanism and Machine Theory , 45 ( 3 ), 531 – 541 . Google Scholar Crossref Search ADS WorldCat Shi , Y. ( 2015 ). An optimization algorithm based on brainstorming process . Emerging Research on Swarm Intelligence and Algorithm Optimization , 1 – 35 . Shi , Y. , & Eberhart , R. ( 1998 ). A modified particle swarm optimizer. In The 1998 IEEE international conference on evolutionary computation proceedings, 1998. IEEE world congress on computational intelligence (pp. 69 – 73 ). IEEE . Google Scholar Google Preview OpenURL Placeholder Text WorldCat COPAC Talatahari , S. , Azar , B. F., Sheikholeslami , R., & Gandomi , A. ( 2012 ). Imperialist competitive algorithm combined with chaos for global optimization . Communications in Nonlinear Science and Numerical Simulation , 17 ( 3 ), 1312 – 1319 . Google Scholar Crossref Search ADS WorldCat Thamaraikannan , B. , & Thirunavukkarasu , V. ( 2014 ). Design optimization of mechanical components using an enhanced teaching-learning based optimization algorithm with differential operator . Mathematical Problems in Engineering , 2014. 10 pages. Wang , G. -G. , Guo , L., Gandomi , A. H., Hao , G. -S., & Wang , H. ( 2014 ). Chaotic krill herd algorithm . Information Sciences , 274 , 17 – 34 . Google Scholar Crossref Search ADS WorldCat Wang , G. -G. , Guo , L., Duan , H., & Wang , H. ( 2014 ). A new improved firefly algorithm for global numerical optimization . Journal of Computational and Theoretical Nanoscience , 11 ( 2 ), 477 – 485 . Google Scholar Crossref Search ADS WorldCat Wilcoxon , F. , Katti , S., & Wilcox , R. A. ( 1970 ). Critical values and probability levels for the Wilcoxon rank sum test and the Wilcoxon signed rank test . Selected Tables in Mathematical Statistics , 1 , 171 – 259 . Google Scholar OpenURL Placeholder Text WorldCat Yang , X. -S. ( 2012 ). Flower pollination algorithm for global optimization. In Unconventional computation and natural computation (pp. 240 – 249 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Yang , X. -S. , & algorithm , Firefly ( 2010 ). Levy flights and global optimization. In Research and development in intelligent systems XXVI (pp. 209 – 218 ). Springer . Google Scholar Crossref Search ADS Google Preview WorldCat COPAC Yang , D. , Li , G., & Cheng , G. ( 2007 ). On the efficiency of chaos optimization algorithms for global optimization . Chaos, Solitons & Fractals , 34 ( 4 ), 1366 – 1375 . Google Scholar Crossref Search ADS WorldCat Yang , X. -S. , Gandomi , A. H., Talatahari , S., & Alavi , A. H. ( 2012 ). Metaheuristics in water, geotechnical and transport engineering . Newnes . Footnotes Peer review under responsibility of Society for Computational Design and Engineering. Society for Computational Design and Engineering This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/). Society for Computational Design and Engineering

Journal

Journal of Computational Design and EngineeringOxford University Press

Published: Oct 1, 2018

There are no references for this article.