Brief Paper  ACTA AUTOMATICA SINICA  Vol. 33, No. 1  January, 2007

Hybrid Simplex-improved Genetic Algorithm for Global Numerical Optimization

REN Zi-Wu^1  SAN Ye^1  CHEN Jun-Feng^2

Abstract  In this paper, a hybrid simplex-improved genetic algorithm (HSIGA), which combines the simplex method (SM) and the genetic algorithm (GA), is proposed to solve global numerical optimization problems. The hybrid algorithm adopts several improved genetic mechanisms, for example non-linear ranking selection, competition and selection among several crossover offspring, adaptive change of mutation scaling, and stage evolution; the new population is produced through three approaches, i.e., an elitist strategy, a modified simplex strategy, and an improved genetic algorithm (IGA) strategy. Numerical experiments are included to demonstrate the effectiveness of the proposed algorithm.

Key words  Genetic algorithm, simplex method, competition and selection, mutation scaling

Received June 17, 2005; in revised form April 26, 2006. Supported by National Natural Science Foundation of P. R. China (60474069). 1. Control & Simulation Centre, Harbin Institute of Technology, Harbin 150080, P. R. China; 2. College of Computer & Information Engineering, Hohai University, Changzhou 213022, P. R. China. DOI: 10.1360/aas-007-0091

1 Introduction

The genetic algorithm (GA) is a stochastic, parallel search technique based on the mechanics of natural selection, genetics, and evolution, first developed by Holland in the 1970s[1]. In recent years GA has been widely applied in areas such as fuzzy systems and neural networks[2]. Although GA has become one of the popular methods for global optimization problems, its major weakness is that it may be trapped in local optima of the objective function when the problem dimension is high and there are numerous local optima. This degradation in efficiency is especially apparent in applications where the parameters being optimized are highly correlated[3]. To overcome these flaws and improve GA's optimization efficiency, recent research has generally focused on two aspects: one is improvement of the mechanism of the algorithm itself, such as modification of the genetic operators or the use of niche techniques[4]; the other is the combination of GA with other optimization methods, such as BFGS methods[5] or simulated annealing (SA).

In this paper, a hybrid simplex-improved genetic algorithm (HSIGA) is proposed to solve global numerical optimization problems. Several improved genetic mechanisms are adopted, such as non-linear ranking selection, competition and selection among several crossover offspring, adaptive change of mutation scaling, and an adaptive stage-evolution mechanism, to form an improved genetic algorithm (IGA). For further performance enhancement, the IGA is combined with the simplex method (SM), and the new population is generated through three approaches, i.e., an elitist strategy, a simplex strategy, and an IGA strategy. We investigate the effectiveness of the proposed algorithm by solving 10 test functions of high dimension.
2 Hybrid simplex-improved genetic algorithm (HSIGA) for numerical optimization

In this paper, the following minimization problem with fixed boundaries is considered:

$$\min f(\boldsymbol{x}) = f(x_1, x_2, \cdots, x_n) \quad \text{subject to} \quad x_i^{\min} \le x_i \le x_i^{\max} \ (i = 1, 2, \cdots, n) \tag{1}$$

where $\boldsymbol{x} = (x_1, x_2, \cdots, x_n)$ is a variable vector in $\mathbb{R}^n$, $f(\boldsymbol{x})$ denotes the objective function, and $x_i^{\min}$ and $x_i^{\max}$ are the lower and upper bounds of $x_i$, so that the meaningful range of $x_i$ is $[x_i^{\min}, x_i^{\max}]$.

2.1 Improved genetic algorithm (IGA)

For a minimization problem like (1), we adopt a real-coded GA and first introduce the IGA.

1) Non-linear ranking selection operator. To select excellent chromosomes from the parent generation, non-linear ranking selection is adopted; it maps a chromosome's rank in the sorted population to an expected selection probability. With the population $pop = \{\boldsymbol{x}_1, \boldsymbol{x}_2, \cdots, \boldsymbol{x}_P\}$ of $P$ chromosomes ordered from best to worst, the probability assigned to chromosome $\boldsymbol{x}_i$ by the non-linear function is

$$p(\boldsymbol{x}_i) = q'(1-q)^{i-1}, \qquad q' = \frac{q}{1-(1-q)^P} \tag{2}$$

where $q$ is the selection probability of the best chromosome and $i$ is the rank of the chromosome. After the selection probability of each chromosome has been determined, roulette-wheel selection is used to pick the chromosomes. Ranking selection needs neither the individuals' raw fitness nor any fitness scaling, which prevents premature convergence or stagnation to a certain extent.

2) Competition and selection among several crossover offspring. In natural evolution, parents often reproduce more than two offspring, and competition among offspring of the same parents is common. Inspired by this idea, the crossover operator creates several offspring and selects the best among them. Unlike the crossover of the simple genetic algorithm (SGA), four chromosomes are first created from the parents $\boldsymbol{x}_s = [x_1^s, x_2^s, \cdots, x_n^s]$ and $\boldsymbol{x}_t = [x_1^t, x_2^t, \cdots, x_n^t]$ ($s \ne t$) according to the following formulae[2]:

$$\boldsymbol{y}_1 = \frac{\boldsymbol{x}_s + \boldsymbol{x}_t}{2} \tag{3}$$
$$\boldsymbol{y}_2 = \boldsymbol{x}_{\max}(1-\omega) + \max(\boldsymbol{x}_s, \boldsymbol{x}_t)\,\omega \tag{4}$$
$$\boldsymbol{y}_3 = \boldsymbol{x}_{\min}(1-\omega) + \min(\boldsymbol{x}_s, \boldsymbol{x}_t)\,\omega \tag{5}$$
$$\boldsymbol{y}_4 = \frac{(\boldsymbol{x}_{\max} + \boldsymbol{x}_{\min})(1-\omega) + (\boldsymbol{x}_s + \boldsymbol{x}_t)\,\omega}{2} \tag{6}$$
$$\boldsymbol{x}_{\max} = [x_1^{\max}, x_2^{\max}, \cdots, x_n^{\max}] \tag{7}$$
$$\boldsymbol{x}_{\min} = [x_1^{\min}, x_2^{\min}, \cdots, x_n^{\min}] \tag{8}$$

where $\omega \in [0, 1]$ is a weight and $\max(\boldsymbol{x}_s, \boldsymbol{x}_t)$ is the vector whose elements are the element-wise maxima of $\boldsymbol{x}_s$ and $\boldsymbol{x}_t$; for example, $\max([1\ {-2}\ 3], [2\ 3\ 1]) = [2\ 3\ 3]$. Similarly, $\min(\boldsymbol{x}_s, \boldsymbol{x}_t)$ denotes the element-wise minimum. Among these four chromosomes, the two with the best fitness values are taken as the offspring of the crossover operation. The potential offspring can spread over the whole domain, so this crossover operator is superior to single arithmetic crossover or heuristic crossover; a sketch of both operators follows.
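The two operators above are straightforward to prototype. The following is a minimal Python/NumPy sketch, not the authors' code: it assumes a minimization objective `f`, chromosomes stored as NumPy arrays, and a fixed weight `omega = 0.5` (the paper does not specify how $\omega$ is chosen); all helper names are ours.

```python
import numpy as np

def ranking_probabilities(P, q):
    """Non-linear ranking selection, Eq. (2).
    The population is assumed sorted best-first; q is the selection
    probability of the best chromosome."""
    q_prime = q / (1.0 - (1.0 - q) ** P)
    return np.array([q_prime * (1.0 - q) ** i for i in range(P)])  # sums to 1

def crossover_with_competition(xs, xt, x_min, x_max, f, omega=0.5):
    """Eqs. (3)-(6): build four candidate offspring, then keep the two
    with the best (lowest) objective values."""
    y1 = (xs + xt) / 2.0
    y2 = x_max * (1.0 - omega) + np.maximum(xs, xt) * omega
    y3 = x_min * (1.0 - omega) + np.minimum(xs, xt) * omega
    y4 = ((x_max + x_min) * (1.0 - omega) + (xs + xt) * omega) / 2.0
    best = sorted([y1, y2, y3, y4], key=f)  # minimization: smaller f is fitter
    return best[0], best[1]
```

Roulette-wheel selection over these probabilities can then be done with, e.g., `np.random.default_rng().choice(P, p=ranking_probabilities(P, q))`.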
3) Adaptive change of mutation scaling. Unlike uniform mutation, boundary mutation, etc., a mutation operator with adaptively changing mutation scaling is employed. According to the punctuated-equilibrium theory[6] in the evolution field, species evolution proceeds in many different directions at an early stage, while evolution tends to be conservative at a later stage. Therefore a larger mutation range is employed during the early stage to keep the population diversified, while the mutation range is shrunk gradually during the later stage to focus on local search.

Suppose the mutation scaling is $\mu$ ($0 \le \mu \le 1$) and the element $x_k \in [x_k^{\min}, x_k^{\max}]$ of the chromosome $(x_1, x_2, \cdots, x_k, \cdots, x_n)$ is selected for mutation with a certain mutation probability $P_m$. The original value $x_k$ is then replaced by a mutated value $x_k^{new}$ chosen with uniform probability from the range

$$x_k^{new} \in \left[\max\!\left(x_k - \mu\,\frac{x_k^{\max}-x_k^{\min}}{2},\; x_k^{\min}\right),\ \min\!\left(x_k + \mu\,\frac{x_k^{\max}-x_k^{\min}}{2},\; x_k^{\max}\right)\right] \tag{9}$$

Based on the idea that the mutation scaling $\mu$ should decrease gradually during the run, a monotonically decreasing function of $\mu$ is built:

$$\mu(\tau) = 1 - r^{(1-\tau/T)^b} \tag{10}$$

where $T$ is the total number of generations, $\tau$ is the current generation, and the weight $r \in [0, 1]$. From (10) it can be seen that for a small value of the weight $r$ the mutation scaling $\mu$ is near one at the beginning of the optimization, and $\mu$ decreases down to zero as the run progresses.
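A minimal sketch of this mutation operator, under the reading of Eq. (10) given above ($\mu(0) = 1 - r$, $\mu(T) = 0$); the defaults $r = 0.5$ and $b = 2$ are the values used later in Section 3.2, the helper names are ours, and NumPy arrays are assumed as before.

```python
import numpy as np

def mutation_scaling(tau, T, r=0.5, b=2.0):
    """Eq. (10): mu starts near 1 - r and decays monotonically to 0 at tau = T."""
    return 1.0 - r ** ((1.0 - tau / T) ** b)

def adaptive_mutate(x, x_min, x_max, tau, T, pm, rng):
    """Eq. (9): each gene mutates with probability pm to a uniform draw from
    a window of half-width mu * (range / 2), clipped to the variable bounds."""
    mu = mutation_scaling(tau, T)
    y = x.copy()
    for k in range(x.size):
        if rng.random() < pm:
            half = mu * (x_max[k] - x_min[k]) / 2.0
            lo = max(x[k] - half, x_min[k])
            hi = min(x[k] + half, x_max[k])
            y[k] = rng.uniform(lo, hi)
    return y
```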
4) Adaptive strategy of stage evolution. During evolution the diversity of the population decreases, and when it drops to a certain level the search is effectively over[1]. Generally, larger crossover and mutation probabilities work well at an early stage, while at a later stage they should be smaller, since the algorithm has entered the local-search phase. For the selection operator, it is a good idea to use a smaller selection pressure at the beginning and a larger one later to promote local search. Based on this idea, a stage-evolution model is developed. The whole process is divided into three stages:

First stage: $\tau \in [0, T_1]$, with $T_1 = \alpha T$
Second stage: $\tau \in (T_1, T_2]$, with $T_2 = (1-\alpha)T$
Third stage: $\tau \in (T_2, T]$

where $T$ and $\tau$ are defined as above and the parameter $\alpha$ is generally set to 0.382. Three different values of the best individual's selection probability $q$, the crossover probability $P_c$, and the mutation probability $P_m$ are then chosen, one set per stage.

2.2 Hybrid simplex-improved genetic algorithm (HSIGA)

To improve the local fine-tuning capability of GA and quicken its convergence rate, the IGA is combined with the simplex method (SM) to form a hybrid optimization algorithm[7]. The detailed process is as follows. All chromosomes of the current generation are first ranked from best to worst, and the new population of the next generation is generated through the following three approaches.

1) Elitist strategy: the $N$ top-ranking chromosomes (elites) are copied directly into the next generation, so that these elites cannot be destroyed by the three GA operations or any other operation.

2) Modified simplex strategy: the $S$ ($S > N$) top members of the population produce $S - N$ new chromosomes through the modified simplex method. Each new chromosome is generated by reflecting $\boldsymbol{x}_j$ over the centroid $\boldsymbol{x}_c$ of the $N$ best points (a sketch is given after this list):

$$\boldsymbol{x}_j^{new} = \boldsymbol{x}_c + \alpha(\boldsymbol{x}_c - \boldsymbol{x}_j) \quad (j = N+1, \cdots, S) \tag{11}$$

where the centroid is $\boldsymbol{x}_c = (\boldsymbol{x}_1 + \boldsymbol{x}_2 + \cdots + \boldsymbol{x}_N)/N$ and $\alpha$ is a random value.

3) Improved genetic algorithm (IGA) strategy: the remaining $P - S$ children (where $P$ is the population size) of the new generation are created by applying the IGA to the whole population.
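A minimal sketch of the modified simplex strategy, with one assumption made explicit: the paper calls $\alpha$ "a random value" without giving its range, so this sketch draws it uniformly from $[0, 1)$. The population is assumed to be a $P \times n$ NumPy array sorted best-first.

```python
import numpy as np

def simplex_offspring(pop_sorted, N, S, rng):
    """Eq. (11): reflect the members ranked N+1..S through the centroid
    of the N best, producing S - N new chromosomes."""
    centroid = pop_sorted[:N].mean(axis=0)   # x_c = (x_1 + ... + x_N) / N
    alphas = rng.random(S - N)               # assumed: alpha ~ U[0, 1)
    return centroid + alphas[:, None] * (centroid - pop_sorted[N:S])
```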
Fig. 1 depicts the architecture of the HSIGA algorithm. The hybrid degree ($S/P$) denotes the percentage of the population to which the modified simplex operator is applied. When the hybrid degree is zero, the hybrid algorithm reduces to a real-coded IGA; when it is equal to 100%, the algorithm turns into the modified simplex method. Generally $S$ is around 20 percent of the population size $P$.

[Fig. 1  Architecture of the hybrid simplex-IGA algorithm]

Producing the new population of HSIGA through these three strategies has the following advantages.

1) In GA the coding of the elites may be changed or destroyed by the genetic operations, and the produced offspring may be less fit than their parents. The elitist strategy is an effective way to protect the elites from such damage, which is necessary to enhance the capability of the algorithm.

2) Some novel genetic operations are used in the IGA, such as the crossover and mutation operators above. The idea behind these operations comes mainly from nature: the genetic mechanisms try to mimic the maturing phenomenon in nature, which makes individuals better suited to their environment and enhances optimization performance.

3) GA is a global search procedure. It is less likely to be trapped in local optima, but its convergence rate slows down and its computational cost is high in the later stage. SM is a local search method whose merits are simplicity and computational efficiency, but it is easily trapped in a local optimum. Hybridizing the two methods combines their respective merits, which speeds up convergence while avoiding entrapment in a local optimum. Moreover, HSIGA applies the simplex reproduction to the top-ranking individuals, i.e., it applies simplex search to many points in the vicinity of the optimum rather than to a single one, which quickens convergence greatly.

Based on the above, the pseudo-code of the HSIGA algorithm is depicted in Fig. 2.

[Fig. 2  Procedure of the HSIGA algorithm]
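Since Fig. 2 itself does not survive in this copy, the following is a rough Python rendering of one HSIGA generation assembled from the sketches above; it is our reconstruction, not the authors' pseudo-code. The split $N = 4$, $S = 0.2P$, $P = 60$ follows Section 3.2; the random pairing of crossover parents and the unconditional per-gene mutation pass are assumptions.

```python
import numpy as np

def hsiga_generation(pop, f, x_min, x_max, tau, T, q, pc, pm,
                     N=4, S=12, rng=np.random.default_rng()):
    """One HSIGA generation: elitist, modified simplex, and IGA strategies.
    Reuses ranking_probabilities, crossover_with_competition,
    simplex_offspring, and adaptive_mutate from the sketches above."""
    P = len(pop)
    pop = pop[np.argsort([f(x) for x in pop])]        # rank best-first
    elites = pop[:N].copy()                           # 1) elitist strategy
    simplex_kids = simplex_offspring(pop, N, S, rng)  # 2) modified simplex
    probs = ranking_probabilities(P, q)               # 3) IGA strategy
    iga_kids = []
    while len(iga_kids) < P - S:
        s, t = rng.choice(P, size=2, replace=False, p=probs)
        if rng.random() < pc:
            c1, c2 = crossover_with_competition(pop[s], pop[t], x_min, x_max, f)
        else:
            c1, c2 = pop[s].copy(), pop[t].copy()
        iga_kids += [adaptive_mutate(c, x_min, x_max, tau, T, pm, rng)
                     for c in (c1, c2)]
    return np.vstack([elites, simplex_kids, np.array(iga_kids[:P - S])])
```

With $P = 60$ and hybrid degree 20%, this yields 4 elites, 8 simplex children, and 48 IGA children per generation.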
3 Numerical experiment and results

We execute HSIGA on the 10 benchmark functions[8-12] listed in Table 1. Functions $f_1$-$f_6$ are multimodal, with the number of local minima increasing with the problem dimension; $f_7$-$f_{10}$ are unimodal.

Table 1  List of the 10 test functions used in the experimental study, where $n$ is the function dimension ($n = 30$)

$f_1(\boldsymbol{x}) = \sum_{i=1}^{n} \left(-x_i \sin\sqrt{|x_i|}\right)$,  solution space $[-500, 500]^n$
$f_2(\boldsymbol{x}) = \sum_{i=1}^{n} \left[x_i^2 - 10\cos(2\pi x_i) + 10\right]$,  $[-5.12, 5.12]^n$
$f_3(\boldsymbol{x}) = -20\exp\left(-0.2\sqrt{\tfrac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(\tfrac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$,  $[-32, 32]^n$
$f_4(\boldsymbol{x}) = \tfrac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n}\cos\left(\tfrac{x_i}{\sqrt{i}}\right) + 1$,  $[-600, 600]^n$
$f_5(\boldsymbol{x}) = \tfrac{\pi}{n}\left\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2\left[1 + 10\sin^2(\pi y_{i+1})\right] + (y_n - 1)^2\right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$,  $[-50, 50]^n$
  with $y_i = 1 + \tfrac{1}{4}(x_i + 1)$ and $u(x_i, a, k, m) = k(x_i - a)^m$ if $x_i > a$; $0$ if $-a \le x_i \le a$; $k(-x_i - a)^m$ if $x_i < -a$
$f_6(\boldsymbol{x}) = \tfrac{1}{10}\left\{\sin^2(3\pi x_1) + \sum_{i=1}^{n-1}(x_i - 1)^2\left[1 + \sin^2(3\pi x_{i+1})\right] + (x_n - 1)^2\left[1 + \sin^2(2\pi x_n)\right]\right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$,  $[-50, 50]^n$
$f_7(\boldsymbol{x}) = \sum_{i=1}^{n} x_i^2$,  $[-100, 100]^n$
$f_8(\boldsymbol{x}) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$,  $[-10, 10]^n$
$f_9(\boldsymbol{x}) = \sum_{i=1}^{n}\left(\sum_{j=1}^{i} x_j\right)^2$,  $[-100, 100]^n$
$f_{10}(\boldsymbol{x}) = \max\{|x_i|,\ i = 1, 2, \cdots, n\}$,  $[-100, 100]^n$

In some recent studies, functions $f_1$-$f_{10}$ were tested with the FEP[8], OGA/Q[9], CEP[10], PSO[11], EO[11], and FES[12] optimization algorithms. Those numerical experiments show that OGA/Q[9] gives more robust and significantly better results than optimization algorithms such as CEP[10], PSO[11], EO[11], and FES[12]. We therefore adopt these ten benchmark functions to test the proposed HSIGA and compare its performance only with that of FEP[8] and OGA/Q[9].

3.1 Control experiment

To identify the improvement due to the improved genetic operations and to the combination with the modified simplex method, the following control experiments are designed and carried out. We execute the IGA and the SGA on the test functions. The IGA is basically the same as HSIGA except for the number $S$ of top members in the population: in IGA $S$ is zero, while in HSIGA $S$ is 20 percent of the population size $P$. The SGA neither applies any improved genetic mechanism nor combines with another optimization method; it adopts only traditional operations, such as fitness-proportionate selection, arithmetic crossover, and uniform mutation.

3.2 Parameter values

Before solving the test functions, some parameters must be assigned for each algorithm.

For the HSIGA and IGA algorithms:
• Population size: a moderate population size $P = 60$ is chosen.
• Genetic probabilities: the whole evolution is divided into three stages. Table 2 shows the best chromosome's selection probability $q$, the crossover probability $P_c$, and the mutation probability $P_m$ at each stage.
• Mutation parameters: the parameter $r$ in the mutation-scaling function is 0.5, and $b$ is equal to 2.
• Stopping criterion: since the functions differ in complexity, different stopping criteria are used. Table 3 lists the number of evolution generations $T$ for each function; when the current generation $\tau$ reaches $T$, execution stops.

In addition, the modified simplex strategy is adopted in HSIGA with hybrid degree $(S/P) = 20\%$ and $N = 4$ elites; in IGA the hybrid degree is zero.

Table 2  Adaptive change of the parameters ($\tau$: current generation; $T$: total number of generations)

$\tau$                  $q$     $P_c$   $P_m$
$\tau \in [0, T_1]$     0.08    0.95    0.08
$\tau \in (T_1, T_2]$   0.10    0.80    0.05
$\tau \in (T_2, T]$     0.12    0.65    0.02

Table 3  Number of generations of HSIGA and IGA for each function (TF: test function; NG: number of generations)

TF   f1    f2   f3   f4   f5    f6    f7    f8    f9    f10
NG   500   30   50   40   500   500   400   500   400   500

For the SGA algorithm:
• Population size: 150, larger than that of IGA and HSIGA.
• Selection operation: fitness-proportionate selection is adopted.
• Crossover operation: arithmetic crossover with crossover probability $P_c = 0.80$.
• Mutation operation: uniform mutation with mutation probability $P_m = 0.02$.
• Stopping criterion: if the fitness of the best chromosome cannot be further reduced in 50 successive generations after the first 500 generations, execution stops.

3.3 Results and comparison

Since SGA uses evolutionary parameters and a termination criterion different from those of IGA and HSIGA, for a fair comparison we first calculate the computational cost of each algorithm and then compare the quality of the final solutions at the given computational cost. Each test function is run 50 independent times and the following results are recorded: 1) the mean number of function evaluations; 2) the mean function value; and 3) the standard deviation of the function values. Table 4 shows each algorithm's results in the control experiment.

From the results in Table 4 it can be seen that:
• HSIGA finds the exact global optimum, 0, for functions $f_2$, $f_4$, and $f_7$-$f_{10}$. For the other functions the mean function values of HSIGA are close to the optimal ones and the standard deviations are small. These results indicate that HSIGA finds optimal or close-to-optimal solutions and that its solution quality is quite stable.
• Compared with SGA, the proposed HSIGA requires fewer function evaluations and hence has a smaller time complexity. Moreover, HSIGA gives significantly smaller, closer-to-optimal solutions, so its mean solution quality is much better than SGA's; it also gives smaller standard deviations of the function values, so its solution quality is more stable.
• Compared with IGA, although the two algorithms use the same number of generations $T$, HSIGA requires fewer function evaluations, because part of the next generation in HSIGA is produced through the elitist and modified simplex strategies. Moreover, HSIGA gives smaller mean function values, and equal or smaller standard deviations, than IGA on the 10 functions; hence the solution quality of HSIGA is better than that of IGA, and the algorithm is statistically stable.

Next, the performance of HSIGA is compared with that of the FEP[8] and OGA/Q[9] algorithms. Since the optimization results of these two algorithms on the 10 test functions are available in [8] and [9], the comparison is made accordingly and the results are listed in Table 5.

Table 4  Comparison of optimization results and computational effort among HSIGA, IGA, and SGA ($f_{\min}$: global minimum of each function)

     Mean number of function evaluations    Mean function value (standard deviation)
f    HSIGA    IGA      SGA                  HSIGA                     IGA                       SGA                     $f_{\min}$
f1   68,412   78,052   89,877               -12569.4740 (1.0604E-4)   -12569.4530 (1.3782E-2)   -6422.1504 (869.3862)   -12569.5
f2   4,130    4,698    259,965              0 (0)                     7.0237E-14 (2.4246E-13)   1.9255 (8.0727)         0
f3   6,853    7,825    229,926              8.8818E-16 …