A Multi-Objective Relative Clustering Genetic Algorithm with Adaptive Local/Global Search based on Genetic Relatedness

Iman Gholaminezhad 1 and Giovanni Iacca 2

1 Department of Mechanical Engineering, University of Guilan, Iran
[email protected]
2 INCAS3, Dr. Nassaulaan 9, 9401 HJ, Assen, The Netherlands
[email protected]

Abstract. This paper describes a new evolutionary algorithm for multi-objective optimization, namely the Multi-Objective Relative Clustering Genetic Algorithm (MO-RCGA), inspired by concepts borrowed from gene relatedness and kin selection theory. The proposed algorithm clusters the population into different families based on individual kinship, and adaptively chooses suitable individuals for reproduction. The idea is to use the information on the position of the individuals in the search space provided by such a clustering scheme to enhance the convergence rate of the algorithm, as well as to improve its exploration. The proposed algorithm is tested on ten unconstrained benchmark functions proposed for the special session and competition on multi-objective optimizers held at IEEE CEC 2009. The Inverted Generational Distance (IGD) is used to assess the performance of the proposed algorithm, in comparison with the IGD obtained by state-of-the-art algorithms on the same benchmark.

Keywords: Multi-objective optimization, Relative clustering, Genetic relatedness, Inverted generational distance.

1 Introduction

According to classic Darwinian theory, natural selection promotes those individuals which behave in their own selfish interest, rather than for the good of their species or of the group in which they live. However, nature offers many examples of social animals (such as eusocial insects, e.g. bees and ants) which do not behave selfishly all the time, but under some conditions tend to cooperate with other members of their colony, for the good of the group as a whole. In fact, in these cases natural selection favors individuals who maximize their genetic contribution to future generations through cooperation with their kin, even if this altruistic behaviour comes at an individual cost [8].

In the last two decades, many evolutionary algorithms (EAs) have been developed based on Darwin's theory and social behaviour, and applied to the solution of complex optimization problems. Successful examples of EAs can be found in particular in the context of multi-objective and dynamic optimization [20, 25]. Nonetheless, EAs have rarely shown the full range of properties exhibited by natural evolution, being instead limited to a coarse and somewhat simplistic approximation of what happens in nature.

A typical technique used to improve multi-objective EAs, as well as swarm intelligence algorithms, is clustering. Clustering is generally considered to facilitate the exploitation process and decrease the computational time needed to reach convergence. Several examples of clustering-based algorithms exist in the literature. Gong et al. suggested a clustering-based selection strategy for non-dominated individuals, obtained by partitioning the non-dominated solutions in each Pareto front into the desired clusters [6]. Tsang and Lau proposed a clustering-based artificial immune system focusing on distributed self-organization, by means of population decomposition and independent evolutionary processes [17]. Moubayed et al. used a clustering-based approach for leader selection in multi-objective particle swarm optimization; in this method better leaders are identified by an indirect mapping between objectives and solution clusters [12]. Wang et al. introduced a clustering multi-objective evolutionary algorithm based on orthogonal and uniform design; in this case the orthogonal design generates an initial population of solutions that are scattered uniformly over the search space, while clustering is applied in later stages of the optimization [19]. A similar approach was also proposed by Gao and Zhong, who developed a clustering-based two-phase multi-objective particle swarm optimization in which clustering is applied after a distribution-based generation of the initial population [4].

In this paper we use clustering to build a model of social behaviour and apply it to a multi-objective evolutionary algorithm. As mentioned before, social animals transmit to the next generations not only their own genes, but also, by means of kin selection, their kin's genes (i.e., those of offspring and/or siblings, which are all characterized by some level of genetic relatedness). In order to develop a simulated model of such behaviour and use it in an optimization algorithm, the first step is to determine the kinship between the individuals in the population, and use it as a basis for clustering the whole population into different families. To decompose the current population into different families, each couple of parents must be clustered together with their corresponding offspring (generated by means of crossover and mutation) as well as possible half-siblings (which can be generated because each parent could be selected for reproduction more than once, with different partners). Considering that individuals lying in the same family have some similar genes (i.e., similar variables), it is then possible to design an algorithm in which both family competition and individual competition occur, with the purpose of transmitting individual and kin's genes to the next generations. Here, we use this concept to devise a novel selection strategy which, unlike classic selection schemes such as tournament selection or fitness-proportionate selection, naturally embeds local and global search. The proposed multi-objective optimization algorithm based on such a strategy, namely the Multi-Objective Relative Clustering Genetic Algorithm (MO-RCGA), is tested on ten unconstrained functions taken from the IEEE CEC 2009 benchmark [23], and compared against 15 state-of-the-art multi-objective optimization algorithms.

The rest of the paper is structured as follows: the next section illustrates the working principles of the proposed MO-RCGA, while Section 3 presents the numerical results. Finally, Section 4 concludes this work and suggests possible future research lines.

2 Relative Clustering Genetic Algorithm with Adaptive Local/Global Search

As anticipated in the previous section, the proposed MO-RCGA clusters the individuals into different families, based on their level of kinship (in particular, parents, their children, half-siblings, and cousins lie in the same family). This is done by indexing all parents and their associated offspring produced by recombination and mutation.

Recombination is performed by means of n-point crossover, where n is a number between 0 and the individual length (i.e., the problem dimension): for each gene of the parents selected for crossover, a uniform random number is drawn in [0, 1]; if this number is bigger than a predefined probability of crossover alteration (PCA), the corresponding genes of the two parents are swapped.
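As a minimal sketch of this per-gene swap (the function name is ours, and the default PCA value is taken from Table 1; the authors' actual implementation may differ):

```python
import random

def crossover(parent1, parent2, pca=0.5):
    """Swap the j-th genes of the two parents whenever a uniform draw
    in [0, 1] exceeds the probability of crossover alteration (PCA)."""
    child1, child2 = list(parent1), list(parent2)
    for j in range(len(child1)):
        if random.random() > pca:
            child1[j], child2[j] = child2[j], child1[j]
    return child1, child2
```

Note that with a per-gene swap test, every subset of gene positions can be exchanged, which subsumes classic n-point crossover for any n.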

In the mutation operator, the genes of each parent can be chosen to mutate depending on a predefined probability of mutation alteration (PMA). If the gene xi,j is selected for mutation (where i and j are respectively the individual and gene index), a uniform random number is drawn within the interval [−var, +var] (where var is a given parameter), and this number is added to xi,j.
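The mutation operator can be sketched in the same spirit (again, the function name is ours and the default PMA and var values come from Table 1):

```python
import random

def mutate(individual, pma=0.2, var=0.1):
    """Perturb each gene with probability PMA by adding a uniform random
    number drawn in [-var, +var]; the other genes are left untouched."""
    return [x + random.uniform(-var, var) if random.random() < pma else x
            for x in individual]
```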

Considering the fact that each individual can participate more than once in crossover, its genes may exist in different families. Hence, parents with their associated offspring produced by crossover and mutation, as well as offspring which have only one of the two parents in common (i.e., half-siblings), lie in the same family. By clustering the whole population into different families after the execution of the genetic operators (i.e., crossover and mutation), each family contains individuals which have some similar genes and thus are close to each other in the problem search space.

After such clustering, the algorithm selects the fittest individuals by comparing the individuals from the previous generation with the newly crossed and mutated individuals, and transmits them to the next generation. Selection is performed using the clustered families to choose suitable individuals for reproduction. A rank-based method is first used to select the fittest families. This is done by ranking all families in terms of the number of individuals evolved from each family into the next generation. Hence, the higher the number of individuals passed from a family to the next generation, the fitter that family is.
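One way to realize this clustering and ranking, sketched here under the assumption that each mating event records the indices of the two parents and of the child (the paper does not spell out its data structures), is a union-find pass that links every child to both parents, so half-siblings fall into the same family:

```python
def cluster_families(n_individuals, matings):
    """Union-find sketch: link each child to both of its parents, so that
    parents, offspring and half-siblings end up in the same family."""
    root = list(range(n_individuals))
    def find(i):
        while root[i] != i:
            root[i] = root[root[i]]  # path compression
            i = root[i]
        return i
    for pa, pb, child in matings:  # each event: (parent_a, parent_b, child)
        root[find(pa)] = find(child)
        root[find(pb)] = find(child)
    families = {}
    for i in range(n_individuals):
        families.setdefault(find(i), []).append(i)
    return list(families.values())

def rank_families(families, survivors):
    """Rank families by how many of their members survived into the next
    generation (higher count = fitter family)."""
    return sorted(families,
                  key=lambda fam: sum(i in survivors for i in fam),
                  reverse=True)
```

For example, two matings that share parent 1 merge both offspring and all three parents into a single family, leaving any unmated individual in a family of its own.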

Now, remembering that individuals within the same family are closer to each other than to individuals from different families, it is clear that choosing individuals within the same family for reproduction (kin selection) is equivalent to performing a local search in the neighbourhood of that family. The latter is especially important in the later stages of the optimization, when the algorithm needs to refine the search. On the other hand, if the individuals selected for reproduction are from different families (selfish selection), they are probably far from each other in the search space (especially in the earlier stages of the algorithm), thus allowing for a more global exploration. The latter is more important at the beginning of the optimization process, when the algorithm needs to search for the global optimum region and avoid local optima.

In order to control which strategy must be used, the algorithm adaptively adjusts the probability of local and global search at each generation. In the early iterations, when global exploration is more essential, the algorithm has a higher chance of choosing individuals from different families. Such inter-family selection is performed by comparing the family ranks, so that it is more probable that a selected individual belongs to a higher-ranked family (roulette-wheel selection). On the other hand, as the number of iterations increases, the probability of performing intra-family selection (i.e., selecting for reproduction individuals within the same family) is increased. Also in this case, families selected for reproduction are chosen through a rank-based roulette wheel. It should be noted that for the initial generation, when families do not exist yet, a classic tournament selection scheme is used to select individuals for crossover and mutation.
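A sketch of this two-mode selection is given below; the linear rank-to-weight mapping of the roulette wheel is our assumption, since the paper does not specify the exact weighting:

```python
import random

def roulette_pick(ranked_families):
    """Rank-based roulette wheel: with k families, the family ranked r-th
    (r = 0 is best) gets weight k - r, so better-ranked families are
    proportionally more likely to be drawn."""
    k = len(ranked_families)
    return random.choices(ranked_families,
                          weights=[k - r for r in range(k)], k=1)[0]

def select_parents(ranked_families, prob_global):
    """Inter-family (global) selection with probability ProbG; otherwise
    intra-family (local) selection from a single family."""
    if random.random() < prob_global:
        return (random.choice(roulette_pick(ranked_families)),
                random.choice(roulette_pick(ranked_families)))
    family = roulette_pick(ranked_families)
    return random.choice(family), random.choice(family)
```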

Basically, by such adaptation of global and local search based on individual relatedness, the algorithm is able to control the population diversity, a mechanism similar to the classic incest prevention scheme used in the CHC algorithm [3]. To adapt the probability of global and local search smoothly along the generations, we use the following set of equations:

range = range1 + (ngen / Ngen) · (range2 − range1)    (1)

ProbG = range,  ProbL = 1 − range    (2)

where range1 and range2 are predetermined boundary probability values for the global search, while ngen and Ngen are respectively the index of the current generation and the maximum number of generations allotted to the evolutionary algorithm. ProbG and ProbL indicate respectively the probability of global search and local search.
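With the Table 1 values range1 = 0.8 and range2 = 0.2, eqs. (1)-(2) reduce to a linear schedule that starts at ProbG = 0.8 and ends at ProbG = 0.2. A direct transcription (the function name is ours):

```python
def search_probabilities(ngen, Ngen, range1=0.8, range2=0.2):
    """Eqs. (1)-(2): linearly interpolate the global-search probability
    from range1 (first generation) to range2 (last generation)."""
    prob_g = range1 + (ngen / Ngen) * (range2 - range1)
    return prob_g, 1.0 - prob_g  # (ProbG, ProbL)
```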

The flowchart of MO-RCGA is illustrated in Figure 1. For the sake of clarity, we also report its pseudo-code in Algorithm 1. With reference to the pseudo-code, Npop indicates the population size, rand(0, 1) is a uniform random number drawn in [0, 1], and PC and PM are respectively the individual probability of crossover and mutation (i.e., the probability that an individual is chosen for crossover and mutation); on the other hand, PCA and PMA, as defined above, are applied at gene level.

3 Numerical Results

In this section, the performance of the proposed MO-RCGA is assessed on the optimization of ten real-parameter multi-objective benchmark functions defined

Fig. 1. Flowchart of MO-RCGA.

at the CEC 2009 special session on multi-objective optimization [23]. Among these functions, UF1-UF7 are two-objective while UF8-UF10 are three-objective optimization problems. The detailed formulations of the considered test functions are given in [23]. As suggested in the CEC 2009 platform, in the present work the total number of function evaluations Nfevals is set to 300000 for each algorithm execution. As also indicated in [23], the population size Npop is set to 100 for two-objective problems and 150 for three-objective functions. It should be noted that the maximum number of generations Ngen used in eq. (1) is computed from the predefined number of function evaluations and population size (i.e., Ngen = Nfevals/Npop). All the other specific parameters of MO-RCGA used in the experimental setup are given in Table 1.

The proposed algorithm has been executed 30 times for each test function, and the average results obtained by MO-RCGA were compared with the results of all the algorithms participating in the CEC 2009 competition [2, 5, 7, 9–11, 13, 15, 16, 18, 19, 21, 22], as well as two more recent multi-objective optimization algorithms [1, 14]. The performance indicator used to quantify the quality of the obtained results is the IGD (Inverted Generational Distance) metric [23]. The IGD is defined as follows. Let P* be a set of uniformly distributed points (in the objective space) along the Pareto Front (PF). Let A be an approximate set of the PF. The IGD is then defined as the average distance from P* to A, namely:

IGD(A, P*) = ( Σ_{v∈P*} d(v, A) ) / |P*|    (3)

where d(v, A) is the minimum Euclidean distance between the point v in the PF and all the points in the approximate set A, and |P*| indicates the cardinality of the PF. If |P*| is large enough to represent the PF well, IGD(A, P*) can be used as a measure of both the diversity and the convergence of A to the PF. If the set A is close to the PF and covers it entirely, IGD(A, P*) will obviously take a low (tending to zero) value. In our experiments, the number of solutions used for computing the IGD (i.e., the cardinality of the approximate set A) is set

Algorithm 1 Pseudo-code of MO-RCGA.

// initialization
random initialization of the initial population {x1, x2, ..., xNpop}
ngen = 0
while ngen < Ngen do
    // select individuals for reproduction
    if ngen == 0 then
        for i = 1 ... Npop do
            tournament selection
        end for
    else
        // family clustering and ranking
        family clustering based on individual kinship
        family ranking based on number of individuals evolved from each family
        // global/local search adaptation
        update of ProbL and ProbG according to eq. (1) and (2)
        for i = 1 ... Npop do
            if rand(0, 1) < ProbG then
                // global search
                rank-based selection from different families
                selection of individuals from the selected families
            else
                // local search
                rank-based selection of a family
                selection of individuals from the selected family
            end if
        end for
    end if
    // genetic operators (reproduction)
    for i = 1 ... Npop do
        if rand(0, 1) < PC then
            n-point crossover
        end if
        if rand(0, 1) < PM then
            mutation
        end if
    end for
    // Pareto dominance selection
    next generation population selection based on dominance
    ngen = ngen + 1
end while

equal to the population size, therefore 100 for two-objective problems and 150 for three-objective problems.
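The IGD computation of eq. (3) is straightforward to transcribe (a sketch; the function and argument names are ours):

```python
import math

def igd(approx_set, pf_reference):
    """Eq. (3): average, over the reference Pareto points, of the Euclidean
    distance to the nearest point of the approximation set."""
    return sum(min(math.dist(v, a) for a in approx_set)
               for v in pf_reference) / len(pf_reference)
```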

Numerical results are reported in Tables 2-3. More specifically, Table 2 shows the minimum (Min), maximum (Max), mean, and standard deviation (SD) of the IGD obtained in the 30 runs of MO-RCGA. Table 3 shows the comparative results of the proposed algorithm with those of the competing algorithms, in the form of the mean IGD and its SD obtained through the 30 independent runs. It can be observed that MO-RCGA outperforms all the other algorithms in terms of mean IGD on all the functions except UF7, where it ranks second after the MOEAD algorithm. Finally, Figures 2-11 show the Pareto Front obtained by the proposed algorithm in the best run, together with the optimal (theoretical) Pareto Front of the ten test problems. It can be seen visually that MO-RCGA is able to detect the PF uniformly on all the test functions, except for UF7, where it fails to cover only the upper leftmost part of it.

Table 1. Parameter setting of MO-RCGA.

Parameter  Value
range1     0.8
range2     0.2
PC         1
PM         0.2
var        0.1
PCA        0.5
PMA        0.2

Table 2. IGD values obtained with MO-RCGA on functions UF1-UF10 from the CEC 2009 benchmark (30 independent runs).

Function  Minimum  Maximum  Mean     Std. Dev.
UF1       0.00414  0.00431  0.00419  5.22E-04
UF2       0.00407  0.00502  0.00443  1.89E-04
UF3       0.00423  0.00584  0.00491  8.12E-03
UF4       0.00401  0.00522  0.00489  2.66E-04
UF5       0.01283  0.01511  0.01377  5.22E-03
UF6       0.00441  0.00668  0.00536  9.98E-02
UF7       0.00692  0.00882  0.00703  7.32E-03
UF8       0.04308  0.07640  0.0522   3.03E-02
UF9       0.02681  0.03985  0.0297   4.53E-03
UF10      0.06112  0.08321  0.0765   8.24E-03

Fig. 2. Final PF set on UF1.
Fig. 3. Final PF set on UF2.
Fig. 4. Final PF set on UF3.
Fig. 5. Final PF set on UF4.
Fig. 6. Final PF set on UF5.
Fig. 7. Final PF set on UF6.
Fig. 8. Final PF set on UF7.
Fig. 9. Final PF set on UF8.
Fig. 10. Final PF set on UF9.
Fig. 11. Final PF set on UF10.

4 Conclusion

In this work we introduced a new algorithm for solving multi-objective optimization problems, namely the Multi-Objective Relative Clustering Genetic Algorithm (MO-RCGA). Inspired by the concepts of kin selection and genetic relatedness, the proposed algorithm iteratively clusters the individuals in the population into different families, based on their level of kinship (i.e., parents, offspring and half-siblings). Selection of individuals for reproduction is thus performed at both the intra-family and the inter-family level. Since individuals within a family have some similar genes, and hence are closer to each other in the search space than individuals from different families, selecting intra-family individuals favors local search, while selecting inter-family individuals promotes global search. An adaptive scheme is presented which balances the two levels of selection during the different stages of the optimization process, thus seeking a good trade-off between exploration and exploitation. The performance of MO-RCGA is assessed in comparison with 15 state-of-the-art multi-objective optimization algorithms on ten benchmark functions from the CEC 2009 testbed. Numerical results, expressed in terms of Inverted Generational Distance, show that the proposed algorithm is extremely competitive on all the different functions and against all the considered competing algorithms.

In our future research, we will extend this study to a larger experimental setup, possibly including real-world applications. Also, we will include in the comparison alternative adaptive mechanisms, such as the ensemble of neighbourhood sizes proposed in [24], and we will try to apply the proposed adaptive selection scheme to single-objective optimization. Finally, from an algorithmic point of view, we will try to improve upon the current implementation of MO-RCGA, for example by introducing into the algorithm a two-phase scheme including an efficient design of the initial population.

Acknowledgements

INCAS3 is co-funded by the Province of Drenthe, the Municipality of Assen, the European Fund for Regional Development and the Ministry of Economic Affairs, Peaks in the Delta.

References

1. Akbari, R., Ziarati, K.: Multi-Objective bee swarm optimization. International Journal of Innovative Computing Information and Control 8(1B), 715–726 (2012)

2. Chen, C.M., Chen, Y.P., Zhang, Q.: Enhancing MOEA/D with guided mutation and priority update for multi-objective optimization. In: IEEE Congress on Evolutionary Computation. pp. 209–216 (2009)

3. Eshelman, L.J.: The CHC Adaptive Search Algorithm: How to Have Safe Search When Engaging in Nontraditional Genetic Recombination. Foundations of Genetic Algorithms pp. 265–283 (1991)

4. Gao, H., Zhong, W.: Multiobjective Optimization Using Clustering Based Two Phase Particle Swarm Optimization. In: International Conference on Natural Computation. vol. 6, pp. 520–524 (2008)

5. Gao, S., Zeng, S., Xiao, B., Zhang, L., Shi, Y., Tian, X., Yang, Y., Long, H., Yang, X., Yu, D., Yan, Z.: An orthogonal multi-objective evolutionary algorithm with lower-dimensional crossover. In: IEEE Congress on Evolutionary Computation. pp. 1959–1964 (2009)

6. Gong, M., Cheng, G., Jiao, L., Liu, C.: Clustering-based selection for evolutionary multi-objective optimization. In: IEEE International Conference on Intelligent Computing and Intelligent Systems (2009)

7. Huang, V.L., Zhao, S.Z., Mallipeddi, R., Suganthan, P.N.: Multi-objective optimization using self-adaptive differential evolution algorithm. In: IEEE Congress on Evolutionary Computation. pp. 190–194 (2009)

8. Krebs, J.R., Davies, N.B.: An Introduction to Behavioural Ecology. Blackwell Publishing, Inc. (1993)

9. Kukkonen, S., Lampinen, J.: Performance assessment of Generalized Differential Evolution 3 with a given set of constrained multi-objective test problems. In: IEEE Congress on Evolutionary Computation. pp. 1943–1950 (2009)

10. Liu, H.L., Li, X.: The multiobjective evolutionary algorithm based on determined weight and sub-regional search. In: IEEE Congress on Evolutionary Computation. pp. 1928–1934 (2009)

11. Liu, M., Zou, X., Chen, Y., Wu, Z.: Performance assessment of DMOEA-DD with CEC 2009 MOEA competition test instances. In: IEEE Congress on Evolutionary Computation. pp. 2913–2918 (2009)

12. Moubayed, N.A., Petrovski, A., McCall, J.A.W.: Clustering-Based Leaders' Selection in Multi-Objective Particle Swarm Optimisation. In: Yin, H., Wang, W., Rayward-Smith, V.J. (eds.) Intelligent Data Engineering and Automated Learning. Lecture Notes in Computer Science, vol. 6936, pp. 100–107. Springer (2011)

13. Qu, B.Y., Suganthan, P.N.: Multi-objective evolutionary programming without non-domination sorting is up to twenty times faster. In: IEEE Congress on Evolutionary Computation. pp. 2934–2939 (2009)

14. Rao, V., Patel, V.: Comparative performance of an elitist teaching-learning-based optimization algorithm for solving unconstrained optimization problems. International Journal of Industrial Engineering Computations 4(1), 29–50 (2013)

15. Sindhya, K., Sinha, A., Deb, K., Miettinen, K.: Local search based evolutionary multi-objective optimization algorithm for constrained and unconstrained problems. In: IEEE Congress on Evolutionary Computation. pp. 2919–2926 (2009)

16. Tiwari, S., Fadel, G., Koch, P., Deb, K.: Performance assessment of the hybrid Archive-based Micro Genetic Algorithm (AMGA) on the CEC09 test problems. In: IEEE Congress on Evolutionary Computation. pp. 1935–1942 (2009)

17. Tsang, W.W.P., Lau, H.Y.K.: Clustering-Based Multi-objective Immune Optimization Evolutionary Algorithm. In: Proceedings of the 11th International Conference on Artificial Immune Systems. pp. 72–85. ICARIS'12, Springer-Verlag, Berlin, Heidelberg (2012)

18. Tseng, L.Y., Chen, C.: Multiple trajectory search for unconstrained/constrained multi-objective optimization. In: IEEE Congress on Evolutionary Computation. pp. 1951–1958 (2009)

19. Wang, Y., Dang, C., Li, H., Han, L., Wei, J.: A clustering multi-objective evolutionary algorithm based on orthogonal and uniform design. In: IEEE Congress on Evolutionary Computation. pp. 2927–2933 (2009)

20. Yang, S., Ong, Y.S., Jin, Y. (eds.): Evolutionary Computation in Dynamic and Uncertain Environments, Studies in Computational Intelligence, vol. 51. Springer (2007)

21. Zamuda, A., Brest, J., Boskovic, B., Zumer, V.: Differential Evolution with Self-adaptation and Local Search for Constrained Multiobjective Optimization. In: IEEE Congress on Evolutionary Computation. pp. 195–202 (2009)

22. Zhang, Q., Liu, W., Li, H.: The performance of a new version of MOEA/D on CEC09 unconstrained MOP test instances. In: IEEE Congress on Evolutionary Computation. pp. 203–208 (2009)

23. Zhang, Q., Zhao, A., Suganthan, P.N., Liu, W., Tiwari, S.: Multi-objective optimization test instances for the CEC 2009 special session and competition. Tech. Rep. CES487, University of Essex and Nanyang Technological University (2008)

24. Zhao, S.Z., Suganthan, P.N., Zhang, Q.: Decomposition-Based Multiobjective Evolutionary Algorithm With an Ensemble of Neighborhood Sizes. IEEE Transactions on Evolutionary Computation 16(3), 442–446 (2012)

25. Zhou, A., Qu, B.Y., Li, H., Zhao, S.Z., Suganthan, P.N., Zhang, Q.: Multiobjective evolutionary algorithms: A survey of the state of the art. Swarm and Evolutionary Computation 1(1), 32–49 (2011)

Table 3. Comparison of mean IGD and standard deviation (SD) obtained by MO-RCGA and 15 competing algorithms on functions UF1-UF10 from the CEC 2009 benchmark (30 runs).

Algorithm        IGD   UF1      UF2      UF3      UF4      UF5      UF6      UF7      UF8      UF9      UF10
MO-RCGA          Mean  0.00419  0.00443  0.00491  0.00489  0.01377  0.00536  0.00703  0.0522   0.0297   0.0765
                 SD    5.22E-04 1.89E-04 8.12E-03 2.66E-04 5.22E-03 9.9E-02  7.32E-03 3.03E-02 4.53E-03 8.24E-03
MO-ITLBO         Mean  0.00421  0.00519  0.04681  0.04378  0.07482  0.01144  0.04127  0.06126  0.12379  0.14714
                 SD    8.04E-04 1.73E-03 6.48E-03 1.07E-02 8.62E-03 1.01E-02 2.38E-02 1.65E-03 8.97E-02 1.29E-02
MOABC            Mean  0.00618  0.00484  0.0512   0.05801  0.07775  0.06537  0.05573  0.06726  0.0615   0.19499
                 SD    NA       NA       NA       NA       NA       NA       NA       NA       NA       NA
MTS              Mean  0.0066   0.00615  0.0531   0.02356  0.01489  0.05917  0.04079  0.11251  0.11442  0.15306
                 SD    3.49E-04 5.08E-04 1.17E-02 6.64E-04 3.28E-03 1.06E-02 1.44E-02 1.29E-02 2.55E-02 1.58E-02
DMOEADD          Mean  0.01038  0.00679  0.03337  0.4268   0.31454  0.06673  0.01032  0.06841  0.04896  0.32211
                 SD    2.37E-03 2.02E-03 5.68E-03 1.39E-03 4.66E-02 1.03E-02 9.46E-03 9.12E-03 2.23E-02 2.86E-01
LiuLi Algorithm  Mean  0.00785  0.0123   0.01497  0.0435   0.16186  0.17555  0.0073   0.08235  0.09391  0.44691
                 SD    2.09E-03 3.32E-03 2.4E-02  6.5E-04  2.82E-02 8.29E-02 8.9E-04  7.33E-03 4.71E-02 1.3E-01
GDE3             Mean  0.00534  0.01195  0.10639  0.0265   0.03928  0.25091  0.02522  0.24855  0.08248  0.43326
                 SD    3.42E-04 1.54E-03 1.29E-02 3.72E-04 3.95E-03 1.96E-02 8.89E-03 3.55E-02 2.25E-02 1.23E-02
MOEAD            Mean  0.00435  0.00679  0.00742  0.06385  0.18071  0.00587  0.00444  0.0584   0.07896  0.47415
                 SD    2.90E-04 1.82E-03 5.89E-03 5.34E-03 6.81E-02 1.71E-03 1.17E-03 3.21E-03 5.32E-02 7.36E-02
MOEADGM          Mean  0.0062   0.0064   0.0429   0.476    1.7919   0.5563   0.0076   0.2446   0.1878   0.5646
                 SD    1.13E-03 4.3E-04  3.41E-02 2.22E-03 5.12E-01 1.47E-01 9.4E-04  8.54E-02 2.87E-02 1.02E-01
NSGAIILS         Mean  0.01153  0.01237  0.10603  0.0584   0.5657   0.31032  0.02132  0.0863   0.0719   0.84468
                 SD    7.3E-03  9.11E-03 6.86E-02 5.12E-03 1.83E-01 1.91E-01 1.95E-02 1.24E-02 4.5E-02  1.63E-01
OW-MOSaDE        Mean  0.0122   0.0081   0.103    0.0513   0.4303   0.1918   0.0585   0.0945   0.0983   0.743
                 SD    1.2E-03  2.3E-03  1.9E-02  1.9E-03  1.74E-02 2.9E-02  2.91E-02 1.19E-02 2.44E-02 8.85E-02
Clustering MOEA  Mean  0.0299   0.0228   0.0549   0.0585   0.2473   0.0871   0.0223   0.2383   0.2934   0.4111
                 SD    3.3E-03  2.3E-03  1.47E-02 2.7E-03  3.84E-02 5.7E-03  2.0E-03  2.3E-02  7.81E-02 5.01E-02
AMGA             Mean  0.03588  0.01623  0.06998  0.04062  0.09405  0.12942  0.05707  0.17125  0.18861  0.32418
                 SD    1.03E-02 3.17E-03 1.4E-02  1.75E-03 1.21E-02 5.66E-02 6.53E-02 1.72E-02 4.21E-02 9.57E-02
MOEP             Mean  0.0596   0.0189   0.099    0.0427   0.2245   0.1031   0.0197   0.423    0.342    0.3621
                 SD    1.2E-02  3.8E-03  1.32E-02 8.35E-04 3.44E-02 3.45E-02 7.51E-04 5.65E-02 1.58E-01 4.44E-02
DECMOSA-SQP      Mean  0.07702  0.02834  0.0935   0.03392  0.16713  0.12604  0.02416  0.21583  0.14111  0.36985
                 SD    3.94E-02 3.13E-02 1.98E-01 5.37E-03 8.95E-02 5.62E-01 2.23E-02 1.21E-01 3.45E-01 6.53E-01
OMOEAII          Mean  0.08564  0.03057  0.27141  0.04624  0.1692   0.07338  0.03354  0.192    0.23179  0.62754
                 SD    4.07E-03 1.61E-03 3.76E-02 9.67E-04 3.9E-03  2.45E-03 1.74E-03 1.23E-02 6.48E-02 1.46E-01