TY - GEN
T1 - Exploiting second order information in computational multi-objective evolutionary optimization
AU - Shukla, Pradyumn Kumar
N1 - Copyright:
Copyright 2020 Elsevier B.V., All rights reserved.
PY - 2007
Y1 - 2007
N2 - Evolutionary algorithms are efficient population-based algorithms for solving multi-objective optimization problems. Recently, various authors have discussed the efficacy of combining gradient-based classical methods with evolutionary algorithms, since gradient information leads to convergence to Pareto-optimal solutions at a linear rate. However, none of the existing studies has explored how to exploit second-order, or Hessian, information in evolutionary multi-objective algorithms. Second-order information, though costly, leads to quadratic convergence to Pareto-optimal solutions. In this paper, we take Levenberg-Marquardt methods from classical optimization and present two possible hybrid algorithms. These algorithms require gradient and Hessian information, which is obtained using finite-difference techniques. Computational studies on a number of test problems of varying complexity demonstrate the efficiency of the resulting hybrid algorithms in solving a large class of complex multi-objective optimization problems.
AB - Evolutionary algorithms are efficient population-based algorithms for solving multi-objective optimization problems. Recently, various authors have discussed the efficacy of combining gradient-based classical methods with evolutionary algorithms, since gradient information leads to convergence to Pareto-optimal solutions at a linear rate. However, none of the existing studies has explored how to exploit second-order, or Hessian, information in evolutionary multi-objective algorithms. Second-order information, though costly, leads to quadratic convergence to Pareto-optimal solutions. In this paper, we take Levenberg-Marquardt methods from classical optimization and present two possible hybrid algorithms. These algorithms require gradient and Hessian information, which is obtained using finite-difference techniques. Computational studies on a number of test problems of varying complexity demonstrate the efficiency of the resulting hybrid algorithms in solving a large class of complex multi-objective optimization problems.
KW - Local search
KW - Hybrid algorithm
KW - Order information
KW - Steepest descent direction
KW - Inverted generational distance
UR - http://www.scopus.com/inward/record.url?scp=38349028646&partnerID=8YFLogxK
U2 - 10.1007/978-3-540-77002-2_23
DO - 10.1007/978-3-540-77002-2_23
M3 - Conference contribution
AN - SCOPUS:38349028646
SN - 9783540770008
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 271
EP - 282
BT - Progress in Artificial Intelligence - 13th Portuguese Conference on Artificial Intelligence, EPIA 2007 Workshops
PB - Springer-Verlag Italia
T2 - 13th Portuguese Conference on Artificial Intelligence, EPIA 2007 Workshops
Y2 - 3 December 2007 through 7 December 2007
ER -