Extremal optimization combined with LM gradient search for MLP network learning

Peng Chen, Yong Zai Lu, Yu Wang Chen

Research output: Contribution to journal › Article › peer-review

Abstract

Gradient-search-based neural network training algorithms may suffer from local optima, poor generalization, and slow convergence. In this study, a novel Memetic Algorithm-based hybrid method integrating "extremal optimization" (EO) and "Levenberg-Marquardt" (LM) is proposed to train multilayer perceptron (MLP) networks. Inheriting the advantages of both approaches, the proposed "EO-LM" method can avoid local minima and improve MLP learning performance in both generalization capability and computational efficiency. Experimental tests on two benchmark problems and an application example, the end-point prediction of a basic oxygen furnace in steelmaking, show the effectiveness of the proposed EO-LM algorithm.
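The abstract gives only a high-level description of the hybrid scheme, so the following is a minimal sketch of the general EO-LM idea rather than the authors' exact algorithm: an extremal-optimization outer loop mutates the "worst" weight component of an MLP, and a Levenberg-Marquardt inner loop refines the result. The toy 2-2-1 network, the component-fitness definition (gradient magnitude), the mutation rule, and all hyperparameters are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in task: XOR regression with a 2-2-1 tanh MLP
# (illustrative only; not a benchmark from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    """Split the flat 9-vector into layer parameters."""
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    return W1, b1, W2, b2

def residuals(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)          # hidden layer
    return h @ W2 + b2 - y            # network output minus target

def jacobian(w, eps=1e-6):
    """Central finite-difference Jacobian of the residuals."""
    J = np.zeros((len(X), len(w)))
    for i in range(len(w)):
        d = np.zeros_like(w); d[i] = eps
        J[:, i] = (residuals(w + d) - residuals(w - d)) / (2 * eps)
    return J

def lm_refine(w, steps=50, mu=1e-2):
    """Levenberg-Marquardt local search: damped Gauss-Newton steps,
    accepted only when they reduce the sum of squared residuals."""
    for _ in range(steps):
        r, J = residuals(w), jacobian(w)
        dw = np.linalg.solve(J.T @ J + mu * np.eye(len(w)), -J.T @ r)
        if np.sum(residuals(w + dw) ** 2) < np.sum(r ** 2):
            w = w + dw; mu *= 0.5      # success: trust the model more
        else:
            mu *= 2.0                  # failure: increase damping
    return w

def eo_mutate(w):
    """EO step (assumed form): rank weight components by a local
    'fitness' -- here the per-weight gradient magnitude -- and
    randomly re-draw the worst-ranked one."""
    g = jacobian(w).T @ residuals(w)
    worst = np.argmax(np.abs(g))
    w = w.copy(); w[worst] = rng.normal(scale=1.0)
    return w

# Hybrid EO-LM loop: EO supplies global perturbations,
# LM polishes each candidate; keep the best solution seen.
w = rng.normal(scale=0.5, size=9)
best = lm_refine(w)
for _ in range(20):
    cand = lm_refine(eo_mutate(best))
    if np.sum(residuals(cand) ** 2) < np.sum(residuals(best) ** 2):
        best = cand
```

Because both the LM inner loop and the outer loop accept only improving candidates, the final error is never worse than that of the initial random weights; the EO mutations serve as targeted restarts that let the search escape the local minima plain LM would get stuck in.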
Original language: English
Pages (from-to): 622-631
Number of pages: 9
Journal: International Journal of Computational Intelligence Systems
Volume: 3
Issue number: 5
DOIs
Publication status: Published - Oct 2010

Keywords

  • "Levenberg-Marquardt" (LM) gradient search
  • Back propagation
  • Extremal optimization
  • Memetic Algorithms
  • Supervised learning
