Abstract
Gradient-search-based neural network training algorithms may suffer from local optima, poor generalization, and slow convergence. In this study, a novel Memetic Algorithm based hybrid method that integrates extremal optimization (EO) and Levenberg-Marquardt (LM) gradient search is proposed to train multilayer perceptron (MLP) networks. Inheriting the advantages of both approaches, the proposed EO-LM method can avoid local minima and improve the learning performance of MLP networks in terms of generalization capability and computational efficiency. Experimental tests on two benchmark problems and an application example, the end-point prediction of a basic oxygen furnace in steelmaking, show the effectiveness of the proposed EO-LM algorithm.
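The abstract's core idea, alternating local Levenberg-Marquardt refinement with global extremal-optimization escape moves, can be sketched roughly as below. This is an illustrative toy, not the paper's exact algorithm: the XOR task, network size, per-weight "fitness" definition (gradient magnitude), finite-difference Jacobian, and acceptance schedule are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 1-hidden-layer MLP on XOR (toy stand-in for the paper's benchmarks).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0.0, 1.0, 1.0, 0.0])
H = 4                                  # hidden units (assumption)
n_w = 2 * H + H + H + 1                # W1 + b1 + W2 + b2, flattened

def unpack(w):
    i = 0
    W1 = w[i:i + 2 * H].reshape(2, H); i += 2 * H
    b1 = w[i:i + H]; i += H
    W2 = w[i:i + H]; i += H
    return W1, b1, W2, w[i]

def residuals(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2 - y

def jacobian(w, eps=1e-6):
    # Finite-difference Jacobian of the residuals (LM implementations
    # normally use backprop-derived Jacobians; numeric is fine here).
    J = np.empty((len(y), len(w)))
    r0 = residuals(w)
    for k in range(len(w)):
        wp = w.copy(); wp[k] += eps
        J[:, k] = (residuals(wp) - r0) / eps
    return J

def lm_step(w, mu=1e-2):
    # One Levenberg-Marquardt update: solve (J'J + mu*I) dw = -J'r.
    r, J = residuals(w), jacobian(w)
    dw = np.linalg.solve(J.T @ J + mu * np.eye(len(w)), -J.T @ r)
    return w + dw

def eo_perturb(w, tau=1.5):
    # Extremal-optimization-style move: rank weights by a local "fitness"
    # (here, magnitude of the error gradient -- an assumption), select one
    # via a power law over ranks, and mutate it.
    g = jacobian(w).T @ residuals(w)           # gradient of 0.5*||r||^2
    order = np.argsort(-np.abs(g))             # "worst" weights first
    ranks = np.arange(1, len(w) + 1, dtype=float)
    p = ranks ** (-tau); p /= p.sum()
    k = order[rng.choice(len(w), p=p)]
    w = w.copy()
    w[k] += rng.normal(scale=0.5)
    return w

def train(iters=200):
    w = rng.normal(scale=0.5, size=n_w)
    best_w, best_e = w, np.sum(residuals(w) ** 2)
    for _ in range(iters):
        w_new = lm_step(w)
        e_new = np.sum(residuals(w_new) ** 2)
        if e_new < best_e - 1e-12:             # LM made progress: keep it
            w, best_w, best_e = w_new, w_new, e_new
        else:                                  # stagnated: EO escape move
            w = eo_perturb(best_w)
    return best_w, best_e

w, err = train()
print("final SSE:", err)
```

The hybrid structure mirrors the memetic pattern the abstract describes: LM provides fast local convergence, while the power-law EO mutation of poorly-performing weights offers a mechanism for escaping local minima.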
Original language | English |
---|---|
Pages (from-to) | 622-631 |
Number of pages | 9 |
Journal | International Journal of Computational Intelligence Systems |
Volume | 3 |
Issue number | 5 |
Publication status | Published - Oct 2010 |
Keywords
- Levenberg-Marquardt (LM) gradient search
- Back propagation
- Extremal optimization
- Memetic Algorithms
- Supervised learning