Two-Stage Orthogonal Least Squares Methods for Neural Network Construction

Long Zhang, Kang Li, Er-Wei Bai, George W. Irwin

Research output: Contribution to journal › Article › peer-review

Abstract

A number of neural networks can be formulated as linear-in-the-parameters models. Training such networks can then be transformed into a model selection problem, in which a compact model is selected from all the candidates using subset selection algorithms. Forward selection methods are popular fast subset selection approaches; however, they may produce suboptimal models and can become trapped in a local minimum. More recently, a two-stage fast recursive algorithm (TSFRA) combining forward selection and backward model refinement has been proposed to improve the compactness and generalization performance of the model. This paper proposes unified two-stage orthogonal least squares methods instead of the fast recursive-based methods. In contrast to the TSFRA, this paper derives a new simplified relationship between the forward and backward stages that exploits the inherent orthogonal properties of the least squares methods to avoid repetitive computations. Furthermore, a new term-exchanging scheme for backward model refinement is introduced to reduce the computational demand. Finally, given the error reduction ratio criterion, effective and efficient forward and backward subset selection procedures are proposed. Extensive examples demonstrate the improved compactness of the models constructed by the proposed technique in comparison with some popular methods.
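As a rough illustration of the two-stage idea outlined above, the sketch below implements classical forward orthogonal least squares selection with the error reduction ratio (ERR) criterion, followed by a naive brute-force backward term-exchange pass. This is not the authors' algorithm: the paper's contribution is precisely the recursive relationships that avoid the repeated orthogonalizations and refits performed here. The function names (`forward_ols_err`, `backward_refine`) and the toy data are illustrative assumptions.

```python
import numpy as np

def forward_ols_err(P, y, n_terms):
    """Greedy forward selection by error reduction ratio (ERR).

    P : (N, M) candidate regressor matrix; y : (N,) target vector.
    Returns the indices of the selected columns, in selection order.
    """
    N, M = P.shape
    selected, Q = [], []           # chosen indices and their orthogonal basis
    yy = float(y @ y)
    for _ in range(n_terms):
        best_err, best_j, best_w = -np.inf, None, None
        for j in range(M):
            if j in selected:
                continue
            w = P[:, j].astype(float).copy()
            for q in Q:            # Gram-Schmidt against the selected basis
                w -= (q @ w) / (q @ q) * q
            ww = float(w @ w)
            if ww < 1e-12:         # column is (numerically) dependent; skip
                continue
            g = float(w @ y) / ww
            err = g * g * ww / yy  # fraction of output energy explained
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        if best_j is None:
            break
        selected.append(best_j)
        Q.append(best_w)
    return selected

def backward_refine(P, y, selected):
    """One-pass backward refinement by brute-force term exchange: try to
    replace each selected term with each unselected candidate, keeping a
    swap whenever it lowers the residual sum of squares. The paper derives
    recursive updates that make such exchanges cheap; this refit-everything
    version only illustrates the idea."""
    def rss(idx):
        X = P[:, idx]
        theta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ theta
        return float(r @ r)

    selected = list(selected)
    best = rss(selected)
    for i in range(len(selected)):
        for j in range(P.shape[1]):
            if j in selected:
                continue
            trial = selected.copy()
            trial[i] = j
            t = rss(trial)
            if t < best - 1e-12:
                best, selected = t, trial
                break              # accept the swap, move to the next position
    return selected

# Toy usage: recover a sparse model from 20 random candidate regressors.
rng = np.random.default_rng(0)
P = rng.standard_normal((200, 20))
y = 2.0 * P[:, 3] - 1.5 * P[:, 11] + 0.01 * rng.standard_normal(200)
idx = backward_refine(P, y, forward_ols_err(P, y, n_terms=2))
print(sorted(idx))                 # expected: [3, 11]
```

The brute-force exchange pass refits the full least squares problem for every candidate swap; the orthogonal decomposition maintained in the forward stage is what the paper reuses to make the backward stage inexpensive.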
Original language: English
Pages (from-to): 1608-1621
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs
Publication status: Published - 1 Aug 2014

Keywords

  • Neural networks, Forward selection, Backward refinement, Orthogonal least squares
