Abstract
Model performance and convergence rate are two key measures for assessing
methods for nonlinear system identification with Radial Basis Function (RBF)
neural networks. A new extension of the Newton algorithm is proposed to
further improve both aspects by building on the recently proposed continuous
forward algorithm (CFA) and hybrid forward algorithm (HFA). A computational
complexity analysis confirms its efficiency, and numerical examples show that
it converges faster than, and can outperform, CFA and HFA.
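The abstract refers to a Newton-type update for reducing the sum squared errors of an RBF model. The sketch below is not the paper's algorithm; it is a minimal illustration, assuming a Gaussian RBF model with a single shared width whose output weights are refitted by linear least squares at each step, and with the derivatives for the Newton step approximated numerically. All identifiers (`rbf_design`, `newton_width`, the toy data) are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): damped Newton iteration on a shared
# RBF width, with output weights solved by linear least squares at each step.
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian RBF design matrix Phi[i, j] = exp(-(x_i - c_j)^2 / (2*width^2))."""
    d = x[:, None] - centers[None, :]
    return np.exp(-0.5 * (d / width) ** 2)

def sse(width, x, y, centers):
    """Sum squared errors after fitting the linear output weights for this width."""
    phi = rbf_design(x, centers, width)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    r = y - phi @ w
    return float(r @ r)

def newton_width(x, y, centers, width0=1.0, iters=20, h=1e-4):
    """Damped Newton updates on the scalar width; derivatives by central differences."""
    width = width0
    for _ in range(iters):
        f0 = sse(width, x, y, centers)
        fp = sse(width + h, x, y, centers)
        fm = sse(width - h, x, y, centers)
        grad = (fp - fm) / (2 * h)
        hess = (fp - 2 * f0 + fm) / (h * h)
        # Fall back to a small gradient step if the local curvature is not positive.
        step = -grad / hess if hess > 0 else -1e-2 * grad
        # Simple damping: halve the step until the SSE decreases.
        t = 1.0
        while sse(width + t * step, x, y, centers) > f0 and t > 1e-8:
            t *= 0.5
        width += t * step
    return width

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 200)
    y = np.sin(2 * x) + 0.05 * rng.standard_normal(x.size)  # toy identification data
    centers = np.linspace(-3, 3, 10)
    w_opt = newton_width(x, y, centers)
    print("fitted width:", w_opt, "final SSE:", sse(w_opt, x, y, centers))
```

Refitting the linear weights inside each nonlinear update mirrors the hybrid flavour of CFA/HFA-style schemes in spirit only; the paper's actual extension should be consulted for the exact update rule.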
Original language | English
---|---
Pages (from-to) | 2929-2933
Journal | IEEE Transactions on Automatic Control
Volume | 58
Issue number | 11
Publication status | Published - 18 Apr 2013
Keywords
- Newton method
- orthogonal least squares
- radial basis function (RBF)
- sum squared errors
- convergence rate