Abstract
The learning dynamics of an on-line Hebbian ICA algorithm close to its initial conditions have been studied. For large input dimension the dynamics can be described by a diffusion equation. A surprisingly large number of examples and an unusually low initial learning rate are required to avoid a stochastic trapping state near the initial conditions. Escape from this state results in symmetry breaking, and the algorithm therefore avoids trapping in the plateau-like fixed points that have been observed in other learning algorithms. © Springer-Verlag Berlin Heidelberg 2002.
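As a rough illustration of the setting the abstract describes, below is a minimal sketch of a generic single-unit on-line Hebbian update with a small learning rate and weight normalisation. The cubic nonlinearity, the 1/N learning-rate scaling, the Gaussian placeholder inputs, and all parameter values are assumptions chosen for illustration; the paper's exact update rule is not reproduced here.

```python
import numpy as np

# Minimal sketch of a generic on-line Hebbian ICA-style update (assumed
# form, not the paper's exact rule). A single weight vector w is driven
# by one example at a time with a small learning rate eta.

rng = np.random.default_rng(0)

N = 1000          # input dimension (the large-N regime discussed above)
eta = 1e-3 / N    # low initial learning rate; 1/N scaling is an
                  # illustrative assumption
T = 200_000       # many examples; the abstract notes a surprisingly
                  # large number is needed to escape the trapping state

w = rng.standard_normal(N)
w /= np.linalg.norm(w)            # start from random initial conditions

for t in range(T):
    x = rng.standard_normal(N)    # placeholder inputs; a real ICA task
                                  # would mix non-Gaussian sources here
    y = w @ x                     # projection of the current example
    w += eta * x * y**3           # Hebbian update with a cubic
                                  # nonlinearity (illustrative choice)
    w /= np.linalg.norm(w)        # keep |w| = 1 so only the direction
                                  # of w evolves
```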
Original language | English
---|---
Title of host publication | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher | Springer Nature
Pages | 1112-1118
Number of pages | 6
Volume | 2415
ISBN (Print) | 9783540440741
Publication status | Published - 2002
Event | 2002 International Conference on Artificial Neural Networks, ICANN 2002, Madrid. Duration: 1 Jul 2002 → …
Publication series
Name | Lecture Notes in Computer Science
---|---
Conference
Conference | 2002 International Conference on Artificial Neural Networks, ICANN 2002
---|---
City | Madrid
Period | 1/07/02 → …
Internet address | http://dblp.uni-trier.de/db/conf/icann/icann2002.html#BasalygaR02 http://dblp.uni-trier.de/rec/bibtex/conf/icann/BasalygaR02.xml http://dblp.uni-trier.de/rec/bibtex/conf/icann/BasalygaR02