Abstract
A Bayesian self-organising map (BSOM) is proposed for learning mixtures of Gaussian distributions. It is derived naturally from minimising the Kullback-Leibler divergence between the data density and the neural model. The inferred posterior probabilities of the neurons replace the common Euclidean-distance winning rule and explicitly define the neighbourhood function. Learning can be confined to a small but fixed neighbourhood of the winner. The BSOM in turn provides insight into the role of the neighbourhood function used in the common SOM. A formal comparison between the BSOM and the expectation-maximisation (EM) algorithm is also presented, together with experimental results.
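The abstract describes the core mechanism: neuron posteriors, rather than Euclidean distance, select the winner and weight the parameter updates. The following is a minimal sketch of such a posterior-weighted update scheme for an isotropic Gaussian mixture, assuming a simple stochastic learning rule; the toy data, learning rate `eta`, and all names are illustrative assumptions rather than the paper's algorithm, and the full BSOM would additionally restrict updates to a small fixed neighbourhood of the winner.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated Gaussian clusters in 2-D (illustrative only).
X = np.vstack([
    rng.normal(loc=[-2.0, 0.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[+2.0, 0.0], scale=0.5, size=(200, 2)),
])
rng.shuffle(X)

K, d = 4, X.shape[1]                     # number of neurons, data dimension
mu = rng.normal(scale=2.0, size=(K, d))  # neuron weights = component means
var = np.ones(K)                         # isotropic component variances
pi = np.full(K, 1.0 / K)                 # mixing weights (priors)

def posteriors(x):
    """Posterior probability of each neuron given x, via Bayes' rule."""
    sq = np.sum((x - mu) ** 2, axis=1)
    log_p = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - sq / (2 * var)
    log_p -= log_p.max()                 # for numerical stability
    p = np.exp(log_p)
    return p / p.sum()

eta = 0.05                               # learning rate (assumed schedule)
for epoch in range(20):
    for x in X:
        h = posteriors(x)                # posteriors act as the neighbourhood
        winner = np.argmax(h)            # Bayesian winning rule, not Euclidean
        # Stochastic updates weighted by the posteriors:
        mu += eta * h[:, None] * (x - mu)
        var += eta * h * (np.sum((x - mu) ** 2, axis=1) / d - var)
        pi += eta * (h - pi)
        pi /= pi.sum()                   # keep mixing weights on the simplex

print("learned means:\n", mu.round(2))
```

Because the posterior `h` concentrates on a few neurons once the model fits the data, updating only the winner and its nearby neurons approximates the full update, which is the sense in which learning can be confined to a small fixed neighbourhood.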
| Original language | English |
| --- | --- |
| Pages (from-to) | 234-240 |
| Number of pages | 6 |
| Journal | IEE Proceedings: Vision, Image and Signal Processing |
| Volume | 148 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - Aug 2001 |