Bayesian self-organising map for Gaussian mixtures

H. Yin, N. M. Allinson

    Research output: Contribution to journal › Article › peer-review

    Abstract

    A Bayesian self-organising map (BSOM) is proposed for learning mixtures of Gaussian distributions. It is derived naturally from minimising the Kullback-Leibler divergence between the data density and the neural model. The inferred posterior probabilities of the neurons replace the common Euclidean-distance winning rule and explicitly define the neighbourhood function. Learning can be confined to a small but fixed neighbourhood of the winner. The BSOM in turn provides insight into the role of neighbourhood functions used in the common SOM. A formal comparison between the BSOM and the expectation-maximisation (EM) algorithm is also presented, together with experimental results.
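The core idea in the abstract — replacing the Euclidean winner rule with posterior responsibilities computed by Bayes' rule, and using those responsibilities as the neighbourhood weights for the updates — can be illustrated with a minimal sketch. This is not the authors' implementation; it is a hypothetical one-dimensional, online Gaussian-mixture update in NumPy where each neuron carries a mean, a variance, and a mixing prior, all names and learning rates chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D data from two Gaussians (not the paper's experiments).
data = rng.permutation(np.concatenate([rng.normal(-2.0, 0.5, 200),
                                       rng.normal(2.0, 0.5, 200)]))

# Each "neuron" models one Gaussian component: mean, variance, mixing prior.
means = np.array([-1.0, 1.0])
vars_ = np.array([1.0, 1.0])
priors = np.array([0.5, 0.5])

def posteriors(x, means, vars_, priors):
    """Posterior responsibility of each neuron for sample x (Bayes' rule)."""
    lik = np.exp(-0.5 * (x - means) ** 2 / vars_) / np.sqrt(2 * np.pi * vars_)
    p = priors * lik
    return p / p.sum()

eta = 0.05  # learning rate (illustrative value)
for x in data:
    r = posteriors(x, means, vars_, priors)  # replaces the Euclidean winner rule
    means += eta * r * (x - means)           # posterior-weighted mean updates
    vars_ += eta * r * ((x - means) ** 2 - vars_)
    priors += eta * (r - priors)
    priors /= priors.sum()                   # keep mixing priors normalised
```

The posterior vector `r` plays the role of the SOM neighbourhood function: every neuron is updated in proportion to how probable it is as the generator of the sample, so no separate distance-based neighbourhood schedule is needed. The paper's formal comparison with EM reflects that these online updates are a stochastic counterpart of EM's batch responsibility-weighted re-estimation.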
    Original language: English
    Pages (from-to): 234-240
    Number of pages: 6
    Journal: IEE Proceedings: Vision, Image and Signal Processing
    Volume: 148
    Issue number: 4
    DOIs
    Publication status: Published - Aug 2001
