Abstract
Humans tend to organize their knowledge into hierarchies, because searching is efficient when it proceeds downward through tree-like structures. Similarly, many autonomous robots contain some form of hierarchical knowledge, which they may learn from their experiences through interaction with human users. However, it is difficult for robots and humans to find common ground in low-level experiences. Their interaction must therefore take place at the semantic level rather than at the perceptual level, and robots need to organize their perceptual experiences into hierarchies by themselves. This paper presents an unsupervised method for building view-based perceptual hierarchies using hierarchical Nearest Neighbor Graphs (hNNGs), which combine the most attractive features of Nearest Neighbor Graphs (NNGs) and self-balancing trees. An incremental construction algorithm is developed to build and maintain the perceptual hierarchies. The paper describes the details of the data representations and the algorithms of hNNGs.
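The abstract only sketches the idea of a hierarchical Nearest Neighbor Graph. As an illustration of how such a structure might support incremental insertion and coarse-to-fine search, here is a minimal Python sketch. This is not the authors' algorithm: the skip-list-style random promotion of points to upper levels and the brute-force k-NN linking at each level are assumptions made purely for illustration.

```python
import math
import random


class HNNG:
    """Illustrative hierarchical Nearest Neighbor Graph (not the paper's algorithm).

    Each level is a flat NNG over a subset of the points; upper levels are
    sparser. A point promoted to level i also appears in all levels below i,
    so a search can descend level by level toward the query.
    """

    def __init__(self, k=3, promote_p=0.5, seed=0):
        self.k = k                      # neighbours linked per point per level
        self.promote_p = promote_p      # chance of promotion to the next level
        self.rng = random.Random(seed)
        self.levels = []                # levels[i]: dict point -> set(neighbours)

    @staticmethod
    def _dist(a, b):
        return math.dist(a, b)

    def insert(self, point):
        """Incrementally add a point, linking it at every level it occupies."""
        height = 1
        while self.rng.random() < self.promote_p:   # geometric level choice
            height += 1
        while len(self.levels) < height:
            self.levels.append({})
        for lvl in range(height):
            existing = list(self.levels[lvl])
            self.levels[lvl][point] = set()
            # Link to the k nearest existing points at this level (brute force
            # here; an incremental algorithm would reuse the graph itself).
            for nb in sorted(existing, key=lambda p: self._dist(p, point))[: self.k]:
                self.levels[lvl][point].add(nb)
                self.levels[lvl][nb].add(point)

    def _greedy(self, level, start, query):
        """Greedy walk within one level toward the query."""
        current = start
        while True:
            best = min(self.levels[level][current],
                       key=lambda n: self._dist(n, query), default=None)
            if best is None or self._dist(best, query) >= self._dist(current, query):
                return current
            current = best

    def nearest(self, query):
        """Coarse-to-fine search: descend from the sparsest populated level."""
        top = max(l for l, g in enumerate(self.levels) if g)
        entry = next(iter(self.levels[top]))
        for lvl in range(top, -1, -1):
            entry = self._greedy(lvl, entry, query)
        return entry
```

A small usage example: inserting a handful of 2-D views and querying for the closest stored view. The design mirrors the abstract's motivation — upper levels keep searches short, while the incremental `insert` maintains the hierarchy as new perceptual experiences arrive.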
| Original language | English |
|---|---|
| Title of host publication | International Conference on Neural Information Processing ICONIP 2015 |
| Subtitle of host publication | Neural Information Processing |
| Publisher | Springer Nature |
| Pages | 646-655 |
| Number of pages | 10 |
| ISBN (Electronic) | 978-3-319-26535-3 |
| ISBN (Print) | 978-3-319-26534-6 |
| DOIs | |
| Publication status | E-pub ahead of print - 10 Nov 2015 |