Towards complex features: Competitive receptive fields in unsupervised deep networks

Richard Hankins, Yao Peng, Hujun Yin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper we propose a simple unsupervised approach to learning higher-order features. The model builds on the recent success of lightweight approaches such as SOMNet and PCANet on the challenging task of image classification. In contrast to more complex deep learning models such as convolutional neural networks (CNNs), these methods use naive algorithms to model the input distribution. Our work focuses on the self-organizing map (SOM) based method and extends it by incorporating a competitive connection layer between filter learning stages. This simple addition encourages the second filter learning stage to learn complex combinations of first-layer filters and simultaneously decreases channel depth. This approach to learning complex representations offers a competitive alternative to common deep learning models whilst maintaining an efficient framework. We test our proposed approach on the popular MNIST and challenging CIFAR-10 datasets.
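The abstract only sketches the mechanism, so the following Python snippet is a minimal illustration of one plausible reading of the competitive connection layer: first-stage filter responses compete within channel groups, and only the strongest response at each spatial location is propagated, reducing channel depth before the second filter learning stage. The function name, the grouping scheme, and the maxout-style winner-take-all rule are all assumptions; the paper's actual competition rule is not given in this record.

    import numpy as np

    def competitive_connection(maps, n_groups):
        # Hypothetical winner-take-all competition over channel groups.
        # maps: (H, W, C) array of first-stage filter responses.
        # Channels are split into n_groups consecutive groups; within each
        # group only the maximal response at each location survives, so the
        # output has n_groups channels instead of C.
        h, w, c = maps.shape
        assert c % n_groups == 0, "channel count must divide evenly into groups"
        grouped = maps.reshape(h, w, n_groups, c // n_groups)
        return grouped.max(axis=-1)

    # Example: 8 first-stage SOM filter responses reduced to 2 channels.
    responses = np.random.randn(28, 28, 8)
    reduced = competitive_connection(responses, n_groups=2)
    print(reduced.shape)  # (28, 28, 2)

Collapsing each group with a max is one simple way to obtain both effects the abstract mentions: the second stage sees only the winning first-layer responses, and the channel depth drops from C to n_groups.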
Original language: English
Title of host publication: IDEAL 2018
Subtitle of host publication: International Conference on Intelligent Data Engineering and Automated Learning
Pages: 838-848
Number of pages: 11
Publication status: Published - 2018
Event: Intelligent Data Engineering and Automated Learning - Madrid, Spain
Duration: 21 Nov 2018 - 23 Nov 2018
https://aida.ii.uam.es/ideal2018/#!/main

Conference

Conference: Intelligent Data Engineering and Automated Learning
Abbreviated title: IDEAL
Country/Territory: Spain
City: Madrid
Period: 21/11/18 - 23/11/18
Internet address: https://aida.ii.uam.es/ideal2018/#!/main

Keywords

  • Self-organising maps
  • Unsupervised learning
  • Deep learning
  • Representation learning
  • Receptive fields
  • Pooling
