Building a family of neural networks using symmetry as a foundation

R. S. Neville, L. Zhao

    Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

    Abstract

    In order to perform a function-mapping task, a neural network needs two supporting mechanisms: input and output training vectors, and a training regime. A new approach is proposed for generating a family of neural networks that perform a set of related functions. Within a family, only one network needs to be trained to perform an input-output function-mapping task; the other networks can be derived from this trained base network without further training. The base net thus acts as a generator of the derived nets. The proposed approach builds on three mathematical foundations: (1) symmetry, for defining the relationship between functions; (2) weight transformations, for generating a family of networks; (3) the Euclidean distance function, for measuring the symmetric relationships between the related functions. The proposed approach provides a formal foundation for systematic information reuse in ANNs. ©2007 IEEE.
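    The core idea in the abstract — deriving a new network from a trained base network by a weight transformation, with no retraining — can be illustrated with a minimal sketch. This is not the authors' implementation; all names and the chosen symmetry (input negation, g(x) = f(−x)) are illustrative assumptions, with random weights standing in for a trained base net.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Base net: y = W2 @ tanh(W1 @ x + b1) + b2.
    # Random weights stand in for a network already trained on the base task.
    W1, b1 = rng.normal(size=(8, 3)), rng.normal(size=8)
    W2, b2 = rng.normal(size=(2, 8)), rng.normal(size=2)

    def base_net(x):
        return W2 @ np.tanh(W1 @ x + b1) + b2

    # Derived net for the symmetry-related function g(x) = f(-x):
    # since W1 @ (-x) + b1 == (-W1) @ x + b1, negating the first-layer
    # weights realises the symmetry exactly -- no training is required.
    def derived_net(x):
        return W2 @ np.tanh((-W1) @ x + b1) + b2

    x = rng.normal(size=3)
    assert np.allclose(derived_net(x), base_net(-x))  # symmetry holds exactly
    ```

    Other symmetries admit analogous transformations; for example, output negation g(x) = −f(x) corresponds to negating the output-layer weights and bias.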
    Original language: English
    Title of host publication: IEEE International Conference on Neural Networks - Conference Proceedings
    Publisher: IEEE Computer Society
    Pages: 7-12
    Number of pages: 5
    ISBN (Print): 142441380X, 9781424413805
    DOIs
    Publication status: Published - 2007
    Event: 2007 International Joint Conference on Neural Networks, IJCNN 2007 - Orlando, FL
    Duration: 1 Jul 2007 → …


    Keywords

    • Computer Science, Artificial Intelligence
    • Computer Science, Software
    • Engineering

