Gradient-descent-based learning in memristive crossbar arrays

Manu V Nair, Piotr Dudek

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    This paper describes techniques to implement gradient-descent-based machine learning algorithms on crossbar arrays made of memristors or other analog memory devices. We introduce the Unregulated Step Descent (USD) algorithm, an approximation of the steepest descent algorithm, and discuss how it addresses various hardware implementation issues. We examine the effect of device parameters and their variability on the performance of the algorithm using artificially generated and real-world datasets. In addition to providing insights into the effect of device parameters on learning, we illustrate how the USD algorithm partially offsets the effect of device variability. Finally, we discuss how the USD algorithm can be implemented in crossbar arrays using a simple 4-phase training scheme. The method allows parallel updates of crossbar memory elements and significantly reduces the hardware cost and complexity of the training architecture.
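
    The abstract gives only a high-level description of the USD update rule. Read as a fixed-magnitude, sign-based approximation of steepest descent, the Python sketch below contrasts such an "unregulated" update with conventional gradient descent on a toy classification task, with a crude per-device step-size spread standing in for device variability. The dataset, network, variability model, and all names are illustrative assumptions, not the paper's implementation.

    # Hypothetical sketch of a sign-based ("unregulated") weight update on a
    # simulated crossbar, compared with standard gradient descent. The exact
    # update rule, the variability model, and all names are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linearly separable dataset: 2 inputs plus a bias column, 1 output.
    X = rng.uniform(-1, 1, size=(200, 2))
    X = np.hstack([X, np.ones((200, 1))])
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train(step_rule, delta=0.01, epochs=100):
        """Train a single-layer perceptron whose weights stand in for crossbar conductances.

        step_rule='gd'  : conventional gradient descent (step scaled by the gradient).
        step_rule='usd' : fixed-magnitude step along the sign of the gradient, the
                          kind of unregulated update assumed here for analog devices.
        """
        w = rng.normal(0.0, 0.1, size=3)
        # Assumed model of device mismatch: each "device" has its own step size.
        device_spread = 1.0 + 0.3 * rng.standard_normal(3)
        for _ in range(epochs):
            out = sigmoid(X @ w)
            grad = X.T @ (out - y) / len(y)   # gradient of the cross-entropy loss
            if step_rule == "gd":
                w -= 10 * delta * grad
            else:  # 'usd'
                w -= delta * device_spread * np.sign(grad)
        return np.mean((sigmoid(X @ w) > 0.5) == y)

    print("gradient descent accuracy :", train("gd"))
    print("unregulated step accuracy :", train("usd"))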
    Original language: English
    Title of host publication: International Joint Conference on Neural Networks, IJCNN 2015
    Publisher: IEEE
    Number of pages: 7
    ISBN (Electronic): 978-1-4799-1960-4
    DOIs
    Publication status: Published - Jul 2015
    Event: International Joint Conference on Neural Networks, IJCNN 2015

    Conference

    Conference: International Joint Conference on Neural Networks, IJCNN 2015
