Toward joint approximate inference of visual quantities on cellular processor arrays

Julien N.P. Martel, Miguel Chau, Piotr Dudek, Matthew Cook

    Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

    Abstract

    The interacting visual maps (IVM) algorithm introduced in [1] performs joint approximate inference of several visual quantities, such as optic flow, gray-level intensity, and ego-motion, from the sparse input of a neuromorphic dynamic vision sensor (DVS). We show that features of the model, such as its intrinsic parallelism and the distributed nature of its computation, make it a natural candidate to benefit from the cellular processor array (CPA) hardware architecture. We have now implemented the IVM algorithm on a general-purpose CPA simulator, and here we present results of our simulations and demonstrate that the IVM algorithm indeed fits the CPA architecture naturally. Our work indicates that extended versions of the IVM algorithm could benefit greatly from a dedicated hardware implementation, eventually yielding a high-speed, low-power visual odometry chip.
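    To illustrate the computational pattern the abstract describes, the sketch below simulates a tiny cellular grid in which every cell holds a gray-level estimate and all cells update synchronously from sparse DVS-like events plus local 4-neighbour communication. This is a hypothetical toy, not the authors' IVM implementation or their CPA simulator; the grid size, update rule, and the `alpha`/`beta` weights are assumptions made for the example.

    ```python
    import numpy as np

    H, W = 8, 8
    intensity = np.zeros((H, W))  # per-cell gray-level estimate

    def cpa_step(intensity, events, alpha=0.5, beta=0.2):
        """One synchronous update of all cells (illustrative only).

        events: list of (row, col, polarity) triples, the sparse DVS input.
        alpha:  weight of the event-driven correction (assumed parameter).
        beta:   weight of the 4-neighbour smoothing (assumed parameter).
        """
        updated = intensity.copy()
        # Event-driven update: each sparse event nudges only its own cell.
        for r, c, pol in events:
            updated[r, c] += alpha * pol
        # Local communication: each cell mixes in the mean of its four
        # neighbours, computed with array shifts so all cells update at once,
        # mimicking the local-wiring constraint of a CPA.
        nbr = (np.roll(updated, 1, 0) + np.roll(updated, -1, 0) +
               np.roll(updated, 1, 1) + np.roll(updated, -1, 1)) / 4.0
        return (1 - beta) * updated + beta * nbr

    events = [(2, 3, +1), (2, 4, +1), (5, 5, -1)]
    state = cpa_step(intensity, events)
    ```

    The point of the sketch is that the per-cell rule touches only local state and immediate neighbours, which is why this style of algorithm maps naturally onto a processor-per-pixel CPA.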
    Original language: English
    Title of host publication: IEEE International Symposium on Circuits and Systems, ISCAS 2015
    Publisher: IEEE
    Pages: 2061-2064
    ISBN (Electronic): 978-1-4799-8391-9
    DOIs
    Publication status: Published - 2015
    Event: IEEE International Symposium on Circuits and Systems, ISCAS 2015 - Lisbon, Portugal
    Duration: 24 May 2015 – 27 May 2015
    http://www.scopus.com/inward/record.url?eid=2-s2.0-84946225388&partnerID=40&md5=58467212fe9944b83fe45c3ea4058d6b

    Conference

    Conference: IEEE International Symposium on Circuits and Systems, ISCAS 2015
    Country/Territory: Portugal
    City: Lisbon
    Period: 24/05/15 – 27/05/15
    Internet address
