Single-trial decoding of bistable perception based on sparse nonnegative tensor decomposition

Hualou Liang, Zhisong Wang, Alexander Maier, Nikos K. Logothetis

    Research output: Contribution to journal › Article › peer-review

    Abstract

    The study of the neuronal correlates of the spontaneous alternation in perception elicited by bistable visual stimuli is promising for understanding the mechanisms of neural information processing and the neural basis of visual perception and perceptual decision-making. In this paper, we develop a sparse nonnegative tensor factorization (NTF)-based method to extract features from the local field potential (LFP), collected from the middle temporal (MT) visual cortex in a macaque monkey, for decoding its bistable structure-from-motion (SFM) perception. We apply the feature extraction approach to the multichannel time-frequency representation of the intracortical LFP data. The advantage of the sparse NTF-based feature extraction approach lies in its ability to yield components that are common across the space, time, and frequency domains yet discriminative across conditions, without prior knowledge of the discriminating frequency bands and temporal windows for a specific subject. We employ a support vector machine (SVM) classifier on the features of the NTF components for single-trial decoding of the reported perception. Our results suggest that although other bands also carry some discriminative information, the gamma-band feature is the most discriminative for bistable perception, and that imposing sparseness constraints on the nonnegative tensor factorization improves the extraction of this feature.
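The core computation the abstract describes is a sparse nonnegative CP (PARAFAC) decomposition of a channels × frequencies × time tensor, whose component loadings are then fed to a classifier. The following is a minimal NumPy sketch of that decomposition, not the authors' implementation: the multiplicative-update rule with an l1 penalty `lam` on the denominator, the rank, and all parameter values are illustrative assumptions.

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding: rows index `mode`, columns the remaining axes (C order)."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (J x R) and V (K x R) -> (J*K x R)."""
    J, R = U.shape
    K = V.shape[0]
    return np.einsum('jr,kr->jkr', U, V).reshape(J * K, R)

def sparse_ntf(X, rank, lam=0.05, n_iter=200, seed=0, eps=1e-9):
    """Nonnegative CP decomposition with an l1 sparsity penalty, fit by
    multiplicative updates. Returns factors [A, B, C] such that
    X[i, j, k] ~ sum_r A[i, r] * B[j, r] * C[k, r]."""
    rng = np.random.default_rng(seed)
    factors = [rng.random((dim, rank)) for dim in X.shape]
    for _ in range(n_iter):
        for n in range(3):
            others = [factors[m] for m in range(3) if m != n]
            M = khatri_rao(others[0], others[1])  # ordering matches C-order unfolding
            numer = unfold(X, n) @ M
            # lam in the denominator implements the l1 (sparseness) penalty;
            # the ratio update preserves nonnegativity of the factors
            denom = factors[n] @ (M.T @ M) + lam + eps
            factors[n] *= numer / denom
    return factors

def reconstruct(factors):
    A, B, C = factors
    return np.einsum('ir,jr,kr->ijk', A, B, C)

# Toy "channels x frequencies x time" tensor built from two nonnegative components
rng = np.random.default_rng(1)
true = [rng.random((d, 2)) for d in (4, 8, 10)]
X = reconstruct(true) + 0.01 * rng.random((4, 8, 10))

factors = sparse_ntf(X, rank=2)
err = np.linalg.norm(X - reconstruct(factors)) / np.linalg.norm(X)
```

In the decoding pipeline sketched in the abstract, the trial-wise loadings of such components (rather than raw time-frequency values) would serve as the feature vectors passed to the SVM classifier.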
    Original language: English
    Article number: 642387
    Journal: Computational Intelligence and Neuroscience
    Volume: 2008
    DOIs
    Publication status: Published - 2008
