The functional role of neural oscillations in non-verbal emotional communication

Ashley Symons, Wael El-Deredy, Michael Schwartze, Sonja Kotz

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Effective interpersonal communication depends on the ability to perceive and interpret nonverbal emotional expressions from multiple sensory modalities. Current theoretical models propose that visual and auditory emotion perception involves a network of brain regions including the primary sensory cortices, the superior temporal sulcus (STS), and the orbitofrontal cortex (OFC). However, relatively little is known about how the dynamic interplay between these regions gives rise to the perception of emotions. In recent years, there has been increasing recognition of the importance of neural oscillations in mediating neural communication within and between functional neural networks. Here we review studies investigating changes in oscillatory activity during the perception of visual, auditory, and audiovisual emotional expressions, and aim to characterize the functional role of neural oscillations in nonverbal emotion perception. Findings from the reviewed literature suggest that theta band oscillations most consistently differentiate between emotional and neutral expressions. While early theta synchronization appears to reflect the initial encoding of emotionally salient sensory information, later fronto-central theta synchronization may reflect the further integration of sensory information with internal representations. Additionally, gamma synchronization reflects facilitated sensory binding of emotional expressions within regions such as the OFC, STS, and, potentially, the amygdala. However, the evidence is more ambiguous when it comes to the role of oscillations within the alpha and beta frequencies, which vary as a function of modality (or modalities), the presence or absence of predictive information, and attentional or task demands. Thus, the synchronization of neural oscillations within specific frequency bands mediates the rapid detection, integration, and evaluation of emotional expressions. Moreover, the functional coupling of oscillatory activity across multiple frequency bands supports a predictive coding model of multisensory emotion perception in which emotional facial and body expressions facilitate the processing of emotional vocalizations.

    Original language: English
    Article number: 239
    Journal: Frontiers in Human Neuroscience
    Volume: 10
    Issue number: MAY2016
    DOIs
    Publication status: Published - 25 May 2016

    Keywords

    • Cross-modal prediction
    • Emotion
    • Multisensory
    • Neural oscillations
    • Nonverbal communication
