Emotional similarity refers to the tendency to group stimuli together according to the feelings they evoke in us. The study of emotional similarity is relevant to semantic memory research and to the overgeneralisation bias in anxiety disorders, and may have implications for psychological well-being. Most studies of similarity have focused on non-emotional stimuli; fewer have examined simple stimuli (e.g., shapes, objects) that acquired emotional value through fear conditioning. Very little is known about what makes us perceive real-life emotional experiences as similar. We assumed a similarity space of several integrated dimensions, with emotion as the most influential. We predicted that emotional stimuli would be judged as more similar to each other than neutral stimuli, because they share low valence and high arousal and are therefore more salient. We also expected this to be underpinned by higher similarity in the neural activation patterns evoked by emotional than by neutral stimuli. To test these hypotheses, we combined several similarity judgement tasks with fMRI and analysed the data using representational similarity analysis (RSA). Our results suggest an important role of emotion in similarity perception. Even though two expressions of the same person were objectively more similar to each other than the faces of two different individuals expressing the same emotion, participants judged both types of face pairs as equally similar. However, the similarity between faces expressing similar emotions was lower than between neutral faces. In addition, we found no differences in judged similarity between images of the two emotional and the two neutral categories of real-life events. These findings were replicated with stimuli that acquired emotional value through aversive conditioning, suggesting that emotion is as relevant as visual and semantic dimensions in perceived similarity.
Despite this equivalence in similarity perception, emotion was more influential in the neural similarity space, resulting in higher similarity among the neural representations of emotional compared to neutral stimuli. We observed this in brain clusters located in the ventral visual stream, underlying semantic processing and categorisation, and in regions involved in affect representation (i.e., precuneus, insula) and modulation (i.e., dorsal anterior cingulate cortex, dorsomedial prefrontal cortex). This pattern of findings suggests that emotions might trigger local (within a brain region) and distant (between brain regions) synchronisation processes, such that a stable mental representation, which encodes the "relevance" of the stimulus, emerges and is shared among emotional experiences.
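For readers unfamiliar with RSA, the core computation can be sketched in a few lines. The example below is a minimal illustration, not the thesis's actual pipeline: the activation patterns and behavioural dissimilarity ratings are randomly generated stand-ins, and the choice of correlation distance for the neural RDM and Spearman rank correlation for comparing RDMs reflects common RSA practice rather than the specific analysis reported here.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical activation patterns: 10 stimuli x 50 voxels (placeholder data)
patterns = rng.normal(size=(10, 50))

# Neural RDM: 1 - Pearson correlation between each pair of stimulus patterns,
# stored as the condensed upper triangle (10 choose 2 = 45 pairwise distances)
neural_rdm = pdist(patterns, metric="correlation")

# Hypothetical behavioural RDM from pairwise dissimilarity judgements (placeholder)
behav_rdm = rng.uniform(size=neural_rdm.shape)

# RSA: rank-correlate the two RDMs to ask whether stimuli that evoke similar
# neural patterns are also judged as similar behaviourally
rho, p_value = spearmanr(neural_rdm, behav_rdm)
print(f"Spearman rho = {rho:.3f}")
```

In a real analysis the neural RDM would be computed within a searchlight or region of interest, and the comparison would typically use a rank correlation because the two dissimilarity measures are on different scales.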
Date of Award: 1 Aug 2022
Awarding Institution: The University of Manchester
Supervisors: Gorana Pobric (Supervisor) & Deborah Talmi (Supervisor)