Investigating distortions in perceptual stability during different self-movements using Virtual Reality

Paul A. Warren, Graham Bell, Yu Li

Research output: Contribution to journal › Article › peer-review

Abstract

Using immersive Virtual Reality (the HTC Vive head-mounted display), we measured both bias and sensitivity in judgements about the scene stability of a target object during both active (self-propelled) and passive (experimenter-propelled) observer movements. This was repeated in the same group of 16 participants for three different observer-target movement conditions in which the instability of the target was yoked to the movement of the observer. We found that, in all movement conditions, the target needed to move with (i.e., in the same direction as) the participant to be perceived as scene-stable. Consistent with the presence of additional available information (efference copy) about self-movement during active conditions, biases were smaller and sensitivities to instability were higher in active relative to passive conditions. However, the presence of efference copy was clearly not sufficient to eliminate the bias completely, and we suggest that the presence of additional visual information about self-movement is also critical. We found some (albeit limited) evidence for correlation between appropriate metrics across the different movement conditions. These results extend previous findings, providing evidence for consistency of biases across different movement types, suggestive of common processing underpinning perceptual stability judgements.
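The yoking manipulation and the bias/sensitivity measures described in the abstract can be illustrated with a short sketch. Everything below is an assumption for illustration, not the authors' code: the gain convention (target displacement proportional to head displacement), the simulated responses, and the reading of bias as the point of subjective stability (PSE) and sensitivity as the inverse slope of a fitted cumulative Gaussian.

    # Minimal sketch (Python) of (1) yoking target instability to observer
    # movement via a gain, and (2) estimating bias (PSE) and sensitivity
    # (slope) from a psychometric fit. Conventions and data are assumed.
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import curve_fit

    def yoked_target_position(target_pos, head_delta, gain):
        """Displace the target by `gain` times the observer's head displacement.
        gain = 0  -> target is physically scene-stable
        gain > 0  -> target moves with the observer
        gain < 0  -> target moves against the observer
        """
        return target_pos + gain * head_delta

    # Simulated judgements: proportion of "moved with me" responses per gain.
    gains = np.linspace(-0.4, 0.4, 9)
    true_pse, true_sigma = 0.1, 0.15   # assumed bias toward "with" motion
    n_trials = 40
    rng = np.random.default_rng(0)
    n_with = rng.binomial(n_trials, norm.cdf(gains, true_pse, true_sigma))

    # Cumulative-Gaussian fit: mean = PSE (bias), 1/sigma ~ sensitivity.
    def psychometric(g, pse, sigma):
        return norm.cdf(g, loc=pse, scale=sigma)

    (pse_hat, sigma_hat), _ = curve_fit(
        psychometric, gains, n_with / n_trials,
        p0=[0.0, 0.1], bounds=([-1.0, 1e-3], [1.0, 1.0]))

    print(f"Estimated bias (PSE): {pse_hat:.3f} gain units")
    print(f"Estimated sensitivity (1/sigma): {1 / sigma_hat:.2f}")

Under this convention, a positive fitted PSE corresponds to the abstract's central finding: the target must move with the observer to be judged scene-stable.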
Original language: English
Journal: Perception
Early online date: 9 Aug 2022
Publication status: Published - 9 Aug 2022
