Perspective taking as virtual navigation? Perceptual simulation of what others see reflects their location in space but not their gaze

Eleanor Ward*, Giorgio Ganis, Katrina L. McDonough, Patric Bach

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

14 Citations (Scopus)

Abstract

Other people's (imagined) visual perspectives are represented perceptually in a similar way to our own, and can drive bottom-up processes in the same way as our own perceptual input (Ward, Ganis, & Bach, 2019). Here we test directly whether visual perspective taking is driven by where another person is looking, or whether these perceptual simulations represent their position in space more generally. Across two experiments, we asked participants to identify whether alphanumeric characters, presented at one of eight possible orientations away from upright, were shown normally or in their mirror-inverted form (e.g. “R” vs. “Я”). In some scenes, a person would appear sitting to the left or the right of the participant. We manipulated, either between trials (Experiment 1) or between subjects (Experiment 2), the gaze direction of the inserted person, such that they either (1) looked towards the to-be-judged item, (2) averted their gaze away from the participant, or (3) gazed out towards the participant (Exp. 2 only). In the absence of another person, we replicated the well-established mental rotation effect, where recognition of items becomes slower the more they are oriented away from upright (e.g. Shepard and Metzler, 1971). Crucially, in both experiments and in all conditions, this response pattern changed when another person was inserted into the scene. People spontaneously took the perspective of the other person and, in their presence, made faster judgements about the presented items when these were oriented closer to upright from the other person's point of view. The gaze direction of this other person did not influence these effects. We propose that visual perspective taking is therefore a general spatial-navigational ability, allowing us to calculate more easily how a scene would (in principle) look from another position in space, and that such calculations reflect the spatial location of another person, but not their gaze.

Original language: English
Article number: 104241
Number of pages: 12
Journal: Cognition
Volume: 199
Early online date: 24 Feb 2020
DOIs
Publication status: Published - Jun 2020

Bibliographical note

Acknowledgements
We thank the members of the Action Prediction Lab, Plymouth University, (www.actionprediction.org) for discussion and comment on an earlier version of this article. Eleanor Ward was funded by a PhD student grant from the University of Plymouth, United Kingdom.

Author contributions
EW and PB designed the experiment with GG and KM. EW programmed the study, prepared the stimuli, and collected all data with KM. EW, PB and GG analysed the data. EW, PB and GG wrote the manuscript.

Declaration of competing interest
The authors declare no competing interests.

Data available at https://osf.io/2cxpr/

Keywords

  • Gaze cuing
  • Mental imagery
  • Mental rotation
  • Navigation
  • Perceptual simulation
  • Visual perspective taking
  • Social cognition
  • Self
  • Agency
  • Dissociation
  • Joint attention
  • Transformations
  • fMRI
