The goal of this study is to design and validate a novel diagnostic tool for assessing egocentric spatial ability: the ability to perceive and manipulate objects in space using a self-to-object frame of reference (e.g., imagining a change of perspective). This ability is important for successful navigation in large-scale 3D space, as well as for teleoperation, robotics, dentistry, and surgery. In contrast, allocentric spatial ability is the ability to imagine rotation of objects from a stationary perspective (e.g., mental rotation), and it predicts success in science and engineering. Existing assessments of spatial ability, usually administered in non-immersive environments where the observer views the scene from outside the display, measure primarily allocentric spatial ability.
We designed an immersive three-dimensional Virtual Reality Perspective-Taking Ability (3D VR-PTA) task in which participants were shown an array of objects, asked to imagine taking the perspective of an avatar within the scene, and then asked to indicate the direction to a target object from that imagined perspective. In Experiment 1, 13 participants completed the 3D VR-PTA along with two-dimensional (2D PTA) and non-immersive 3D (3D PTA) versions of the task, as well as a battery of allocentric spatial tasks. The pattern of pointing accuracy suggests reliance on the egocentric system while encoding and manipulating stimuli in the 3D VR-PTA versus the 2D PTA. In Experiment 2, 36 participants completed the 3D VR-PTA along with non-immersive spatial tasks (mental rotation, paper folding, 2D PTA) and real-world tasks: they were placed in a novel real-world environment and tested on pointing-direction, shortcut-finding, and path-integration tasks. The 3D VR-PTA was the best predictor of navigational performance, followed by the 2D PTA, while none of the allocentric spatial ability tasks predicted real-world navigational performance.
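To make the dependent measure of the perspective-taking task concrete, the following is a minimal sketch of how pointing accuracy can be scored. The function names are hypothetical and the geometry is simplified to a 2D ground plane; the paper does not specify its scoring code. Given the avatar's position and facing direction and a target's position, the correct egocentric bearing is the signed angle from the avatar's facing to the target, and the error is the smallest angular difference between the participant's response and that bearing.

```python
import math

def egocentric_bearing(avatar_xy, facing_deg, target_xy):
    """Signed angle (degrees, -180..180) from the avatar's facing
    direction to the target, as seen from the avatar's position.
    Angles follow the math convention (counterclockwise from +x)."""
    dx = target_xy[0] - avatar_xy[0]
    dy = target_xy[1] - avatar_xy[1]
    absolute = math.degrees(math.atan2(dy, dx))  # world-frame bearing
    # Wrap the relative angle into (-180, 180]
    return (absolute - facing_deg + 180.0) % 360.0 - 180.0

def pointing_error(response_deg, correct_deg):
    """Absolute angular difference (0..180 degrees) between the
    pointed direction and the correct bearing -- a typical
    accuracy score in perspective-taking tasks."""
    return abs((response_deg - correct_deg + 180.0) % 360.0 - 180.0)
```

For example, an avatar at the origin facing along +y with a target straight ahead at (0, 5) yields a correct bearing of 0 degrees, and a response of 350 degrees against a correct bearing of 10 degrees yields a 20-degree error, since the wrap-around at 360 is handled by the modular arithmetic.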
Three-Dimensional Immersive Diagnostic Tool for Spatial Egocentric Ability