Abstract

Spatial cognitive processing is a crucial element of human cognition, shaping how we understand spatial environments. Despite varying definitions, researchers concur that spatial ability encompasses skills such as generating, visualizing, memorizing, and transforming visual information, a fundamental aptitude for tasks requiring visual and spatial acumen. Spatial orientation is one such ability; it relies on egocentric spatial encoding and contributes to overall human spatial ability. This study evaluates spatial orientation ability using the Perspective-Taking Ability (PTA) test, which gauges participants' capacity to envision a scene from an alternative viewpoint. Stimuli consist of 5-6 everyday objects placed on the perimeter of a circle; participants are asked to mentally position themselves at one object, facing a second object, and point to a third object. Scores are based on the angular deviation, in degrees, of the pointed direction from the correct direction. This evaluation probes spatial orientation and the comprehension of an environment from diverse viewpoints. The PTA test was digitized and integrated into Virtual Reality (VR) environments created in Unity 3D to depict three scenarios. The first scenario, the control condition, was an Earth-like setting in which the gravitational vertical, the participant's idiotropic axis, and the visual axis are aligned. The second scenario (experimental condition 1) simulated the spatial conditions of microgravity in space, which lack a gravitational vertical and in which the visual and idiotropic axes are statically misaligned. In the third scenario, the misalignment is dynamic: it changes continuously around the X, Y, and Z axes over the test session. The three study conditions were administered to 230 participants through HTC Vive Pro Eye head-mounted displays (HMDs). Participants' responses were collected using a programming script and analyzed to determine how performance on the PTA test tasks varied across the three conditions and how age moderated this influence. Participants were categorized into age groups: 18-22, 23-27, 28-32, 33-37, and 38+. The Mann-Whitney U test showed significant differences in response accuracy for participants aged 23-27, 33-37, and 38 and above, indicating distinct performance across the three study conditions; that is, static and dynamic misalignment influenced spatial orientation performance. Conversely, participants aged 28-32 showed no significant difference between the three conditions, indicating no impact of the misaligned idiotropic and visual axes. Based on the Kruskal-Wallis test results, the 18-22 and 38+ age groups showed significant accuracy differences, and the 23-27 age group showed highly significant differences. Conversely, the 28-32 age group showed no significant accuracy difference, suggesting comparable performance, whereas the 33-37 age group showed a significant accuracy difference. Overall, the results indicate statistically significant accuracy differences among age groups, suggesting that age group moderates the influence of the misaligned axes on PTA scores. Pairwise age-group comparisons using Dunn's post hoc test showed significant accuracy differences between the 23-27 age group and the 18-22, 28-32, and 33-37 age groups, revealing age-related variation in spatial accuracy. In conclusion, our research revealed a strong connection between age and accuracy, with pronounced differences among age groups.
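
As a hedged illustration (not taken from the original study materials), the sketch below shows one way a single PTA trial could be scored as the angular deviation between the pointed and correct directions; the function name and the wrap-to-shortest-arc convention are assumptions.

```python
# Minimal sketch (assumed, not from the paper) of per-trial PTA scoring:
# the absolute deviation between the pointed bearing and the correct
# bearing, wrapped to the shortest arc so errors fall in 0-180 degrees.
def angular_error(pointed_deg: float, correct_deg: float) -> float:
    """Return the absolute angular deviation in degrees (0-180)."""
    diff = abs(pointed_deg - correct_deg) % 360.0
    return min(diff, 360.0 - diff)


# Example: pointing at 350 degrees when the correct bearing is 10 degrees
# counts as a 20-degree error, not 340.
assert angular_error(350.0, 10.0) == 20.0
```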
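Likewise, the following minimal sketch outlines how the reported Mann-Whitney U, Kruskal-Wallis, and Dunn's post hoc comparisons could be run on such scores. The DataFrame layout (hypothetical columns "error_deg", "condition", "age_group") and the use of scipy and scikit-posthocs are assumptions; the paper does not state which analysis software was used.

```python
# Hedged sketch of the reported condition and age-group comparisons,
# assuming per-trial angular errors are gathered in a pandas DataFrame
# with hypothetical columns "error_deg", "condition"
# (control / static / dynamic), and "age_group".
from itertools import combinations

import pandas as pd
from scipy.stats import kruskal, mannwhitneyu
import scikit_posthocs as sp  # assumed library for Dunn's post hoc test


def analyze(df: pd.DataFrame) -> None:
    for age_group, grp in df.groupby("age_group"):
        # Pairwise condition comparisons within the age group (Mann-Whitney U).
        for cond_a, cond_b in combinations(sorted(grp["condition"].unique()), 2):
            a = grp.loc[grp["condition"] == cond_a, "error_deg"]
            b = grp.loc[grp["condition"] == cond_b, "error_deg"]
            u, p = mannwhitneyu(a, b, alternative="two-sided")
            print(f"{age_group} | {cond_a} vs {cond_b}: U={u:.1f}, p={p:.4f}")

        # Omnibus comparison across the three conditions (Kruskal-Wallis).
        samples = [g["error_deg"] for _, g in grp.groupby("condition")]
        h, p = kruskal(*samples)
        print(f"{age_group} | Kruskal-Wallis: H={h:.2f}, p={p:.4f}")

    # Pairwise age-group comparisons of accuracy (Dunn's post hoc test).
    print(sp.posthoc_dunn(df, val_col="error_deg", group_col="age_group",
                          p_adjust="bonferroni"))
```

Nonparametric tests are used here because angular-error scores are typically non-normal; the Bonferroni adjustment for the Dunn comparisons is likewise an assumed choice.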
