Abstract

Near-scale spaces are a key component of our visual experience: Whether for work or for leisure, we spend much of our days immersed in, and acting upon, the world within reach. Here, we present the Reachspace Database, a novel stimulus set containing over 10,000 images depicting first-person, motor-relevant views at an approximated reachable scale (hereafter “reachspaces”), which reflect the visual input that an agent would experience while performing a task with her hands. These images are divided into over 350 categories, based on a taxonomy we developed, which captures information relating to the identity of each reachspace, including the broader setting and room it is found in, the locus of interaction (e.g., kitchen counter, desk), and the specific action it affords. Summary analyses of the taxonomy labels in the database suggest a tight connection between activities and the spaces that support them: While a small number of rooms and interaction loci afford many diverse actions (e.g., workshops, tables), most reachspaces were relatively specialized, typically affording only one main activity (e.g., gas station pump, airplane cockpit, kitchen cutting board). Overall, this Reachspace Database represents a large sampling of reachable environments and provides a new resource to support behavioral and neural research into the visual representation of reach-relevant environments. The database is available for download on the Open Science Framework (osf.io/bfyxk/).
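For readers who intend to work with the taxonomy labels programmatically, the sketch below shows one minimal way to represent the labels described above. The field names (`setting`, `room`, `interaction_locus`, `action`) and the example values are hypothetical illustrations drawn from the abstract's description, not the database's actual file or label format; consult the OSF repository (osf.io/bfyxk/) for the real structure.

```python
from dataclasses import dataclass

# Hypothetical record for one reachspace image's taxonomy labels.
# Field names and example values are illustrative only; the actual
# label format is documented in the OSF repository (osf.io/bfyxk/).
@dataclass(frozen=True)
class ReachspaceLabel:
    setting: str            # broader setting (e.g., "home")
    room: str               # room the reachspace is found in (e.g., "kitchen")
    interaction_locus: str  # locus of interaction (e.g., "kitchen counter")
    action: str             # specific action the view affords (e.g., "chopping")

example = ReachspaceLabel(
    setting="home",
    room="kitchen",
    interaction_locus="cutting board",
    action="chopping vegetables",
)
print(example)
```

A record of this shape would make the summary analyses mentioned above straightforward to reproduce, for instance by grouping labels by `interaction_locus` and counting the distinct `action` values each locus affords.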
