
The world within reach: An image database of reach-relevant environments


Bibliographic Details
Main Authors: Josephs, Emilie L.; Zhao, Haoyun; Konkle, Talia
Format: Online, Article, Text
Language: English
Published: The Association for Research in Vision and Ophthalmology, 2021
Subjects: Methods
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8300055/
https://www.ncbi.nlm.nih.gov/pubmed/34289491
http://dx.doi.org/10.1167/jov.21.7.14
collection PubMed
description Near-scale spaces are a key component of our visual experience: Whether for work or for leisure, we spend much of our days immersed in, and acting upon, the world within reach. Here, we present the Reachspace Database, a novel stimulus set containing over 10,000 images depicting first person, motor-relevant views at an approximated reachable scale (hereafter “reachspaces”), which reflect the visual input that an agent would experience while performing a task with her hands. These images are divided into over 350 categories, based on a taxonomy we developed, which captures information relating to the identity of each reachspace, including the broader setting and room it is found in, the locus of interaction (e.g., kitchen counter, desk), and the specific action it affords. Summary analyses of the taxonomy labels in the database suggest a tight connection between activities and the spaces that support them: While a small number of rooms and interaction loci afford many diverse actions (e.g., workshops, tables), most reachspaces were relatively specialized, typically affording only one main activity (e.g., gas station pump, airplane cockpit, kitchen cutting board). Overall, this Reachspace Database represents a large sampling of reachable environments and provides a new resource to support behavioral and neural research into the visual representation of reach-relevant environments. The database is available for download on the Open Science Framework (osf.io/bfyxk/).
format Online
Article
Text
id pubmed-8300055
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher The Association for Research in Vision and Ophthalmology
record_format MEDLINE/PubMed
spelling pubmed-8300055 2021-07-28. The world within reach: An image database of reach-relevant environments / Josephs, Emilie L.; Zhao, Haoyun; Konkle, Talia. J Vis (Methods). The Association for Research in Vision and Ophthalmology, 2021-07-21. /pmc/articles/PMC8300055/ /pubmed/34289491 http://dx.doi.org/10.1167/jov.21.7.14. Text, English. Copyright 2021 The Authors; this work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title The world within reach: An image database of reach-relevant environments
topic Methods
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8300055/
https://www.ncbi.nlm.nih.gov/pubmed/34289491
http://dx.doi.org/10.1167/jov.21.7.14
work_keys_str_mv AT josephsemiliel theworldwithinreachanimagedatabaseofreachrelevantenvironments
AT zhaohaoyun theworldwithinreachanimagedatabaseofreachrelevantenvironments
AT konkletalia theworldwithinreachanimagedatabaseofreachrelevantenvironments
AT josephsemiliel worldwithinreachanimagedatabaseofreachrelevantenvironments
AT zhaohaoyun worldwithinreachanimagedatabaseofreachrelevantenvironments
AT konkletalia worldwithinreachanimagedatabaseofreachrelevantenvironments