THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior

Bibliographic Details

Main Authors: Hebart, Martin N, Contier, Oliver, Teichmann, Lina, Rockter, Adam H, Zheng, Charles Y, Kidder, Alexis, Corriveau, Anna, Vaziri-Pashkam, Maryam, Baker, Chris I
Format: Online Article Text
Language: English
Published: eLife Sciences Publications, Ltd 2023
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10038662/
https://www.ncbi.nlm.nih.gov/pubmed/36847339
http://dx.doi.org/10.7554/eLife.82580
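The Online Access links above can also be used programmatically. Below is a minimal sketch, assuming Python with the requests package installed, that retrieves machine-readable citation metadata for the article's DOI through standard DOI content negotiation; it is an illustration only, not part of the bibliographic record and not an official THINGS-data interface.

import requests

# DOI taken from the Online Access field above.
DOI_URL = "http://dx.doi.org/10.7554/eLife.82580"

# Ask the DOI resolver for CSL JSON metadata instead of the HTML landing page.
response = requests.get(
    DOI_URL,
    headers={"Accept": "application/vnd.citationstyles.csl+json"},
    timeout=30,
)
response.raise_for_status()
record = response.json()

print(record["title"])                # article title
print(record.get("container-title"))  # journal name
print(record["DOI"])                  # 10.7554/eLife.82580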
_version_ 1784912129525022720
author Hebart, Martin N
Contier, Oliver
Teichmann, Lina
Rockter, Adam H
Zheng, Charles Y
Kidder, Alexis
Corriveau, Anna
Vaziri-Pashkam, Maryam
Baker, Chris I
author_facet Hebart, Martin N
Contier, Oliver
Teichmann, Lina
Rockter, Adam H
Zheng, Charles Y
Kidder, Alexis
Corriveau, Anna
Vaziri-Pashkam, Maryam
Baker, Chris I
author_sort Hebart, Martin N
collection PubMed
description Understanding object representations requires a broad, comprehensive sampling of the objects in our visual world with dense measurements of brain activity and behavior. Here, we present THINGS-data, a multimodal collection of large-scale neuroimaging and behavioral datasets in humans, comprising densely sampled functional MRI and magnetoencephalographic recordings, as well as 4.70 million similarity judgments in response to thousands of photographic images for up to 1,854 object concepts. THINGS-data is unique in its breadth of richly annotated objects, allowing for testing countless hypotheses at scale while assessing the reproducibility of previous findings. Beyond the unique insights promised by each individual dataset, the multimodality of THINGS-data allows combining datasets for a much broader view into object processing than previously possible. Our analyses demonstrate the high quality of the datasets and provide five examples of hypothesis-driven and data-driven applications. THINGS-data constitutes the core public release of the THINGS initiative (https://things-initiative.org) for bridging the gap between disciplines and the advancement of cognitive neuroscience.
format Online
Article
Text
id pubmed-10038662
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher eLife Sciences Publications, Ltd
record_format MEDLINE/PubMed
spelling pubmed-10038662 2023-03-25 THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior Hebart, Martin N Contier, Oliver Teichmann, Lina Rockter, Adam H Zheng, Charles Y Kidder, Alexis Corriveau, Anna Vaziri-Pashkam, Maryam Baker, Chris I eLife Neuroscience Understanding object representations requires a broad, comprehensive sampling of the objects in our visual world with dense measurements of brain activity and behavior. Here, we present THINGS-data, a multimodal collection of large-scale neuroimaging and behavioral datasets in humans, comprising densely sampled functional MRI and magnetoencephalographic recordings, as well as 4.70 million similarity judgments in response to thousands of photographic images for up to 1,854 object concepts. THINGS-data is unique in its breadth of richly annotated objects, allowing for testing countless hypotheses at scale while assessing the reproducibility of previous findings. Beyond the unique insights promised by each individual dataset, the multimodality of THINGS-data allows combining datasets for a much broader view into object processing than previously possible. Our analyses demonstrate the high quality of the datasets and provide five examples of hypothesis-driven and data-driven applications. THINGS-data constitutes the core public release of the THINGS initiative (https://things-initiative.org) for bridging the gap between disciplines and the advancement of cognitive neuroscience. eLife Sciences Publications, Ltd 2023-02-27 /pmc/articles/PMC10038662/ /pubmed/36847339 http://dx.doi.org/10.7554/eLife.82580 Text en https://creativecommons.org/publicdomain/zero/1.0/ This is an open-access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication (https://creativecommons.org/publicdomain/zero/1.0/).
spellingShingle Neuroscience
Hebart, Martin N
Contier, Oliver
Teichmann, Lina
Rockter, Adam H
Zheng, Charles Y
Kidder, Alexis
Corriveau, Anna
Vaziri-Pashkam, Maryam
Baker, Chris I
THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior
title THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior
title_full THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior
title_fullStr THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior
title_full_unstemmed THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior
title_short THINGS-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior
title_sort things-data, a multimodal collection of large-scale datasets for investigating object representations in human brain and behavior
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10038662/
https://www.ncbi.nlm.nih.gov/pubmed/36847339
http://dx.doi.org/10.7554/eLife.82580
work_keys_str_mv AT hebartmartinn thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT contieroliver thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT teichmannlina thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT rockteradamh thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT zhengcharlesy thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT kidderalexis thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT corriveauanna thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT vaziripashkammaryam thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior
AT bakerchrisi thingsdataamultimodalcollectionoflargescaledatasetsforinvestigatingobjectrepresentationsinhumanbrainandbehavior