Shared spatiotemporal category representations in biological and artificial deep neural networks
| Main Authors: | Greene, Michelle R.; Hansen, Bruce C. |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Public Library of Science, 2018 |
| Subjects: | Research Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6075788/ https://www.ncbi.nlm.nih.gov/pubmed/30040821 http://dx.doi.org/10.1371/journal.pcbi.1006327 |
_version_ | 1783344617030156288 |
author | Greene, Michelle R.; Hansen, Bruce C. |
author_facet | Greene, Michelle R.; Hansen, Bruce C. |
author_sort | Greene, Michelle R. |
collection | PubMed |
description | Visual scene category representations emerge very rapidly, yet the computational transformations that enable such invariant categorizations remain elusive. Deep convolutional neural networks (CNNs) perform visual categorization at near human-level accuracy using a feedforward architecture, providing neuroscientists with the opportunity to assess one successful series of representational transformations that enable categorization in silico. The goal of the current study is to assess the extent to which sequential scene category representations built by a CNN map onto those built in the human brain as assessed by high-density, time-resolved event-related potentials (ERPs). We found correspondence both over time and across the scalp: earlier (0–200 ms) ERP activity was best explained by early CNN layers at all electrodes. Although later activity at most electrode sites corresponded to earlier CNN layers, activity in right occipito-temporal electrodes was best explained by the later, fully-connected layers of the CNN around 225 ms post-stimulus, along with similar patterns in frontal electrodes. Taken together, these results suggest that the emergence of scene category representations develops through a dynamic interplay between early activity over occipital electrodes and later activity over temporal and frontal electrodes. |
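The description above outlines a layer-to-timepoint mapping: each CNN layer's representation of the stimuli is compared against ERP activity at each time point to find which layer best explains the neural signal. The following is a minimal illustrative sketch of one common way to run such a comparison (a second-order, representational-similarity-style analysis); the random stand-in data, array sizes, the `rdm` helper, and the use of Pearson correlation between dissimilarity matrices are all assumptions for illustration, not the authors' exact pipeline or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def rdm(features):
    """Representational dissimilarity matrix: 1 - correlation between
    condition patterns (rows = conditions, columns = features)."""
    return 1.0 - np.corrcoef(features)

def upper(mat):
    """Vectorize the upper triangle of a square matrix, excluding the diagonal."""
    i, j = np.triu_indices(mat.shape[0], k=1)
    return mat[i, j]

n_conditions = 20                   # hypothetical scene stimuli
layer_dims = [64, 128, 256, 512]    # hypothetical CNN layer feature sizes
n_timepoints = 50                   # hypothetical ERP samples
n_electrodes = 30                   # hypothetical scalp channels

# Stand-ins for real data: CNN activations per condition at each layer,
# and ERP scalp patterns per condition at each time point.
layer_rdms = [rdm(rng.standard_normal((n_conditions, d))) for d in layer_dims]
erp_rdms = [rdm(rng.standard_normal((n_conditions, n_electrodes)))
            for _ in range(n_timepoints)]

# For each time point, find the CNN layer whose RDM correlates best
# with the ERP RDM (correlation of vectorized upper triangles).
best_layer = []
for t_rdm in erp_rdms:
    rs = [np.corrcoef(upper(t_rdm), upper(l_rdm))[0, 1] for l_rdm in layer_rdms]
    best_layer.append(int(np.argmax(rs)))

print(best_layer[:10])
```

With real data, a rising `best_layer` index over time would reproduce the pattern the study reports: early ERP activity aligning with early CNN layers, and later occipito-temporal activity aligning with the fully-connected layers.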
format | Online Article Text |
id | pubmed-6075788 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-60757882018-08-28 Shared spatiotemporal category representations in biological and artificial deep neural networks Greene, Michelle R. Hansen, Bruce C. PLoS Comput Biol Research Article Public Library of Science 2018-07-24 /pmc/articles/PMC6075788/ /pubmed/30040821 http://dx.doi.org/10.1371/journal.pcbi.1006327 Text en © 2018 Greene, Hansen http://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Greene, Michelle R. Hansen, Bruce C. Shared spatiotemporal category representations in biological and artificial deep neural networks |
title | Shared spatiotemporal category representations in biological and artificial deep neural networks |
title_full | Shared spatiotemporal category representations in biological and artificial deep neural networks |
title_fullStr | Shared spatiotemporal category representations in biological and artificial deep neural networks |
title_full_unstemmed | Shared spatiotemporal category representations in biological and artificial deep neural networks |
title_short | Shared spatiotemporal category representations in biological and artificial deep neural networks |
title_sort | shared spatiotemporal category representations in biological and artificial deep neural networks |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6075788/ https://www.ncbi.nlm.nih.gov/pubmed/30040821 http://dx.doi.org/10.1371/journal.pcbi.1006327 |
work_keys_str_mv | AT greenemicheller sharedspatiotemporalcategoryrepresentationsinbiologicalandartificialdeepneuralnetworks AT hansenbrucec sharedspatiotemporalcategoryrepresentationsinbiologicalandartificialdeepneuralnetworks |