
Scaling Properties of Dimensionality Reduction for Neural Populations and Network Models

Recent studies have applied dimensionality reduction methods to understand how the multi-dimensional structure of neural population activity gives rise to brain function. It is unclear, however, how the results obtained from dimensionality reduction generalize to recordings with larger numbers of neurons and trials or how these results relate to the underlying network structure. We address these questions by applying factor analysis to recordings in the visual cortex of non-human primates and to spiking network models that self-generate irregular activity through a balance of excitation and inhibition. We compared the scaling trends of two key outputs of dimensionality reduction—shared dimensionality and percent shared variance—with neuron and trial count. We found that the scaling properties of networks with non-clustered and clustered connectivity differed, and that the in vivo recordings were more consistent with the clustered network. Furthermore, recordings from tens of neurons were sufficient to identify the dominant modes of shared variability that generalize to larger portions of the network. These findings can help guide the interpretation of dimensionality reduction outputs in regimes of limited neuron and trial sampling and help relate these outputs to the underlying network structure.
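As a rough illustration of the two outputs named in the abstract, the sketch below fits factor analysis to a trials-by-neurons matrix of spike counts and derives percent shared variance and shared dimensionality from the fitted model. This is a minimal sketch, not the authors' code: the random placeholder data, the choice of 10 latent factors, and the 95% shared-variance threshold are illustrative assumptions, and scikit-learn's FactorAnalysis stands in for whatever implementation the study used.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Placeholder data: a (trials x neurons) matrix of spike counts.
    # Real usage would substitute recorded counts here.
    rng = np.random.default_rng(0)
    X = rng.poisson(lam=5.0, size=(200, 50)).astype(float)

    # Fit factor analysis with a candidate number of latent factors
    # (in practice this would be chosen by cross-validation).
    fa = FactorAnalysis(n_components=10).fit(X)

    L = fa.components_.T      # loading matrix, (neurons x factors)
    psi = fa.noise_variance_  # private (independent) variances, (neurons,)

    # Percent shared variance: for each neuron, the fraction of its variance
    # captured by the shared, low-rank part of the covariance (L @ L.T),
    # averaged across neurons.
    shared_diag = np.sum(L**2, axis=1)
    percent_shared = 100.0 * np.mean(shared_diag / (shared_diag + psi))

    # Shared dimensionality: number of modes of the shared covariance needed
    # to capture 95% of the shared variance (the threshold is an assumption).
    eigvals = np.linalg.svd(L, compute_uv=False) ** 2
    cum_frac = np.cumsum(eigvals) / eigvals.sum()
    d_shared = int(np.searchsorted(cum_frac, 0.95) + 1)

    print(f"percent shared variance: {percent_shared:.1f}%")
    print(f"shared dimensionality: {d_shared}")

With data that contain genuine shared variability (unlike the independent placeholder counts above), these are the two quantities whose growth with neuron and trial count the study tracks.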

Bibliographic Details
Main Authors: Williamson, Ryan C., Cowley, Benjamin R., Litwin-Kumar, Ashok, Doiron, Brent, Kohn, Adam, Smith, Matthew A., Yu, Byron M.
Format: Online Article Text
Language: English
Published: Public Library of Science 2016
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5142778/
https://www.ncbi.nlm.nih.gov/pubmed/27926936
http://dx.doi.org/10.1371/journal.pcbi.1005141

collection PubMed
id pubmed-5142778
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal PLoS Comput Biol (Research Article)
published online 2016-12-07
rights © 2016 Williamson et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.