
Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification

Generalized linear models (GLMs) represent a popular choice for the probabilistic characterization of neural spike responses. While GLMs are attractive for their computational tractability, they also impose strong assumptions and thus only allow for a limited range of stimulus-response relationships to be discovered. Alternative approaches exist that make only very weak assumptions but scale poorly to high-dimensional stimulus spaces. Here we seek an approach which can gracefully interpolate between the two extremes. We extend two frequently used special cases of the GLM—a linear and a quadratic model—by assuming that the spike-triggered and non-spike-triggered distributions can be adequately represented using Gaussian mixtures. Because we derive the model from a generative perspective, its components are easy to interpret as they correspond to, for example, the spike-triggered distribution and the interspike interval distribution. The model is able to capture complex dependencies on high-dimensional stimuli with far fewer parameters than other approaches such as histogram-based methods. The added flexibility comes at the cost of a non-concave log-likelihood. We show that in practice this does not have to be an issue and the mixture-based model is able to outperform generalized linear and quadratic models.
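
As a rough illustration of the generative idea summarized above (not the authors' implementation), the sketch below fits separate Gaussian mixtures to spike-triggered and non-spike-triggered stimulus samples and combines them via Bayes' rule into a spike probability. The data arrays, component counts, and dimensions are placeholder assumptions for demonstration only.

```python
# Minimal sketch of a generative mixture-model spike predictor, assuming
# scikit-learn is available; stimuli and spike labels here are random placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Placeholder data: 1000 stimulus windows of dimension 20 and binary spike labels.
X = rng.standard_normal((1000, 20))
y = rng.integers(0, 2, size=1000)

# Fit separate Gaussian mixtures to spike-triggered and non-spike-triggered stimuli.
gmm_spike = GaussianMixture(n_components=3, covariance_type="full").fit(X[y == 1])
gmm_nospike = GaussianMixture(n_components=3, covariance_type="full").fit(X[y == 0])

# Prior spike probability estimated from the labels.
p_spike = y.mean()

def spike_probability(x):
    """P(spike | x) via Bayes' rule from the two class-conditional mixtures."""
    log_ps = gmm_spike.score_samples(x) + np.log(p_spike)
    log_ns = gmm_nospike.score_samples(x) + np.log(1.0 - p_spike)
    # Log-sum-exp for a numerically stable normalization.
    m = np.maximum(log_ps, log_ns)
    log_norm = m + np.log(np.exp(log_ps - m) + np.exp(log_ns - m))
    return np.exp(log_ps - log_norm)

print(spike_probability(X[:5]))
```

With a single Gaussian per class this reduces to a quadratic decision function in the stimulus, which is why adding mixture components interpolates between the simple parametric models and more flexible, nonparametric fits.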


Bibliographic Details

Main Authors: Theis, Lucas, Chagas, Andrè Maia, Arnstein, Daniel, Schwarz, Cornelius, Bethge, Matthias
Format: Online Article Text
Language: English
Published: Public Library of Science 2013
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3836720/
https://www.ncbi.nlm.nih.gov/pubmed/24278006
http://dx.doi.org/10.1371/journal.pcbi.1003356
_version_ 1782292335383543808
author Theis, Lucas
Chagas, Andrè Maia
Arnstein, Daniel
Schwarz, Cornelius
Bethge, Matthias
author_facet Theis, Lucas
Chagas, Andrè Maia
Arnstein, Daniel
Schwarz, Cornelius
Bethge, Matthias
author_sort Theis, Lucas
collection PubMed
description Generalized linear models (GLMs) represent a popular choice for the probabilistic characterization of neural spike responses. While GLMs are attractive for their computational tractability, they also impose strong assumptions and thus only allow for a limited range of stimulus-response relationships to be discovered. Alternative approaches exist that make only very weak assumptions but scale poorly to high-dimensional stimulus spaces. Here we seek an approach which can gracefully interpolate between the two extremes. We extend two frequently used special cases of the GLM—a linear and a quadratic model—by assuming that the spike-triggered and non-spike-triggered distributions can be adequately represented using Gaussian mixtures. Because we derive the model from a generative perspective, its components are easy to interpret as they correspond to, for example, the spike-triggered distribution and the interspike interval distribution. The model is able to capture complex dependencies on high-dimensional stimuli with far fewer parameters than other approaches such as histogram-based methods. The added flexibility comes at the cost of a non-concave log-likelihood. We show that in practice this does not have to be an issue and the mixture-based model is able to outperform generalized linear and quadratic models.
format Online
Article
Text
id pubmed-3836720
institution National Center for Biotechnology Information
language English
publishDate 2013
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-38367202013-11-25 Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification Theis, Lucas Chagas, Andrè Maia Arnstein, Daniel Schwarz, Cornelius Bethge, Matthias PLoS Comput Biol Research Article Generalized linear models (GLMs) represent a popular choice for the probabilistic characterization of neural spike responses. While GLMs are attractive for their computational tractability, they also impose strong assumptions and thus only allow for a limited range of stimulus-response relationships to be discovered. Alternative approaches exist that make only very weak assumptions but scale poorly to high-dimensional stimulus spaces. Here we seek an approach which can gracefully interpolate between the two extremes. We extend two frequently used special cases of the GLM—a linear and a quadratic model—by assuming that the spike-triggered and non-spike-triggered distributions can be adequately represented using Gaussian mixtures. Because we derive the model from a generative perspective, its components are easy to interpret as they correspond to, for example, the spike-triggered distribution and the interspike interval distribution. The model is able to capture complex dependencies on high-dimensional stimuli with far fewer parameters than other approaches such as histogram-based methods. The added flexibility comes at the cost of a non-concave log-likelihood. We show that in practice this does not have to be an issue and the mixture-based model is able to outperform generalized linear and quadratic models. Public Library of Science 2013-11-21 /pmc/articles/PMC3836720/ /pubmed/24278006 http://dx.doi.org/10.1371/journal.pcbi.1003356 Text en © 2013 Theis et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
spellingShingle Research Article
Theis, Lucas
Chagas, Andrè Maia
Arnstein, Daniel
Schwarz, Cornelius
Bethge, Matthias
Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification
title Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification
title_full Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification
title_fullStr Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification
title_full_unstemmed Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification
title_short Beyond GLMs: A Generative Mixture Modeling Approach to Neural System Identification
title_sort beyond glms: a generative mixture modeling approach to neural system identification
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3836720/
https://www.ncbi.nlm.nih.gov/pubmed/24278006
http://dx.doi.org/10.1371/journal.pcbi.1003356
work_keys_str_mv AT theislucas beyondglmsagenerativemixturemodelingapproachtoneuralsystemidentification
AT chagasandremaia beyondglmsagenerativemixturemodelingapproachtoneuralsystemidentification
AT arnsteindaniel beyondglmsagenerativemixturemodelingapproachtoneuralsystemidentification
AT schwarzcornelius beyondglmsagenerativemixturemodelingapproachtoneuralsystemidentification
AT bethgematthias beyondglmsagenerativemixturemodelingapproachtoneuralsystemidentification