A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production
The present report examines the coinciding results of two study groups each presenting a power-of-two function to describe network structures underlying perceptual processes in one case and word production during verbal fluency tasks in the other. The former is theorized as neural cliques organized...
Main authors: | Fromm, Ortwin; Klostermann, Fabian; Ehlen, Felicitas
Format: | Online Article Text
Language: | English
Published: | Frontiers Media S.A., 2020
Subjects: | Neuroscience
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7485382/ https://www.ncbi.nlm.nih.gov/pubmed/32982704 http://dx.doi.org/10.3389/fnsys.2020.00058
_version_ | 1783581136855760896 |
author | Fromm, Ortwin; Klostermann, Fabian; Ehlen, Felicitas
author_facet | Fromm, Ortwin; Klostermann, Fabian; Ehlen, Felicitas
author_sort | Fromm, Ortwin |
collection | PubMed |
description | The present report examines the coinciding results of two study groups, each presenting a power-of-two function to describe network structures underlying perceptual processes in one case and word production during verbal fluency tasks in the other. The former is theorized as neural cliques organized according to the function N = 2^i − 1, whereas the latter assumes word conglomerations thinkable as tuples following the function N = 2^i. Both theories assume that the innate optimization of energy efficiency causes the specific connectivity structure. The close resemblance between the two formulae motivated the development of a common formulation. This was obtained by using a vector space model, in which the configuration of neural cliques or connected words is represented by an N-dimensional state vector. A further analysis of the model showed that the entire time course of word production could be derived using essentially a single minimal transformation matrix. This again appears consistent with the principle of maximum energy efficiency. |
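To make the two power-of-two functions and the state-vector idea in the abstract concrete, the following Python sketch is offered as an illustration only; it is not part of the original record or article. The transformation matrix `T`, the starting activation, and the update rule are assumptions chosen for demonstration, showing how a configuration of 2^i connected units can be encoded as an N-dimensional vector and evolved by repeatedly applying one matrix.

```python
import numpy as np

def n_cliques(i: int) -> int:
    """Theory of Connectivity: number of neural cliques needed to cover
    every non-empty combination of i distinct inputs (N = 2**i - 1)."""
    return 2 ** i - 1

def n_tuple_elements(i: int) -> int:
    """Word-production account: size of a word conglomeration thinkable
    as an i-tuple (N = 2**i)."""
    return 2 ** i

# Illustrative state-vector dynamics (an assumption, not the authors' matrix):
# the configuration of connected units is an N-dimensional vector, and each
# retrieval step applies the same minimal transformation matrix T.
i = 3
N = n_tuple_elements(i)            # N = 8 for i = 3
state = np.zeros(N)
state[0] = 1.0                     # a single initially activated unit

# Toy "minimal" operator: decay the current unit and pass activation onward.
T = 0.5 * np.eye(N) + np.eye(N, k=-1)

trajectory = [state.copy()]
for _ in range(4):
    state = T @ state
    trajectory.append(state.copy())

print(n_cliques(3), n_tuple_elements(3))   # -> 7 8
print(np.round(trajectory[-1], 3))         # toy activation pattern after 4 steps
```

Repeatedly applying the single matrix `T` is meant only to mirror, in miniature, the claim that one minimal transformation can generate the whole retrieval time course; any quantitative correspondence would require the model as defined in the full text.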
format | Online Article Text |
id | pubmed-7485382 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-7485382 2020-09-24 A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production Fromm, Ortwin; Klostermann, Fabian; Ehlen, Felicitas Front Syst Neurosci Neuroscience The present report examines the coinciding results of two study groups, each presenting a power-of-two function to describe network structures underlying perceptual processes in one case and word production during verbal fluency tasks in the other. The former is theorized as neural cliques organized according to the function N = 2^i − 1, whereas the latter assumes word conglomerations thinkable as tuples following the function N = 2^i. Both theories assume that the innate optimization of energy efficiency causes the specific connectivity structure. The close resemblance between the two formulae motivated the development of a common formulation. This was obtained by using a vector space model, in which the configuration of neural cliques or connected words is represented by an N-dimensional state vector. A further analysis of the model showed that the entire time course of word production could be derived using essentially a single minimal transformation matrix. This again appears consistent with the principle of maximum energy efficiency. Frontiers Media S.A. 2020-08-28 /pmc/articles/PMC7485382/ /pubmed/32982704 http://dx.doi.org/10.3389/fnsys.2020.00058 Text en Copyright © 2020 Fromm, Klostermann and Ehlen. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Fromm, Ortwin Klostermann, Fabian Ehlen, Felicitas A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production |
title | A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production |
title_full | A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production |
title_fullStr | A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production |
title_full_unstemmed | A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production |
title_short | A Vector Space Model for Neural Network Functions: Inspirations From Similarities Between the Theory of Connectivity and the Logarithmic Time Course of Word Production |
title_sort | vector space model for neural network functions: inspirations from similarities between the theory of connectivity and the logarithmic time course of word production |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7485382/ https://www.ncbi.nlm.nih.gov/pubmed/32982704 http://dx.doi.org/10.3389/fnsys.2020.00058 |
work_keys_str_mv | AT frommortwin avectorspacemodelforneuralnetworkfunctionsinspirationsfromsimilaritiesbetweenthetheoryofconnectivityandthelogarithmictimecourseofwordproduction AT klostermannfabian avectorspacemodelforneuralnetworkfunctionsinspirationsfromsimilaritiesbetweenthetheoryofconnectivityandthelogarithmictimecourseofwordproduction AT ehlenfelicitas avectorspacemodelforneuralnetworkfunctionsinspirationsfromsimilaritiesbetweenthetheoryofconnectivityandthelogarithmictimecourseofwordproduction AT frommortwin vectorspacemodelforneuralnetworkfunctionsinspirationsfromsimilaritiesbetweenthetheoryofconnectivityandthelogarithmictimecourseofwordproduction AT klostermannfabian vectorspacemodelforneuralnetworkfunctionsinspirationsfromsimilaritiesbetweenthetheoryofconnectivityandthelogarithmictimecourseofwordproduction AT ehlenfelicitas vectorspacemodelforneuralnetworkfunctionsinspirationsfromsimilaritiesbetweenthetheoryofconnectivityandthelogarithmictimecourseofwordproduction |