
A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality

No generic function for the minicolumn – i.e., one that would apply equally well to all cortical areas and species – has yet been proposed. I propose that the minicolumn does have a generic functionality, which only becomes clear when seen in the context of the function of the higher-level, subsuming unit, the macrocolumn. I propose that: (a) a macrocolumn's function is to store sparse distributed representations of its inputs and to be a recognizer of those inputs; and (b) the generic function of the minicolumn is to enforce macrocolumnar code sparseness. The minicolumn, defined here as a physically localized pool of ∼20 L2/3 pyramidals, does this by acting as a winner-take-all (WTA) competitive module, implying that macrocolumnar codes consist of ∼70 active L2/3 cells, assuming ∼70 minicolumns per macrocolumn.

Bibliographic Details
Main Author: Rinkus, Gerard J.
Format: Text
Language: English
Published: Frontiers Research Foundation 2010
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2889687/
https://www.ncbi.nlm.nih.gov/pubmed/20577587
http://dx.doi.org/10.3389/fnana.2010.00017
_version_ 1782182702490845184
author Rinkus, Gerard J.
author_facet Rinkus, Gerard J.
author_sort Rinkus, Gerard J.
collection PubMed
description No generic function for the minicolumn – i.e., one that would apply equally well to all cortical areas and species – has yet been proposed. I propose that the minicolumn does have a generic functionality, which only becomes clear when seen in the context of the function of the higher-level, subsuming unit, the macrocolumn. I propose that: (a) a macrocolumn's function is to store sparse distributed representations of its inputs and to be a recognizer of those inputs; and (b) the generic function of the minicolumn is to enforce macrocolumnar code sparseness. The minicolumn, defined here as a physically localized pool of ∼20 L2/3 pyramidals, does this by acting as a winner-take-all (WTA) competitive module, implying that macrocolumnar codes consist of ∼70 active L2/3 cells, assuming ∼70 minicolumns per macrocolumn. I describe an algorithm for activating these codes during both learning and retrievals, which causes more similar inputs to map to more highly intersecting codes, a property which yields ultra-fast (immediate, first-shot) storage and retrieval. The algorithm achieves this by adding an amount of randomness (noise) into the code selection process, which is inversely proportional to an input's familiarity. I propose a possible mapping of the algorithm onto cortical circuitry, and adduce evidence for a neuromodulatory implementation of this familiarity-contingent noise mechanism. The model is distinguished from other recent columnar cortical circuit models in proposing a generic minicolumnar function in which a group of cells within the minicolumn, the L2/3 pyramidals, compete (WTA) to be part of the sparse distributed macrocolumnar code.
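The description above outlines a concrete selection scheme: each minicolumn runs a winner-take-all competition over its ∼20 L2/3 pyramidals, and noise inversely proportional to input familiarity is injected into the competition, so familiar inputs deterministically reactivate their stored code while novel inputs receive largely random (hence likely new) codes. A minimal sketch of that idea follows; it is not the paper's actual learning algorithm — the random weight matrix `W`, the `choose_code` function, and the scalar `familiarity` argument are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, loosely following the article's numbers:
# a macrocolumn of ~70 minicolumns, each a WTA pool of ~20 L2/3 cells.
N_MINICOLUMNS = 70
CELLS_PER_MC = 20
INPUT_DIM = 100

# Placeholder bottom-up weights from the input to every L2/3 cell
# (a stand-in for learned afferent synapses, not the paper's learning rule).
W = rng.random((N_MINICOLUMNS, CELLS_PER_MC, INPUT_DIM))

def choose_code(x, familiarity):
    """Pick one winner per minicolumn (WTA), yielding a ~70-cell code.

    familiarity in [0, 1]: 1 = perfectly familiar input, 0 = novel.
    The noise added to cell activations scales with (1 - familiarity),
    so novel inputs get near-random codes while familiar inputs
    deterministically reactivate the best-matching code.
    """
    u = W @ x                                    # (70, 20) cell activations
    noise = (1.0 - familiarity) * rng.normal(0.0, u.std() + 1e-9, u.shape)
    return np.argmax(u + noise, axis=1)          # winner index per minicolumn

x = rng.random(INPUT_DIM)
code_familiar = choose_code(x, familiarity=1.0)  # deterministic, repeatable
code_novel = choose_code(x, familiarity=0.0)     # dominated by noise
```

Because similar inputs yield similar activation profiles `u`, low-noise (familiar) selections for similar inputs intersect heavily — the property the abstract credits with immediate, first-shot storage and retrieval.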
format Text
id pubmed-2889687
institution National Center for Biotechnology Information
language English
publishDate 2010
publisher Frontiers Research Foundation
record_format MEDLINE/PubMed
spelling pubmed-2889687 2010-06-24 A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality Rinkus, Gerard J. Front Neuroanat Neuroscience
Frontiers Research Foundation 2010-06-02 /pmc/articles/PMC2889687/ /pubmed/20577587 http://dx.doi.org/10.3389/fnana.2010.00017 Text en Copyright © 2010 Rinkus. http://www.frontiersin.org/licenseagreement This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.
spellingShingle Neuroscience
Rinkus, Gerard J.
A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality
title A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality
title_full A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality
title_fullStr A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality
title_full_unstemmed A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality
title_short A Cortical Sparse Distributed Coding Model Linking Mini- and Macrocolumn-Scale Functionality
title_sort cortical sparse distributed coding model linking mini- and macrocolumn-scale functionality
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2889687/
https://www.ncbi.nlm.nih.gov/pubmed/20577587
http://dx.doi.org/10.3389/fnana.2010.00017
work_keys_str_mv AT rinkusgerardj acorticalsparsedistributedcodingmodellinkingminiandmacrocolumnscalefunctionality
AT rinkusgerardj corticalsparsedistributedcodingmodellinkingminiandmacrocolumnscalefunctionality