Sparsey™: event recognition via deep hierarchical sparse distributed codes
Main Author: | Rinkus, Gerard J. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Frontiers Media S.A., 2014 |
Subjects: | Neuroscience |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4266026/ https://www.ncbi.nlm.nih.gov/pubmed/25566046 http://dx.doi.org/10.3389/fncom.2014.00160 |
_version_ | 1782348966754516992 |
---|---|
author | Rinkus, Gerard J. |
author_facet | Rinkus, Gerard J. |
author_sort | Rinkus, Gerard J. |
collection | PubMed |
description | The visual cortex's hierarchical, multi-level organization is captured in many biologically inspired computational vision models, the general idea being that progressively larger scale (spatially/temporally) and more complex visual features are represented in progressively higher areas. However, most earlier models use localist representations (codes) in each representational field (which we equate with the cortical macrocolumn, “mac”), at each level. In localism, each represented feature/concept/event (hereinafter “item”) is coded by a single unit. The model we describe, Sparsey, is hierarchical as well but crucially, it uses sparse distributed coding (SDC) in every mac in all levels. In SDC, each represented item is coded by a small subset of the mac's units. The SDCs of different items can overlap and the size of overlap between items can be used to represent their similarity. The difference between localism and SDC is crucial because SDC allows the two essential operations of associative memory, storing a new item and retrieving the best-matching stored item, to be done in fixed time for the life of the model. Since the model's core algorithm, which does both storage and retrieval (inference), makes a single pass over all macs on each time step, the overall model's storage/retrieval operation is also fixed-time, a criterion we consider essential for scalability to the huge (“Big Data”) problems. A 2010 paper described a nonhierarchical version of this model in the context of purely spatial pattern processing. Here, we elaborate a fully hierarchical model (arbitrary numbers of levels and macs per level), describing novel model principles like progressive critical periods, dynamic modulation of principal cells' activation functions based on a mac-level familiarity measure, representation of multiple simultaneously active hypotheses, a novel method of time warp invariant recognition, and we report results showing learning/recognition of spatiotemporal patterns. |
format | Online Article Text |
id | pubmed-4266026 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2014 |
publisher | Frontiers Media S.A. |
record_format | MEDLINE/PubMed |
spelling | pubmed-4266026 (2015-01-06). Sparsey™: event recognition via deep hierarchical sparse distributed codes. Rinkus, Gerard J. Front Comput Neurosci (Neuroscience). Frontiers Media S.A., 2014-12-15. /pmc/articles/PMC4266026/ /pubmed/25566046 http://dx.doi.org/10.3389/fncom.2014.00160 Text en. Copyright © 2014 Rinkus. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms. |
spellingShingle | Neuroscience Rinkus, Gerard J. Sparsey™: event recognition via deep hierarchical sparse distributed codes |
title | Sparsey™: event recognition via deep hierarchical sparse distributed codes |
title_full | Sparsey™: event recognition via deep hierarchical sparse distributed codes |
title_fullStr | Sparsey™: event recognition via deep hierarchical sparse distributed codes |
title_full_unstemmed | Sparsey™: event recognition via deep hierarchical sparse distributed codes |
title_short | Sparsey™: event recognition via deep hierarchical sparse distributed codes |
title_sort | sparsey™: event recognition via deep hierarchical sparse distributed codes |
topic | Neuroscience |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4266026/ https://www.ncbi.nlm.nih.gov/pubmed/25566046 http://dx.doi.org/10.3389/fncom.2014.00160 |
work_keys_str_mv | AT rinkusgerardj sparseyeventrecognitionviadeephierarchicalsparsedistributedcodes |
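
The description field above explains the core idea of sparse distributed coding (SDC): each item is coded by a small subset of a mac's units, and the size of the overlap between two codes represents the similarity of the items they code. Below is a minimal, hypothetical Python sketch of that idea only, not the paper's algorithm: the mac size, code size, and the brute-force comparison loop are illustrative assumptions (Sparsey itself retrieves the best-matching stored item in fixed time, without scanning stored items).

```python
# Minimal illustrative sketch of sparse distributed codes (SDCs).
# All sizes and names here are hypothetical, not taken from the paper.

import random

MAC_SIZE = 100   # total units in a hypothetical mac
CODE_SIZE = 5    # units active in any one SDC (a small subset of the mac)


def random_sdc(rng: random.Random) -> frozenset:
    """Draw an SDC: a small, fixed-size subset of the mac's units."""
    return frozenset(rng.sample(range(MAC_SIZE), CODE_SIZE))


def similarity(code_a: frozenset, code_b: frozenset) -> float:
    """Overlap size normalized by code size: 1.0 = identical, 0.0 = disjoint."""
    return len(code_a & code_b) / CODE_SIZE


if __name__ == "__main__":
    rng = random.Random(0)
    stored = {f"item_{i}": random_sdc(rng) for i in range(10)}

    # With localist codes, two distinct items never share units, so similarity
    # between them is always 0; with SDCs, graded overlap carries graded
    # similarity. The explicit max() scan below is only for illustration.
    probe = stored["item_3"]
    best = max(stored, key=lambda name: similarity(stored[name], probe))
    print(best, similarity(stored[best], probe))  # -> item_3 1.0
```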